Is it possible to use Google partner models like Mistral or Anthropic with OpenAI compatibility on Vertex AI?

I usually use the following request to interact with Google models on Vertex AI that support OpenAI-compatible API calls:

import requests

# headers holds the Authorization bearer token and Content-Type (omitted here)
payload = {
    "model": "google/gemini-2.5-flash",
    "messages": [
        {
            "role": "user",
            "content": "Explain to me how AI works"
        }
    ],
    "max_completion_tokens": 200
}

response = requests.post(
    "https://aiplatform.googleapis.com/v1/projects/name-project/locations/global/endpoints/openapi/chat/completions",
    headers=headers,
    json=payload
)
print(response.json())

Now, I want to use Anthropic and Mistral models that are offered in Model Garden on Vertex AI with a similar workflow, just by changing the model name.
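Concretely, this is what I would expect to work: keep the endpoint and request shape identical and only swap the model string. Note that the partner model identifiers below ("anthropic/claude-sonnet-4" and "mistralai/mistral-small") are my guesses, not confirmed Model Garden names:

```python
import requests

def build_payload(model_name: str) -> dict:
    """Build the same OpenAI-style chat payload, parameterized by model name."""
    return {
        "model": model_name,
        "messages": [
            {"role": "user", "content": "Explain to me how AI works"}
        ],
        "max_completion_tokens": 200,
    }

# Guessed identifiers -- the exact strings for partner models are what I'm asking about.
payload = build_payload("anthropic/claude-sonnet-4")
print(payload["model"])

# The request itself would then be unchanged (headers carries the auth token):
# response = requests.post(
#     "https://aiplatform.googleapis.com/v1/projects/name-project/locations/global/endpoints/openapi/chat/completions",
#     headers=headers,
#     json=payload,
# )
```

If the OpenAI-compatible endpoint routes partner models the same way it routes Gemini, nothing else in the request should need to change.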

Is this possible? Has anyone tried it, or does anyone have guidance on how to do it?