Conversational agent error

Hi guys,

I’m following this tutorial to build an AI agent on GCP. I’ve followed the tutorial before and it all worked fine, but when I’ve tried to recreate it over the last few days, I hit this error at the step “We’re all set. Let’s toggle the simulator again and ask the same questions (i.e. What’s the best way to reach Wakanda?)”:

com.google.apps.framework.request.BadRequestException: Unsupported LLM config with identifier 'default' for language 'en'.

I can chat with the agent fine up to this point, but as soon as I ask it a question that triggers the data store tool, I get that error. I’ve tried to troubleshoot but haven’t found this error mentioned anywhere online. Gemini is also throwing up its hands at it. I’m kind of a newbie, so I wanted to ask if anyone here has come across it, or has any ideas? I’ve followed the tutorial to a T, so if you want to try to recreate it, just follow those steps. Appreciate any help you can give me!

Hi @fc123,

I tried to replicate your case and got the same error. See screenshot below:

However, after I changed the Tool setting from Default to Customize, the agent now recommends places using the information provided in the text file, and the error is gone. See screenshot below:

The guide was last updated in March 2025, so it’s possible there have been recent changes to how default settings are handled for Vertex AI agents and their tools.


Thank you very much! This solved it. It seems the UI has changed since last week; some settings (e.g. grounding) that were previously visible on the Tool page are now only visible and editable after selecting Customize under Tool settings. Thanks again!

Thank you so much! I have been trying to figure out this error all week!