I’ve tried setting my current project as the quota project based on the link above, but it doesn’t seem to rectify the issue. In addition, I tried calling other APIs like BigQuery and didn’t run into this problem.
Also, the Cloud Function was run in the GCP console, so technically the credentials are pre-authorised… Has anyone run into a similar problem before?
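For reference, setting the quota project for Application Default Credentials is normally a one-liner from the terminal; a minimal sketch, where my-project stands in for your project ID:

```sh
# Set the ADC quota project (the project billed/quota-charged for API calls)
gcloud auth application-default set-quota-project my-project
```

If the local credentials were created before the quota project was set, regenerating them with gcloud auth application-default login may also be needed.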
If there’s some incompatibility, you could try pinning the dependency to 1.32.0, too - but I’d recommend trying to fix forward to 1.36.0 or newer first.
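A pin like that is just a requirements.txt entry. A sketch, assuming the dependency in question is google-cloud-aiplatform (the Vertex AI SDK, which comes up later in this thread):

```
# requirements.txt
# Preferred: fix forward
google-cloud-aiplatform>=1.36.0
# Or, if that breaks something, pin back instead:
# google-cloud-aiplatform==1.32.0
```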
I managed to solve it. Anything which requires some type of OAuth user flow won’t work, because Cloud Functions runs on the back end and the OAuth flow would at some point try to pop up a browser to ask for credentials and complete authentication, which can’t happen in that environment.
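By contrast, the service-account path works fine in that environment: inside Cloud Functions, Application Default Credentials (ADC) resolve to the function’s runtime service account, so client libraries never need a browser. A minimal sketch, assuming the BigQuery client library mentioned earlier and an HTTP handler named handler:

```python
import google.auth
from google.cloud import bigquery

def handler(request):
    # On Cloud Functions / Cloud Run, ADC resolves to the runtime service
    # account; there is no interactive login and no browser pop-up.
    credentials, project_id = google.auth.default()

    # Client libraries pick up ADC automatically; passing credentials
    # explicitly here just makes the mechanism visible.
    client = bigquery.Client(project=project_id, credentials=credentials)
    rows = client.query("SELECT 1 AS ok").result()  # simple smoke test
    return str([dict(row) for row in rows])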
After some trial and error I was able to resolve this issue. For context, my function is gen 2, triggered via HTTP request, and is written in Python.
Where I was getting tripped up here was the “test function” feature in the console. My function never executed properly pre-deployment, so I recommend following the troubleshooting steps surfaced in the error message (https://cloud.google.com/docs/authentication/troubleshoot-adc#user-creds-client-based), deploying your function, and then testing either from the CLI or from the ‘Testing’ tab in the console.
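In other words, test against the deployed function rather than the pre-deployment panel. A sketch with placeholder names (my-func, the region, and the entry point are assumptions):

```sh
# Deploy the gen 2 HTTP function
gcloud functions deploy my-func --gen2 --region=us-central1 \
  --runtime=python311 --trigger-http --entry-point=handler --source=.

# Invoke the deployed function from the CLI
gcloud functions call my-func --gen2 --region=us-central1
```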
What I think fixed the issue for me was running this from the terminal
If you’re using Functions v2, you can configure this on the underlying Cloud Run service. Once that’s set up, any of the client libraries should ‘just work’ using the workload identity of the service, with no additional configuration required.
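A minimal sketch of what that configuration usually amounts to, with placeholder names (my-service, my-project, and the service-account email are assumptions): run the service as a dedicated service account and grant that account the Vertex AI User role.

```sh
# Point the Cloud Run service at a dedicated runtime service account
gcloud run services update my-service --region=us-central1 \
  --service-account=vertex-caller@my-project.iam.gserviceaccount.com

# Grant that account permission to call Vertex AI
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:vertex-caller@my-project.iam.gserviceaccount.com" \
  --role="roles/aiplatform.user"
```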
I’ve followed the page to get a Node.js service working locally, and it calls Vertex AI fine. But when I deploy to Cloud Run I get a 403. How do I “configure this on the underlying Cloud Run service”? Thanks.