I am trying to create a Dataproc Serverless runtime template to get a Spark environment notebook in a Vertex AI Workbench instance.
When I click on “New Runtime Template” in the “Dataproc Serverless Notebooks” category, I immediately get the error "failed to list the clusters" and I am not able to save the creation dialog.
What is the problem? The error doesn't make much sense to me, since this is a serverless feature, isn't it?
https://cloud.google.com/vertex-ai/docs/workbench/instances/create-dataproc-enabled#serverless-spark
The Dataproc API was already enabled. The Dataproc Metastore API was not, but enabling it didn't help; the error still shows up.
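For reference, this is how I enabled the two APIs from the CLI (the project ID below is a placeholder, and I'm assuming the Metastore API is the "metadata" one the console was asking about):

```shell
# Enable the Dataproc API (placeholder project ID -- replace with yours)
gcloud services enable dataproc.googleapis.com \
    --project=my-project-id

# Enable the Dataproc Metastore API as well
gcloud services enable metastore.googleapis.com \
    --project=my-project-id
```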
This plugin should be pre-installed on Vertex AI. A Dataproc cluster should not be necessary, because I want to use the “Dataproc Serverless Notebooks” → “New Runtime Template” option. So no cluster should be needed, right?
I think I solved the problem. Granting a couple of permission roles to the Compute Engine service account fixed it. Also, the VPC subnet of the default network has to be configured with the “Private Google Access” option set to ON. A clear guideline that these things are necessary when you start your journey from the Vertex AI side would be helpful :-)
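For anyone hitting the same thing, here is a rough sketch of the gcloud commands involved. The project ID, project number, region, and the exact role are assumptions based on my setup (I granted Dataproc permissions to the Compute Engine default service account; your project may need a different role set):

```shell
# Grant a Dataproc role to the Compute Engine default service account
# (PROJECT_ID and PROJECT_NUMBER are placeholders -- replace with yours)
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --role="roles/dataproc.editor"

# Turn on Private Google Access for the default subnet in your region
# (region is a placeholder)
gcloud compute networks subnets update default \
    --region=us-central1 \
    --enable-private-ip-google-access
```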
My team has experienced the same issue ("failed to list the clusters") when running a Dataproc cluster on the 2.2.50-debian12 image.
The Dataproc cluster was created from a gcloud command without any issues, but when we try to use JupyterLab on the cluster, after a few minutes the error "failed to list the clusters" appears and JupyterLab freezes. After this event pops up, we cannot even SSH into the master node. After some time, once the cluster has been idle (JupyterLab not opened), you can log in again, but the issue repeats.