I am teaching a course on Big Data and Cloud Computing. Throughout January 2024, I prepared a series of tutorials and exercises using Google Cloud Dataproc. In most cases, I simply opened up GCP, navigated to Dataproc, clicked the Create Cluster button, and accepted all of the defaults (1 master, 2 workers, each an n2-standard-4 with a 500 GB disk). The only change I made was to enable Jupyter Notebook support. I would run a cluster for about an hour and then delete it. I must have repeated these steps dozens of times without issue.
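For anyone wanting to reproduce this, the console steps above correspond roughly to the following gcloud commands. The cluster name is made up, and the flag values reflect my recollection of the console defaults, so treat this as a sketch rather than an exact replay:

```shell
# Roughly what "Create Cluster" with defaults plus Jupyter does.
# Cluster name and region are assumptions -- adjust for your project.
gcloud dataproc clusters create demo-cluster \
    --region=us-central1 \
    --master-machine-type=n2-standard-4 \
    --master-boot-disk-size=500GB \
    --num-workers=2 \
    --worker-machine-type=n2-standard-4 \
    --worker-boot-disk-size=500GB \
    --optional-components=JUPYTER \
    --enable-component-gateway

# Tear the cluster down after class, as in the console workflow:
gcloud dataproc clusters delete demo-cluster --region=us-central1
```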
Last week I attempted to demonstrate this for my class and immediately ran into multiple quota issues: not enough available CPUs in my region (us-central1), not enough disk, etc. Making matters worse, even though my cluster was never created, the Quotas screen still showed that disk space as in use.
I saw that my quotas had all been reduced to just 500 GB of disk and 8 CPUs, so the default Dataproc configuration will not run. I was able to successfully appeal and get an increase in disk quota, but I could only get the CPU quota raised to 12. I was told that I had to contact my sales manager (which we don't really have).
Can anyone shed any light on why my quotas were lowered without my being informed?
More importantly, what can we do when the Quotas page shows resources in use but there are clearly no resources being allocated?
One possibility is that your project was moved to a different org within your GCP account. Quotas are reset to their default values when a project is moved to a new org.
It could also be that your project was suspended due to non-payment or another issue. In that case, quota usage is reset and the resources are released. You would need to resolve the suspension and then request a quota increase, either via the Cloud Console or by contacting Google Cloud Support.
As for the stuck usage, it may be that resources were allocated but not released properly. You can wait for them to be released automatically, or release them manually. If that does not fix the issue, I recommend contacting Google Cloud Support.
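One way to look for resources to release manually is to list the project's persistent disks from the CLI. Orphaned Dataproc boot disks, if any exist, should show up here even when the cluster itself is gone (a sketch, assuming you have gcloud installed and authenticated against the affected project):

```shell
# List all persistent disks in the project. A disk with an empty USERS
# column is not attached to any instance and is a candidate for cleanup.
gcloud compute disks list

# To release an unattached disk manually (substitute real values):
# gcloud compute disks delete DISK_NAME --zone=ZONE
```

If this list comes back empty but the Quotas page still shows usage, the discrepancy is on the quota-accounting side rather than an actual leaked resource.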
Note: As a teacher, you may be eligible for a free trial or educational grant. You can check your eligibility and apply on the link that I provided. This will give you access to a higher quota for a limited time, which can be useful for demonstration and teaching purposes.
Thanks for your notes. My Organization did not change and yes we participate in Google Cloud For Education. I receive a coupon/credit that I apply to my Billing account and that takes care of any charges. There is no issue with payment as far as I can tell.
The frustrating thing about the Resources/Quota consumption is that I cannot locate any resources in use. For example, after a failed attempt to launch a Dataproc cluster, the Quotas page shows that I am supposedly using 500 GB of disk, but when I check all of the Persistent Disk settings, nothing is listed. These phantom allocations remain 'sticky' for up to 12 hours in some cases. From a practical standpoint, this means I cannot launch any new clusters due to lack of quota.
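To document the discrepancy (e.g. for a support ticket), it may help to capture both what the quota system reports and what is actually allocated. A sketch of how I would compare the two, assuming gcloud access to the project:

```shell
# What the quota accounting thinks is in use for the region -- this is
# the same data the console Quotas page displays:
gcloud compute regions describe us-central1 --format="yaml(quotas)"

# What is actually allocated in that region (should be empty after a
# failed cluster launch):
gcloud compute instances list --filter="zone~us-central1"
gcloud compute disks list --filter="zone~us-central1"
```

If the first command shows non-zero usage for DISKS_TOTAL_GB or CPUS while the other two return nothing, that mismatch is exactly the 'sticky' allocation problem.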
Evidently I am not the only one experiencing this. There are similar reports of Qwiklabs failing to launch clusters, and other faculty report similar hangups.