I’m currently using Cloud Run for my application, and I need to connect it to a persistent storage solution so that user data can be retained for future sessions. While considering the options, I found that Cloud Storage FUSE for Cloud Run does not meet my requirements. I also tried GCP Filestore, mounting it onto a Cloud Run service as a network file system. Although that works, Filestore is quite expensive and offers no dynamic sizing: even if I only use 100 MB, I have to pay for the full 1 TB minimum capacity.
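For reference, the Filestore mount was set up along the lines of the following sketch (a minimal example; the service name, share IP, and path are placeholders, and the second-generation execution environment is required for NFS volumes):

    # Attach a Filestore share to a Cloud Run service as an NFS volume.
    gcloud run services update my-service \
      --execution-environment gen2 \
      --add-volume name=nfs-vol,type=nfs,location=10.0.0.2:/share1 \
      --add-volume-mount volume=nfs-vol,mount-path=/mnt/nfs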
Is there an alternative in Google Cloud Platform (GCP) similar to AWS Elastic File System (EFS)? I’m looking for a solution that charges based on the storage actually used and can be configured for use with Cloud Run applications. Alternatively, is there any way to connect AWS EFS to a Cloud Run service?
I’m a product manager on Cloud Run working on improving our volumes support. May I ask: in what way does Cloud Storage FUSE not meet your requirements?
Hello Roopak, we’re working on improving our logging here. Were you able to resolve the issue? We will be launching a create/edit UI in Cloud Console soon, so that may be an easier way to get things configured correctly.
Another issue with a GCS bucket mounted via FUSE in Cloud Run is that it does not support changing file permissions. For example, I tried running chmod +x on a file, but the permission did not change. Although it supports read and write, it does not support the execute permission on files.
Can you please check this as well? This issue only occurs with a GCS volume on Cloud Run; when we attach the FUSE mount directly to a Compute Engine instance or to GKE via the CSI driver, I am able to change file permissions.
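For what it’s worth, one possible workaround (a sketch only, not verified on Cloud Run) is to set the desired mode at mount time through gcsfuse’s file-mode/dir-mode mount options, since chmod on the mounted files is not propagated to the bucket; the volume and bucket names below are placeholders:

    # Mount the bucket with execute bits already set on files.
    gcloud run services update my-service \
      --add-volume name=gcs-vol,type=cloud-storage,bucket=my-bucket,mount-options="file-mode=755;dir-mode=755" \
      --add-volume-mount volume=gcs-vol,mount-path=/mnt/gcs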
FWIW, Cloud Storage doesn’t work for my use case either. The FUSE driver relies on locally caching files to/from the GCS bucket, and my project works with files that are typically 100 GB or more, which means my Cloud Run job would need enough local storage to stage the transfers to/from GCS. My workflow is that users upload files directly to a GCS bucket using signed URLs, then a job processes those files and moves the resulting output to a different GCS bucket to be consumed by another application.
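For context, the upload URLs are generated along these lines (a sketch; the key file, bucket, and object names are placeholders):

    # Create a V4 signed URL that lets a user PUT a file directly into the upload bucket.
    gsutil signurl -m PUT -d 1h service-account-key.json gs://upload-bucket/incoming/input-file.bin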
gcsfuse doesn’t work for my use case either. I’m using it with Cloud Run to download large videos, which can be up to 10 GB. gcsfuse ends up consuming a lot of RAM because Cloud Run’s local disk is in-memory, so anything gcsfuse stages locally counts against the instance’s memory.
Alternatively, from gcsfuse version 2.9.1 onwards, writes can be configured to use the streaming writes feature (which doesn’t involve staging the file locally) via the --enable-streaming-writes flag.
But unfortunately, there are no volume mount options in the Cloud Run console interface, and adding this option to the YAML results in an error: Unsupported or unrecognized flag for Cloud Storage volume: enable-streaming-writes (field: spec.execution_template.spec.task_spec.volumes[0]).
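For reference, the equivalent gcloud configuration would look roughly like the sketch below (job, volume, and bucket names are placeholders); per the error above, Cloud Run currently rejects this particular mount option:

    # Attempt to pass the gcsfuse streaming-writes option to a Cloud Run job volume.
    gcloud run jobs update my-job \
      --add-volume name=gcs-vol,type=cloud-storage,bucket=my-bucket,mount-options="enable-streaming-writes" \
      --add-volume-mount volume=gcs-vol,mount-path=/mnt/gcs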