I am trying to write a Google Cloud Function that allows the caller to upload a file. The uploaded file should be stored in a specific bucket in GCS.
Below is my code:
```python
import functions_framework
import google.cloud.storage
from google.auth import compute_engine
import os

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/path/to/service-account-key.json'

@functions_framework.http
def gcftoloadfile1(request):
    # Use the default credentials for authentication
    #credentials = compute_engine.Credentials()

    # Create a client object with the authenticated credentials
    #storage_client = storage.Client()
    storage_client = google.cloud.storage.Client.from_service_account_json('XXXXX-XXXXX-956803-94e3153efc1a.json')

    # Get the bucket info
    #bucket = storage_client.get_bucket('abcbucket')
    # Retrieve a blob named "input/input.csv" from the bucket
    #blob = bucket.blob('input/input.csv')
    # Upload the string "Hello, World from GCF!" to the blob
    #blob.upload_from_string('Hello, World from GCF!')
    # Confirm that the blob was uploaded successfully
    #print(f'File uploaded to {bucket.name}/{blob.name}')
    #print(f'Came Here')
    return 'Service account downloaded successfully.'
```
When I try this, it throws the following error: No such file or directory: 'XXXXX-XXXXX-956803-94e3153efc1a.json'
I could not find an example where I can use Application Default Credentials to avoid using the key file.
Any help or pointers would be greatly appreciated.
The example above shows what I tried and the error I got.