I have created service accounts for external clients to upload files to a bucket.
Each service account is granted access directly, and only to its own folder within the bucket.
The problem is that one of the service accounts does not work, even though it has the same settings as the others.
The client tries to send the files via the Google SDK and fails.
I deleted and recreated the service account, and added and removed the custom role; nothing worked.
In the last test the client managed to upload only one file, but the other file failed with the usual error:
ERROR: (gcloud.storage.cp) [service-account@project.iam.gservice.com] does not have permission to access b instance [bucket] (or it may not exist): service-account@project.iam.gservice.com does not have storage.buckets.get access to the Google Cloud Storage bucket. Permission "storage.buckets.get" denied on resource (or it may not exist). This command is authenticated as service-account@project.iam.gservice.com which is the active account specified by the [core/account] property.
I have attached some images to help.
If anyone can help please do so, I would appreciate it.
Based on the error you receive, your service account still doesn't have the required storage.buckets.get permission. You need to check both the IAM policies applied at the bucket level and those inherited from the project level.
Another way: to allow a service account to upload files only to a specific "folder" (path prefix) within a bucket, grant it the roles/storage.objectCreator role combined with an IAM Condition that restricts object creation to that prefix.
Go to the Google Cloud Console → Cloud Storage → Buckets → select your bucket.
Click the "PERMISSIONS" tab.
Click "+ GRANT ACCESS".
Enter the email address(es) of the user(s) or service account(s) you want to grant upload access to.
Assign Role:
Click "Select a role".
Filter by typing "Object Creator".
Select the Storage Object Creator (roles/storage.objectCreator) role.
Add IAM Condition (crucial step):
Click "+ ADD IAM CONDITION".
Give it a descriptive title.
Select "Condition Editor".
Condition: use the resource.name attribute, which represents the full object path.
This allows the service account to upload only objects whose names start with that folder prefix. Note that this role does not grant storage.objects.list; if listing is needed, you can also add roles/storage.objectViewer with a condition.
Follow the same steps as above, but select the "Storage Object Viewer" role instead.
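For reference, the console steps above can also be scripted with the gcloud CLI. A minimal sketch, assuming a hypothetical bucket my-bucket, folder client-folder/, and service account email (substitute your own values):

```shell
# Grant Object Creator on the bucket, restricted to one folder prefix
# via an IAM Condition on resource.name (full object path format is
# projects/_/buckets/BUCKET/objects/OBJECT).
gcloud storage buckets add-iam-policy-binding gs://my-bucket \
  --member="serviceAccount:uploader@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.objectCreator" \
  --condition='expression=resource.name.startsWith("projects/_/buckets/my-bucket/objects/client-folder/"),title=limit-to-client-folder'
```

Repeat the same command with --role="roles/storage.objectViewer" if the account also needs to list or read objects under that prefix.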
Was this helpful? If so, please accept this answer as "Solution". If you need additional assistance, reply here within 2 business days and I'll be happy to help.
Hello @kensan, how are you?
Thank you very much for your help.
I checked both the bucket-level and project-level policies and didn't find any account with the roles/storage.objectCreator role, but I need to share some information gathered from tests I ran earlier this morning...
When the service accounts upload small files (~70 MB), the upload works; however, when they try to upload larger files (~1 GB), that's when the error message mentioned in this thread appears.
Based on that, I decided to test uploading the 1 GB file using my own account, which has several permissions (unlike the service accounts, which are supposed to have minimal privileges), and it worked!
Now the question is: what can I do to allow these service accounts to do the same as my account, but with the least privilege possible? And which role should be assigned?
After talking with support, the problem was solved. I'll copy their answer here:
"We have identified something when working with large files: when you upload a large file using "gcloud storage cp", the tool switches to "parallel composite uploads" to optimize the action; you can see more about this feature here[1].
Parallel composite uploads optimize the request by dividing the file into 32 chunks, which are uploaded as temporary objects into the bucket and then composed into a single file in the destination folder. This creates a discrepancy between the action and the permissions you have: in order to upload the temporary objects, the tool needs to list the bucket, but the service account only has permission to write inside the folder.
I have some options that I know can help you solve this problem.
If you think it is convenient, you can change the service account's access to the bucket level; you can see how to apply your custom role to the service account here[2]. That will certainly work, and your account will be able to upload files to the folder.
If granting permissions at the bucket level is not an option, you can opt out of parallel composite uploads. That way your file is uploaded as a single object, and the command won't need to list the bucket to upload the temporary objects.
To do that, run the following command: gcloud config set storage/parallel_composite_upload_enabled False. You can see what this command does in the documentation; once you run it, you will be able to upload larger files."
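Putting the workaround together, a sketch of the full upload flow with parallel composite uploads disabled (the bucket, folder, and file names here are placeholders, not values from the thread):

```shell
# Disable parallel composite uploads so large files are sent as a
# single object, avoiding the temporary-object compose step that
# requires bucket-level list permissions.
gcloud config set storage/parallel_composite_upload_enabled False

# The prefix-restricted objectCreator grant is now sufficient
# for the upload.
gcloud storage cp ./large-file.bin gs://my-bucket/client-folder/
```

The trade-off is upload speed: composite uploads parallelize large transfers, so single-object uploads of ~1 GB files will be slower but need only write access under the folder prefix.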