Scheduling (unlimited) Results to S3


I created a new topic on whether or not IAM Roles are supported when sending the results to S3:

[Sending data to s3 required hard coded access keys. Why not IAM ROLES?](https://discourse.looker.com/t/sending-data-to-s3-required-hard-coded-access-keys-why-not-iam-roles/14114)

According to this doc, we need AWS Access Keys and Secret Keys. All of our infrastructure in AWS uses IAM Roles; we don't have any access keys or secret keys anywhere (mostly for security purposes). Is there a way to use IAM Roles to send data to S3?

I'm posting the link here in case anyone else runs across this question.
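Since Looker requires static keys here, one common mitigation (not from Looker's docs, just a sketch) is a dedicated IAM user whose policy allows nothing but uploads to one bucket prefix, so a leaked key has minimal blast radius. The bucket name and prefix below are placeholders:

```python
import json

def looker_s3_policy(bucket: str, prefix: str = "looker/*") -> str:
    """Minimal IAM policy for a dedicated scheduler user: PutObject only,
    on one prefix. Bucket/prefix are hypothetical examples."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowLookerUploadsOnly",
                "Effect": "Allow",
                "Action": ["s3:PutObject"],
                "Resource": f"arn:aws:s3:::{bucket}/{prefix}",
            }
        ],
    }
    return json.dumps(policy, indent=2)

print(looker_s3_policy("my-results-bucket"))
```

You would attach this policy to the IAM user whose access key and secret key you paste into Looker's S3 destination settings.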

Our bucket also denies uploads unless the server-side encryption header is set on the request.
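For context, the kind of bucket policy that produces this behavior denies any `PutObject` request that doesn't carry the `x-amz-server-side-encryption` header. A sketch (bucket name is a placeholder, not from this thread):

```python
import json

def require_sse_policy(bucket: str) -> str:
    """Bucket policy denying unencrypted uploads: the Null condition matches
    requests where the SSE header is absent."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "DenyUnencryptedUploads",
                "Effect": "Deny",
                "Principal": "*",
                "Action": "s3:PutObject",
                "Resource": f"arn:aws:s3:::{bucket}/*",
                "Condition": {
                    "Null": {"s3:x-amz-server-side-encryption": "true"}
                },
            }
        ],
    }
    return json.dumps(policy)

print(require_sse_policy("my-results-bucket"))
```

Any client uploading to such a bucket, Looker included, must set the SSE header or the upload is rejected with an access-denied error.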


What ACL does Looker use when we export to S3? We are having issues with the end destination (another AWS account) being unable to access the file.
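A likely cause (an assumption, not confirmed by Looker here): when you upload into a bucket owned by a different AWS account, the object stays owned by the uploader's account unless the upload grants the bucket owner access, typically via the `bucket-owner-full-control` canned ACL. A sketch of the request parameters a client such as boto3 would send (all names are placeholders):

```python
def cross_account_put_args(bucket: str, key: str, body: bytes) -> dict:
    """Build put_object arguments for a cross-account upload, granting the
    destination bucket's owner full control of the resulting object."""
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        "ACL": "bucket-owner-full-control",
    }

args = cross_account_put_args("partner-bucket", "exports/report.csv", b"a,b\n1,2\n")
# With boto3 this would be sent as: boto3.client("s3").put_object(**args)
print(args["ACL"])
```

If the sender's tool doesn't set this ACL, the destination account can also enforce it with a bucket policy that rejects uploads lacking it.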


We've set up the IAM user with an access key and secret key.

For some reason, when we select the limit to be “Results in Table”, it works, but when we select the limit to be “All Results”, it fails.

Does it need a separate set of permissions for All Results, which might be streamed? The result size for All Results is only 67 KB, and yet the S3 put is failing.

Error: Upload to S3 bucket {bucket_name} aborted due to error: The ciphertext refers to a customer master key that does not exist, does not exist in this region, or you are not allowed to access.
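That error comes from KMS, which suggests the bucket's default encryption (or the request) references a customer master key the IAM user isn't allowed to use. A common fix is to grant the user `kms:GenerateDataKey` on that key (and `kms:Decrypt`, which multipart uploads also need), or to upload with an explicit key the user can access. A hedged sketch of the request parameters (key ARN and names are placeholders):

```python
def sse_kms_put_args(bucket: str, key: str, body: bytes, kms_key_arn: str) -> dict:
    """Build put_object arguments for SSE-KMS with an explicit key, rather
    than relying on the bucket's default key."""
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        "ServerSideEncryption": "aws:kms",
        "SSEKMSKeyId": kms_key_arn,
    }

args = sse_kms_put_args(
    "my-results-bucket",
    "exports/all_results.csv",
    b"...",
    "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID",
)
print(args["ServerSideEncryption"])
```

Also worth checking that the key actually lives in the same region as the bucket, since the error message calls that case out explicitly.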

Do we need to enter the access and secret keys every time we send or schedule data to an S3 bucket?