
Hello,

I am receiving the errors below when trying to read a gzip file from Google Cloud Storage through Dataflow. I am receiving the same error for multiple Dataflow jobs at the same step, and I don't understand what I need to change.

google.auth.exceptions.TransportError: ('Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/dataflow-runner@*****.iam.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdevstorage.full_control%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fuserinfo.email%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdatastore%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.admin%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.data from the Google Compute Engine metadata service. Status: 429 Response:\nb\'"Too many requests."\'', <google.auth.transport.requests._Response object at 0x7f8a8c5287c0>)

google.auth.exceptions.RefreshError: ('Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/dataflow-runner@****.iam.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdevstorage.full_control%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fuserinfo.email%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdatastore%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.admin%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.data from the Google Compute Engine metadata service. Status: 429 Response:\nb\'"Too many requests."\'', <google.auth.transport.requests._Response object at 0x7f8a8c5287c0>)


Hello @rish,

Thank you for contacting Google Cloud Community.

Please note that user traffic receiving 429s from GCS can indicate the following:

  1. Exceeding the object/bucket mutation rate

  2. Resource contention

  3. DoS limits (QPS and DoS - Egress)

  4. Non-conforming traffic as per these guidelines

  5. Ingress bandwidth limit (see 429s due to exceeding the ingress bandwidth limit)

  6. Project QPS quota (see Project QPS quota exceeded - QPS Throttlers)

  7. Egress bandwidth quotas (Bouncer)

Solution: You are hitting a limit to the number of requests Cloud Storage allows for a given resource. See the Cloud Storage quotas for a discussion of limits in Cloud Storage.

  • If your workload consists of thousands of requests per second to a bucket, see the Request rate and access distribution guidelines for a discussion of best practices, including ramping up your workload gradually and avoiding sequential filenames.

  • If your workload is potentially using 50 Gbps or more of network egress to specific locations, check your bandwidth usage to ensure you’re not encountering a bandwidth quota.
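As a practical illustration of the "ramp up gradually" advice, the standard remedy for 429s is to retry with exponential backoff plus jitter. The sketch below is a minimal, hypothetical example: `TooManyRequests` here is a stand-in class (in real Google client code the analogous error is `google.api_core.exceptions.TooManyRequests`, and the `google-cloud-storage` client already retries many 429s by default), so treat this as an assumption-laden sketch of the pattern rather than the client library's actual mechanism:

```python
import random
import time


class TooManyRequests(Exception):
    """Stand-in for an HTTP 429 error; not the real client exception."""


def call_with_backoff(fn, max_attempts=5, base_delay=0.5, max_delay=32.0):
    """Call fn(), retrying on 429-style errors with exponential backoff.

    Each retry waits base_delay * 2**attempt seconds (capped at
    max_delay), plus random jitter so many workers do not retry in
    lockstep against the same resource.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except TooManyRequests:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the 429 to the caller
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(delay + random.uniform(0, base_delay))
```

In a Dataflow pipeline you would typically not hand-roll this: the point of the sketch is only that backing off (rather than immediately re-requesting) is what lets a bursty workload stay under the per-resource rate limit.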

I hope the above information is helpful 🙂

Thanks & Regards,
Manish Bavireddy.
