Cloud Run Admin API (gRPC) only working with regional endpoint

Yesterday I suddenly started getting errors like this:
UNKNOWN:Error received from peer ipv4:172.217.23.106:443 {grpc_message:"Requested entity was not found.", grpc_status:5, created_time:"2026-01-09T14:27:48.415971867+00:00"}
from airflow.providers.google.cloud.operators.cloud_run.CloudRunExecuteJobOperator. For context, my Airflow deployment runs outside GCP.

I did a fairly deep investigation, of course starting by triple-checking that the project ID, region, and job name are correct, and that there were no changes in our IaC. To cut a long story short, I was able to narrow it down to the simple Python reproduction below, which shows that the global API endpoint always returns a "not found" response, even for just listing jobs! When an explicit regional endpoint is used, the call succeeds.

My suspicion is that something broke with global routing in GCP, but I'm not sure where I should report such a problem, and I first want to check whether there is anything I can do on my side. A long-term solution could be to extend CloudRunExecuteJobOperator so it can use a configured regional endpoint, but since that is outside my control, I suppose that route will take some time.

Has anyone faced a similar problem?

import os
import sys
import grpc
import google.auth
import google.auth.transport.grpc
import google.auth.transport.requests

# We use the public message types (stable API)
from google.cloud import run_v2

def main():
    # 1. Read Environment
    project_id = os.environ.get("PROJECT_ID")
    region = os.environ.get("REGION")
    enable_regional = os.environ.get("REGIONAL_ENDPOINT_ENABLED", "0") == "1"

    if not project_id or not region:
        print("❌ Error: Set PROJECT_ID and REGION env vars.")
        sys.exit(1)

    # 2. Determine Endpoint
    if enable_regional:
        target = f"{region}-run.googleapis.com:443"
        mode = "REGIONAL"
    else:
        target = "run.googleapis.com:443"
        mode = "GLOBAL"

    print(f"--- Raw gRPC (No Stubs) Configuration: {mode} ---")
    print(f"Target: {target}")

    # 3. Create Credentials
    credentials, _ = google.auth.default()
    request = google.auth.transport.requests.Request()
    credentials.refresh(request)

    ssl_creds = grpc.ssl_channel_credentials()
    call_creds = grpc.access_token_call_credentials(credentials.token)
    channel_creds = grpc.composite_channel_credentials(ssl_creds, call_creds)

    # 4. Create Channel
    channel = grpc.secure_channel(target, channel_creds)

    # 5. Prepare Request (Serialize to bytes manually)
    parent = f"projects/{project_id}/locations/{region}"
    print(f"RPC: /google.cloud.run.v2.Jobs/ListJobs (parent={parent})")

    # Create the high-level object
    request_obj = run_v2.ListJobsRequest(parent=parent)
    # Serialize to raw bytes (proto-plus method)
    request_bytes = run_v2.ListJobsRequest.serialize(request_obj)

    # 6. Execute Raw RPC
    # We use unary_unary because ListJobs is a simple request/response
    # Format: /<package>.<Service>/<Method>
    method_name = "/google.cloud.run.v2.Jobs/ListJobs"

    try:
        unary_call = channel.unary_unary(method_name)
        response_bytes = unary_call(request_bytes)

        # 7. Deserialize Response
        response_obj = run_v2.ListJobsResponse.deserialize(response_bytes)

        count = 0
        for job in response_obj.jobs:
            print(f"   - {job.name}")
            count += 1
        print(f"✅ Success! Found {count} jobs via pure Raw gRPC.")

    except grpc.RpcError as e:
        print("\n❌ gRPC Call Failed!")
        print(f"Status Code: {e.code()}")
        print(f"Details: {e.details()}")

if __name__ == "__main__":
    main()
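To compare the two modes I just run the script twice, toggling REGIONAL_ENDPOINT_ENABLED. A small driver like this (the file name "repro.py" is a placeholder for the script above, and the project/region values are examples):

```python
# Run the reproduction script above in both modes to compare results.
import os
import subprocess


def repro_env(project_id: str, region: str, regional: bool) -> dict:
    # Build the environment variables the repro script reads.
    return dict(
        os.environ,
        PROJECT_ID=project_id,
        REGION=region,
        REGIONAL_ENDPOINT_ENABLED="1" if regional else "0",
    )


if __name__ == "__main__":
    for regional in (False, True):
        subprocess.run(
            ["python", "repro.py"],  # placeholder name for the script above
            env=repro_env("my-project", "europe-west1", regional),
        )
```

In my case the first run (global endpoint) fails with NOT_FOUND and the second (regional endpoint) lists the jobs.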
