I use Pulumi as IaC, and yesterday we migrated the GCP provider from v6 to v7; now the stream is unable to start.
This stream has been running for almost 6 months, syncing data from Cloud SQL Postgres to BigQuery, and has never had a problem.
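For context, the wiring is roughly the following (a simplified Pulumi Python sketch, not my exact program; the resource names, region, host, and credentials are placeholders):

```python
import pulumi
import pulumi_gcp as gcp

cfg = pulumi.Config()

# Source connection profile for the Cloud SQL Postgres instance.
source_profile = gcp.datastream.ConnectionProfile(
    "source-profile",
    connection_profile_id="postgres-source",
    display_name="postgres-source",
    location="us-central1",
    postgresql_profile={
        "hostname": "10.0.0.3",  # placeholder private IP
        "port": 5432,
        "username": "datastream",
        "password": cfg.require_secret("dbPassword"),
        "database": "app",
    },
)

# Destination connection profile for BigQuery.
destination_profile = gcp.datastream.ConnectionProfile(
    "destination-profile",
    connection_profile_id="bigquery-destination",
    display_name="bigquery-destination",
    location="us-central1",
    bigquery_profile={},
)

# The stream wires the two profiles together; the failing reference is
# presumably one of the two *_connection_profile values below.
stream = gcp.datastream.Stream(
    "stream",
    stream_id="postgres-to-bq",
    display_name="postgres-to-bq",
    location="us-central1",
    desired_state="RUNNING",
    source_config={
        "source_connection_profile": source_profile.id,
        "postgresql_source_config": {
            "publication": "datastream_publication",
            "replication_slot": "datastream_slot",
        },
    },
    destination_config={
        "destination_connection_profile": destination_profile.id,
        "bigquery_destination_config": {
            "source_hierarchy_datasets": {
                "dataset_template": {"location": "us-central1"},
            },
        },
    },
)
```

The `*_connection_profile` fields are fed from the profiles' `.id` outputs, so whatever format the provider resolves those to is what ends up stored on the stream.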
When I try to start the stream, I get the following error:
The stream is connected to non-existent connectionProfile
I tried to debug it through Logs Explorer, but there is no more information than that.
After the Pulumi update, all the steps to create the stream passed, based on the operation response from the gcloud CLI:

validationResult:
  validations:
  - code: POSTGRES_VALIDATE_CONNECTIVITY
    description: Validates that Datastream can connect to the source Postgres database.
    state: PASSED
  - code: POSTGRES_VALIDATE_PUBLICATION
    description: Validates that the publication exists and configured for the required tables.
    state: PASSED
  - code: POSTGRES_VALIDATE_LOGICAL_DECODING
    description: Validates that logical decoding is properly configured on the database.
    state: PASSED
  - code: POSTGRES_VALIDATE_REPLICATION_SLOT
    description: Validates that the replication slot exists and not lost.
    state: PASSED
  - code: POSTGRES_VALIDATE_BACKFILL_PERMISSIONS
    description: Validates that the given Postgres user has the select permissions on the required customer tables
    state: PASSED
  - code: BIGQUERY_VALIDATE_API_ENABLED
    description: Validates that the BigQuery API is enabled in the current project.
    state: PASSED
  - code: BIGQUERY_VALIDATE_DESTINATION_PERMISSIONS
    description: Validates that Datastream has permissions to write data into BigQuery.
    state: PASSED
  - code: BIGQUERY_VALIDATE_DYNAMIC_DATASET_LOCATION
    description: Validates that Datastream can write data into BigQuery datasets in the specified location.
    state: PASSED
verb: create
I checked the IAM policies, and all the BigQuery-related permissions are granted to the Datastream service account. Reference: https://www.googlecloudcommunity.com/gc/Serverless/Unable-to-connect-datastream-to-a-bigquery-destination-in/m-p/646046
I also checked the KMS key used for BigQuery encryption/decryption: the same key that is assigned to the BigQuery dataset is assigned in the Datastream destination configuration.
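Since the error claims the stream points at a non-existent connectionProfile, my working hypothesis is that the v7 provider changed how the profile is referenced (for example a short ID vs. a full resource path, or a location mismatch). I compared the references stored on the stream against the profiles the API actually lists with a quick standalone check (the values below are made up; the real ones come from `gcloud datastream streams describe` and `gcloud datastream connection-profiles list`):

```python
def normalize_profile(ref: str, project: str, location: str) -> str:
    """Expand a short connection-profile ID to its full resource path,
    leaving already-qualified references untouched."""
    if ref.startswith("projects/"):
        return ref
    return f"projects/{project}/locations/{location}/connectionProfiles/{ref}"


def find_mismatches(stream_refs, existing_profiles, project, location):
    """Return the profile references on the stream that do not match any
    existing profile once both sides use full resource paths."""
    existing = {normalize_profile(p, project, location) for p in existing_profiles}
    return [r for r in stream_refs
            if normalize_profile(r, project, location) not in existing]


# Made-up example: the source profile actually lives in europe-west1,
# so the short-ID reference resolved against us-central1 does not match.
refs = [
    "postgres-source",
    "projects/my-proj/locations/us-central1/connectionProfiles/bigquery-destination",
]
existing = [
    "projects/my-proj/locations/europe-west1/connectionProfiles/postgres-source",
    "projects/my-proj/locations/us-central1/connectionProfiles/bigquery-destination",
]
print(find_mismatches(refs, existing, "my-proj", "us-central1"))
# -> ['postgres-source']
```

In my case both sides matched, which is why I am stuck.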
How can I overcome this issue?