https://gist.github.com/donbowman/5ea8f8d8017493cbfa3a9e4f6e736bcc shows all the details. I have also posted this on Stack Overflow: https://stackoverflow.com/questions/73767750/how-do-i-create-a-bigquery-pubsub-direct-in-gcp-i-get-an-error-failed-to-create
I have created a schema definition (initially in Avro, then in protobuf), created a Pub/Sub schema from it, and created a topic that uses that schema.
I have then created a BigQuery table whose columns match the schema.
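For reference, the schema and topic creation are roughly equivalent to this sketch using the google-cloud-pubsub Python client (the .proto filename and the JSON encoding are placeholders here; the real definitions are in the gist):

```python
from google.cloud.pubsub import PublisherClient, SchemaServiceClient
from google.pubsub_v1.types import Encoding, Schema

project_id = "agilicus"
schema_id = "check-me.httpobs"  # placeholder id; the real one is in the gist
topic_id = "check-me.httpobs"

# Create the protobuf schema from the .proto definition.
schema_client = SchemaServiceClient()
schema_path = schema_client.schema_path(project_id, schema_id)
with open("httpobs.proto") as f:  # placeholder filename
    proto_definition = f.read()

schema_client.create_schema(
    request={
        "parent": f"projects/{project_id}",
        "schema": Schema(
            name=schema_path,
            type_=Schema.Type.PROTOCOL_BUFFER,
            definition=proto_definition,
        ),
        "schema_id": schema_id,
    }
)

# Create the topic with the schema attached.
publisher = PublisherClient()
topic_path = publisher.topic_path(project_id, topic_id)
publisher.create_topic(
    request={
        "name": topic_path,
        # JSON encoding is an assumption; the gist has the actual setting.
        "schema_settings": {"schema": schema_path, "encoding": Encoding.JSON},
    }
)
```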
When I create a subscription that delivers to the BigQuery table, I get a 400 (invalid argument) error. I have manually verified all the fields against the protobuf and cannot see anything wrong, and there is no hint in the error message. The request body sent is:
{"ackDeadlineSeconds": 900, "bigqueryConfig": {"dropUnknownFields": true, "table": "agilicus:checkme.httpobs", "useTopicSchema": true, "writeMetadata": true}, "name": "projects/agilicus/subscriptions/check-me.httpobs", "topic": "projects/agilicus/topics/check-me.httpobs"}
but all I get is an error each time.
If I deliberately misspell the table name, the error does not change, so I believe the issue is somehow with the Pub/Sub topic + schema rather than with the table reference.
Does anyone have a suggestion or an example? I am quite stuck.
I have also tried using a Python script to create the subscription: same effect (a sketch of that attempt is below).
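Roughly, that Python attempt looks like the following sketch (google-cloud-pubsub client; the names and settings mirror the request body above):

```python
from google.cloud.pubsub import SubscriberClient
from google.pubsub_v1.types import BigQueryConfig

# Same names and settings as in the request body above.
topic_path = "projects/agilicus/topics/check-me.httpobs"
subscription_path = "projects/agilicus/subscriptions/check-me.httpobs"

bigquery_config = BigQueryConfig(
    table="agilicus:checkme.httpobs",
    use_topic_schema=True,
    write_metadata=True,
    drop_unknown_fields=True,
)

subscriber = SubscriberClient()
with subscriber:
    subscriber.create_subscription(
        request={
            "name": subscription_path,
            "topic": topic_path,
            "bigquery_config": bigquery_config,
            "ack_deadline_seconds": 900,
        }
    )
# This fails with the same 400 "invalid argument" as the REST call.
```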