Hi folks,
I’m trying to test out some really simple exports from BigQuery to Bigtable and I’m hitting a persistent error while trying to do so.
I have a Bigtable table created like this:
gcloud bigtable instances tables create TEST_EXPORT_NEW \
  --instance=my-instance \
  --column-families=cf
Created table [TEST_EXPORT_NEW].
And I’m trying to export using this query:
EXPORT DATA OPTIONS (
  format='CLOUD_BIGTABLE',
  uri="https://bigtable.googleapis.com/projects/canary-443022/instances/my-instance/appProfiles/MY_APP_PROFILE/tables/TEST_EXPORT_NEW",
  bigtable_options="""{
    "columnFamilies": [
      {
        "familyId": "cf",
        "encoding": "BINARY",
        "columns": [
          {
            "qualifierString": "value",
            "fieldName": "value_bytes"
          }
        ]
      }
    ]
  }"""
) AS
SELECT
  CAST('test' AS BYTES) AS rowkey,
  CAST('test' AS BYTES) AS value_bytes
FROM (SELECT 1)
I seem to be constantly hitting this error:
Error while mutating the row 'test' (projects/canary-443022/instances/my-instance/tables/TEST_EXPORT_NEW) : Requested column family not found.
I've been able to confirm that I can manually write and retrieve rows in the 'cf' column family via cbt just fine. I also tried renaming the column family from 'cf' to 'test_cf', but that doesn't help either.
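For reference, this is roughly how I verified the column family with cbt (assuming a ~/.cbtrc pointing at the project, and the same instance/table names as above; the row key and value here are just placeholders):

```shell
# List the table's column families to confirm 'cf' exists
cbt -instance=my-instance ls TEST_EXPORT_NEW

# Write a test cell into the 'cf' family, then read it back
cbt -instance=my-instance set TEST_EXPORT_NEW row1 cf:value=test
cbt -instance=my-instance read TEST_EXPORT_NEW
```

Both the write and the read succeed, and `ls` shows 'cf' as expected.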
I have also confirmed I've added the IAM roles called out in the docs (https://cloud.google.com/bigquery/docs/export-to-bigtable#required_roles). I'm not sure what I'm doing wrong here, so any suggestions or ideas would be welcome.
Thanks,
Piyush