You are correct: the LIST operation is like a `SELECT * FROM Table WHERE <filter>;` query, so it returns a list of results. To insert a row into BigQuery you would use the Create operation (in a loop for many rows).
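To make the analogy concrete, here is a minimal standalone sketch using an in-memory SQLite table as a stand-in for a BigQuery table (this is just an illustration of the LIST-vs-Create semantics, not connector code):

```python
import sqlite3

# In-memory table standing in for a BigQuery table (illustration only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER, title TEXT)")

# Create ~ INSERT: it adds one row per call, so inserting many rows means a loop.
rows = [(1, "T-Shirt"), (2, "Mug"), (3, "Sticker")]
for row in rows:
    conn.execute("INSERT INTO products (id, title) VALUES (?, ?)", row)

# LIST ~ SELECT * FROM products WHERE <filter>: it returns a list of matching rows.
result = conn.execute("SELECT * FROM products WHERE id > 1").fetchall()
print(result)  # [(2, 'Mug'), (3, 'Sticker')]
```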
Here is a flow that sends the Shopify Product object to both Sheets and BigQuery. I have a couple of extra branches that LIST the existing data in Sheets and in BigQuery, so I can view the before and after of my inserts in the execution logs. These are completely optional and can be removed if you want. I've found it helpful to do a LIST operation just to get some sample data in the logs; that helps me figure out the business meaning of each field I need to map (and the output can serve as the structure for my Data Transformer task's script).
You will see that the BigQuery connector is in the second integration flow to the right, and it is called by the For Each Loop (ID: 17). There is a way to do batch updates with a job, etc., but I wanted something simple while I was developing this, so I used a loop with the Create operation on the BigQuery connector, which inserts one row at a time. Also note that the sub-integration that writes a row to BigQuery has two triggers. This is also optional. The recommended trigger to use is the Private Trigger; the API Trigger is there in case I ever want to call this sub-integration as an API from some external code in the future. A Private Trigger can only be called by another integration in the same project, while an API Trigger always creates a public API secured by IAM (which can be called from an Apigee proxy, for example).
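In standalone code, the For Each Loop + Create pattern looks roughly like this. This is a minimal sketch: `fetch_products` and `create_row` are hypothetical stand-ins for the Shopify LIST operation and the BigQuery connector's one-row-at-a-time Create operation.

```python
from typing import Dict, List

# Hypothetical stand-in for the Shopify connector's LIST operation.
def fetch_products() -> List[Dict]:
    return [
        {"id": 101, "title": "T-Shirt", "vendor": "Acme"},
        {"id": 102, "title": "Mug", "vendor": "Acme"},
    ]

# Hypothetical stand-in for the BigQuery connector's Create operation:
# it accepts exactly one row per call, hence the loop below.
bigquery_table: List[Dict] = []

def create_row(row: Dict) -> None:
    bigquery_table.append(row)

# The For Each Loop equivalent: map each product and insert it one at a time.
for product in fetch_products():
    row = {"product_id": product["id"], "name": product["title"]}
    create_row(row)

print(len(bigquery_table))  # 2
```

A batch load job would be more efficient at volume, but the per-row loop is easier to set up and debug during development.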
Here are the Shopify connection task configurations:
I used the new Data Transformer task (Preview) for all of my main data mappings from Shopify to the Sheets and BigQuery formats. If there is interest, I can share those details as well. I found the Shopify data structure a bit challenging because some of its JSON substructures contained stringified JSON, so I had to add an extra parseJson call for those substructures.
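To show what "stringified JSON inside JSON" means, here is the same double-parse in plain Python (the `options` field name is hypothetical; in the Data Transformer task, parseJson plays the role of the second `json.loads`):

```python
import json

# A payload where one substructure is itself a JSON *string*, not a nested object.
payload = '{"id": 101, "title": "T-Shirt", "options": "{\\"size\\": \\"M\\", \\"color\\": \\"blue\\"}"}'

product = json.loads(payload)             # first parse: the outer object
options = json.loads(product["options"])  # second parse: the stringified substructure

print(options["size"])  # M
```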
Hope that helps!


