Can someone provide details on how to upload a CSV automatically to BigQuery every 5 minutes?
Or should this data be stored in Google Cloud Storage initially, then streamed to BQ?
Please note that my intent is to use Google Sheets for data processing.
Where will the CSV data be hosted? Is it in a file system on premises? Linux vs. Windows vs. something else? One could always schedule a cron job and use the `bq` command-line tool to load the new CSV.

I think the right answer would be to look at the puzzle holistically. Maybe paint the complete picture of where the data is originally coming from, where you want it to end up, and other considerations. As you mention, BQ has streaming-insert capability as well as job ingestion.
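To make the cron + `bq` suggestion concrete, here is a minimal sketch. The dataset, table, and file path are placeholders, and it assumes the Google Cloud SDK is installed and authenticated on the machine holding the CSV:

```shell
# Hypothetical crontab entry: run the load script every 5 minutes
# */5 * * * * /opt/scripts/load_csv.sh

# load_csv.sh -- load a local CSV into BigQuery as a batch load job
# mydataset.mytable and the file path below are placeholder names
bq load \
  --source_format=CSV \
  --skip_leading_rows=1 \
  --autodetect \
  mydataset.mytable \
  /data/incoming/latest.csv
```

Note that by default `bq load` appends rows to the table on each run; add `--replace` if each CSV is a full snapshot rather than new rows.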