Hi all,
I want to break up a main table by ‘advertiser’ into separate tables in a different project in BigQuery.
I recently attempted to migrate this 27 GiB table from one project to another by splitting it into separate tables with a Python script using pandas. I ran the script overnight, and the cost came to 97 TiB worth of query processing.
Basically, the script does a double for loop: it runs a SELECT statement for each “day” and each “advertiser” in that 27 GiB of data (28 million rows).
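Roughly, the script looked like this (simplified; the table, column names, and values here are placeholders, not my real ones):

```python
from itertools import product

# Placeholder table name -- my real table lives in the source project.
TABLE = "source_project.dataset.main_table"

def build_query(day: str, advertiser: str) -> str:
    """One SELECT per (day, advertiser) pair -- this is what I ran."""
    return (
        f"SELECT * FROM `{TABLE}` "
        f"WHERE date = '{day}' AND advertiser = '{advertiser}'"
    )

days = ["2023-01-01", "2023-01-02"]          # really: every day in the table
advertisers = ["advertiser_a", "advertiser_b"]  # really: every advertiser
for day, adv in product(days, advertisers):
    sql = build_query(day, adv)
    # I then ran each query with pandas and wrote the result
    # out to its own table in the destination project.
```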
I’m wondering why the cost is so high and so different from what I expected for 27 GiB. (My for loop basically partitioned the data, so the total should be the same…) The documentation suggested it would cost less than 5 USD.
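My suspicion, as back-of-envelope math (the day and advertiser counts below are made up, since I haven’t shared my real ones): if each SELECT is billed for scanning the full table rather than just the matching rows, the bytes multiply out to roughly the bill I saw.

```python
# Back-of-envelope: on-demand BigQuery billing charges per bytes scanned,
# and a query over an unpartitioned table scans the whole table.
TABLE_GIB = 27     # size of the main table
DAYS = 365         # hypothetical: number of days in the table
ADVERTISERS = 10   # hypothetical: number of advertisers

queries = DAYS * ADVERTISERS
scanned_tib = queries * TABLE_GIB / 1024
print(f"{queries} queries x {TABLE_GIB} GiB = {scanned_tib:.0f} TiB scanned")
# With these made-up counts this lands near the 97 TiB I was billed for.
```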
Alternatively, if I ran the query in the console (not with my script), just a
“select * from table where …”, with the query settings set to write the results to a new table with a custom name, would it cost less than 5 USD, or would I face the same 97 TiB cost?
Best wishes,
jnschan