Hi everyone,
I’m migrating data from Postgres (Cloud SQL) to BigQuery using federated queries, and I’m running into issues with large numeric values. Some of my numeric columns contain extremely large values: numbers that can run to 50+ digits, which can’t be stored conventionally in a BIGNUMERIC column.
According to Google’s Blockchain Analytics documentation (UINT256 Handling | Blockchain Analytics | Google Cloud), one approach is to store such large numbers as a STRING to preserve precision and use UDFs for calculations. However, this approach (1) hurts performance, because the UDFs are written in JavaScript, and (2) makes every calculation awkward and limited.
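For context, the core of that STRING + JS UDF pattern boils down to something like the sketch below (the function name and sample values are my own, not from the doc): the UDF parses the decimal strings as BigInt, does the arithmetic, and returns the result as a string so no precision is lost.

```javascript
// Sketch of the logic a BigQuery JS UDF would run for the STRING-based
// approach: parse decimal strings as BigInt, compute, and return the
// result as a string to preserve full precision.
// (Hypothetical example, not taken from Google's documentation.)
function addBigStrings(a, b) {
  return (BigInt(a) + BigInt(b)).toString();
}

// A 51-digit value, well beyond BIGNUMERIC's range:
const big = "123456789012345678901234567890123456789012345678901";
console.log(addBigStrings(big, "1"));
// → 123456789012345678901234567890123456789012345678902
```

This works, but it illustrates the pain point: every operator (comparison, SUM, division, etc.) needs its own UDF, and each call pays the JS-engine overhead.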
I’m curious whether there are alternative approaches that allow storing and computing on extremely large numeric values (50+ digits) without converting them to strings or splitting them across multiple columns (chunks).
Any insights would be very helpful.