Handling Large Postgres Numeric Values When Migrating to BigQuery via Federated Query

Hi everyone,

I’m migrating data from Postgres (Cloud SQL) to BigQuery using Federated Query, and I’m running into issues with large numeric values. Some of my numeric columns contain extremely large values: numbers of 50+ digits, which can’t be stored in a BIGNUMERIC column.
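To give a concrete picture of what I’m doing, here is roughly the shape of the federated query, pulling the oversized column as text on the Postgres side so no digits are lost in transit (the connection ID, table, and column names are placeholders, not my real ones):

```sql
SELECT *
FROM EXTERNAL_QUERY(
  'my-project.us.my-cloudsql-conn',
  '''SELECT id, big_value::text AS big_value_str FROM ledger'''
);
```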

According to Google’s Blockchain Analytics documentation (UINT256 Handling | Blockchain Analytics | Google Cloud), one approach is to store such large numbers as a STRING to preserve precision and use UDFs to do calculations. However, this approach (1) hurts performance because the UDFs are written in JavaScript, and (2) makes every calculation more complicated and limited.
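For clarity, this is the kind of pattern I mean. The function below is just an illustrative sketch I wrote for this post (not one of the UDFs from the Blockchain Analytics docs): it adds two non-negative integers stored as decimal strings, digit by digit, so no precision is lost, but every operation has to be hand-rolled like this.

```sql
-- Illustrative JS UDF: add two non-negative integers encoded as decimal strings.
CREATE TEMP FUNCTION big_add(a STRING, b STRING)
RETURNS STRING
LANGUAGE js AS r"""
  let i = a.length - 1, j = b.length - 1, carry = 0, out = [];
  while (i >= 0 || j >= 0 || carry > 0) {
    const da = i >= 0 ? a.charCodeAt(i--) - 48 : 0;  // digit of a, or 0 if exhausted
    const db = j >= 0 ? b.charCodeAt(j--) - 48 : 0;  // digit of b, or 0 if exhausted
    const s = da + db + carry;
    out.push(String.fromCharCode(48 + (s % 10)));
    carry = Math.floor(s / 10);
  }
  return out.reverse().join('');
""";

SELECT big_add('98765432109876543210987654321098765432109876543210',
               '12345678901234567890123456789012345678901234567890') AS total;
```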

I’m curious if there are alternative approaches that allow storing and computing on extremely large numeric values (50+ digits) without converting them to strings or splitting the values into multiple columns (chunks).

Any insights would be very helpful

Hi @JB123,

This is currently a limitation of BigQuery’s BIGNUMERIC data type when handling very large integer values such as yours. There is an existing feature request in Google’s public issue tracker regarding this; you can subscribe to it and check the release notes and documentation for the latest updates.

While you are correct about the potential impacts of the approaches you’ve mentioned, using the STRING type remains the best option for managing large integers within BigQuery. Note, however, that the feature described in the link you’ve provided is still in Preview and may have limited support.
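Also worth noting: not every operation on string-encoded integers needs a UDF. As a small sketch (hypothetical table `txns` and column `big_value_str`), for non-negative values stored without leading zeros, ordering by length first and then lexicographically matches numeric order, so things like MAX or ORDER BY can stay in plain SQL:

```sql
SELECT big_value_str
FROM txns
ORDER BY LENGTH(big_value_str) DESC, big_value_str DESC
LIMIT 1;  -- largest value
```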