I was searching for information about Datastream and BigQuery and could not find anything that answered my question.
My question is: I created a stream with Datastream → BigQuery, and my source database was about 130 GB. But when I checked the BigQuery Analysis usage, it showed that 29 TB had been processed. Why did this happen? Nothing else in this project queries BigQuery.
Hi @marcelocastro, thanks for the hint! I was having a similar issue, and BigQuery Analysis usage indeed went down after I increased the max_staleness setting.
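For anyone landing here later: Datastream writes changes into BigQuery CDC tables, and BigQuery periodically merges those changes into the base table; each merge scans the table, which is what drives up the Analysis bytes processed. Raising max_staleness makes those merges less frequent. A minimal sketch of the statement, assuming a placeholder dataset and table name (replace `mydataset.mytable` with your Datastream destination table):

```sql
-- Allow query results to be up to 15 minutes stale,
-- reducing how often BigQuery runs the background CDC merge.
ALTER TABLE mydataset.mytable
SET OPTIONS (max_staleness = INTERVAL 15 MINUTE);
```

The trade-off is freshness: queries against the table may return data that is up to the configured interval behind the source, so pick a value that matches how current your reports actually need to be.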