We have layers of nested views that I do not have control over, as they are developed by other teams. The client's policy is to not persist any data unless it is absolutely necessary.
I read somewhere that BigQuery calculates “complexity” using multiple factors (number of views, table UDFs, nested views, number of columns) and throws this error when the complexity number exceeds a certain value.
I removed all CTEs and changed them to subqueries, reduced some columns, and I am running out of options.
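For reference, the kind of rewrite I applied looks roughly like this (the project, dataset, and column names here are placeholders, not our actual schema):

```sql
-- Before: logic expressed as a CTE
WITH recent_orders AS (
  SELECT customer_id, order_total
  FROM `my_project.sales.orders`
  WHERE order_date >= '2024-01-01'
)
SELECT customer_id, SUM(order_total) AS total_spent
FROM recent_orders
GROUP BY customer_id;

-- After: the same logic inlined as a subquery
SELECT customer_id, SUM(order_total) AS total_spent
FROM (
  SELECT customer_id, order_total
  FROM `my_project.sales.orders`
  WHERE order_date >= '2024-01-01'
)
GROUP BY customer_id;
```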
At this point I am trying to find out how to determine the BQ complexity number for my query. Is there a quota that governs that number? Any help is appreciated.
I faced a similar issue, and here is how I triaged it successfully. Hopefully this gives you some pointers:
- Check if you have self joins; these create problems during the estimation phase in BigQuery.
- Isolate each view/join by commenting out sections of the query and rerunning it until the error disappears. This way you can find the culprit.
- Try denormalising multiple tables into one so that you can avoid joins in the linked views (see the sketch after this list). This means you will have to persist the result, which is against your client's policy, but I think it is worth a try if it resolves the issue.
- Try materialized views too (a minimal example follows below); as with the previous point, this may be against your client's policy.
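A rough sketch of the denormalisation idea, assuming two hypothetical source tables (your table and column names will differ):

```sql
-- Persist a flattened table so the downstream views can read it
-- without repeating the join (all names here are placeholders).
CREATE OR REPLACE TABLE `my_project.analytics.orders_denormalised` AS
SELECT
  o.order_id,
  o.order_date,
  o.order_total,
  c.customer_name,
  c.customer_region
FROM `my_project.sales.orders` AS o
JOIN `my_project.sales.customers` AS c
  ON o.customer_id = c.customer_id;
```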
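And a minimal materialized view sketch, again with placeholder names; note that BigQuery materialized views only support a limited subset of SQL, so check whether your view logic qualifies:

```sql
-- Precompute an aggregate that the nested views keep recomputing
CREATE MATERIALIZED VIEW `my_project.analytics.daily_order_totals_mv` AS
SELECT
  order_date,
  SUM(order_total) AS total_order_value,
  COUNT(*) AS order_count
FROM `my_project.sales.orders`
GROUP BY order_date;
```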
Thank you. I have solved the problem multiple times by removing joins where possible, as you suggested, but we are now reaching the limit of what can be done.
I would prefer not to rely on trial and error, and would like to get confirmation on how this complexity is calculated and whether there is a way to see that complexity number.