Struggling with BigQuery + Looker Studio Performance and Query Queuing – Need Advice

Hi everyone,

I’m dealing with a rather unusual problem related to performance and query queuing in BigQuery, and I’m not sure how to approach it.

We’re building a tool to report spending across different platforms using BigQuery + Looker Studio. We currently have 100 reserved slots in BigQuery. Our data model includes a flat table with 80GB of data and 21 million rows, on top of which we have a view with RLS (row-level security) using joins on ID and session_user().
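For context, an RLS view built on `session_user()` joins, as described, typically looks something like this (every table and column name below is an assumption for illustration, not from the original setup):

```sql
-- Hypothetical RLS view: each user only sees rows their email is mapped to.
CREATE OR REPLACE VIEW `project.dataset.spend_rls` AS
SELECT f.*
FROM `project.dataset.spend_flat` AS f
JOIN `project.dataset.user_access` AS a
  ON f.id = a.id                      -- join on the shared ID
WHERE a.user_email = SESSION_USER();  -- filter to the logged-in user
```

Note that every query a visualization sends through a view like this scans the full 80GB flat table, which is part of why the dashboard is slow.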

To improve performance, we also created a separate table with unique values for filters only, which indeed makes the dashboard a bit faster.

However, we are still facing major performance issues. Our dashboard has 4 tabs (pages), with roughly 200 visualizations per page. When a user opens the dashboard:

  1. The filter controls load first (since they query the smaller filter table).

  2. Then the filters start applying to the rest of the data (Region, Sector, Country, Brand, Subbrand, etc.).

Every filter selection essentially triggers all 200 queries to BigQuery at once (one per visualization). As a result, we constantly hit query queues, even though we only have 4–5 users per hour on average.

The only idea that comes to mind: is it possible to delay loading the filtered visualizations until the user has confirmed all filter selections? Unfortunately, the business won't agree to reduce the number of visualizations or split them across more pages.

Has anyone dealt with a similar situation? Any ideas on how to handle this efficiently without drastically increasing slot reservations?

Thanks in advance!

What you're running into is a common dashboard concurrency and query-explosion problem in Looker Studio with BigQuery. Since each visualization triggers a separate query, every filter change fires hundreds of queries at once and quickly exhausts a 100-slot reservation. A practical approach that avoids buying more slots is to use aggregated or precomputed tables (materialized views or summary tables that already include the filterable dimensions) so each visualization scans a much smaller dataset. Another tactic is a filter-driven flow, where visualizations only refresh after the user clicks an "Apply"-style parameter control, or adding BigQuery BI Engine, an in-memory acceleration layer that can dramatically cut both latency and slot consumption. Essentially, the goal is to reduce the number and size of queries rather than rely on more slots.
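As a sketch of the precomputed-table idea: a materialized view grouped by the filter dimensions mentioned in the question would let each chart scan kilobytes instead of 80GB. Names and the `spend` measure are assumptions; BigQuery maintains the refresh automatically for aggregate views like this:

```sql
-- Hypothetical summary materialized view keyed by the dashboard's
-- filter dimensions (Region, Sector, Country, Brand, Subbrand).
CREATE MATERIALIZED VIEW `project.dataset.spend_summary_mv` AS
SELECT
  region, sector, country, brand, subbrand,
  SUM(spend) AS total_spend,   -- assumed measure column
  COUNT(*)   AS row_count
FROM `project.dataset.spend_flat`
GROUP BY region, sector, country, brand, subbrand;
```

One caveat: the row-level security join on `session_user()` would still need to be layered on top of this, either by including the access key in the GROUP BY or by wrapping the materialized view in the RLS view.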

You could try enabling extracts or cached data in Looker Studio so filter changes don't trigger 200 live queries every time. Another option is parameter controls, so visualizations reload only after users hit "Apply." If possible, push some pre-aggregations into materialized views or scheduled tables in BigQuery to reduce the live query load.
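If materialized views are too restrictive (e.g. because of the RLS view), a scheduled query that rebuilds a small summary table works just as well. A minimal sketch, with all names and the timestamp column assumed:

```sql
-- Run on a schedule (e.g. hourly) via BigQuery scheduled queries;
-- the dashboard then reads this small table instead of the 80GB one.
CREATE OR REPLACE TABLE `project.dataset.spend_daily_summary` AS
SELECT
  DATE(event_time) AS day,     -- assumed timestamp column
  region, country, brand,
  SUM(spend) AS total_spend
FROM `project.dataset.spend_flat`
GROUP BY day, region, country, brand;
```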