Google Cloud has announced a significant update to BigQuery’s cost management settings: beginning September 1, 2025, on-demand query quotas will no longer be unlimited by default. This marks a notable shift toward more controlled and predictable billing for the cloud-based analytics platform.
What’s Changing:
- Default quota for new projects: Google Cloud projects created on or after September 1, 2025, will be subject to a default QueryUsagePerDay limit of 200 TiB of on-demand query processing; previously, there was no such limit.
- Existing “unlimited” projects: Projects currently set to “unlimited” daily usage will be moved to a custom limit, calculated from their peak usage over the past 30 days plus additional headroom to support growth.
- No change for already-customized projects: Projects that already have custom quotas (for either QueryUsagePerDay or QueryUsagePerUserPerDay) will keep their existing limits after the cut-off date.
- Audit log visibility: Organizations can monitor the quota changes in Logs Explorer; access requires the roles/logging.viewer IAM role (see the sketch below).
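For teams that want to track these changes programmatically, here is a minimal sketch using the google-cloud-logging Python client to pull recent Admin Activity audit entries. The project ID is a placeholder, and filtering on serviceusage.googleapis.com is an assumption about where quota-override changes are recorded; adjust the filter to match the entries you actually see in Logs Explorer.

```python
from google.cloud import logging

client = logging.Client(project="my-project")  # placeholder project ID

# Admin Activity audit log for the project. Filtering on the Service Usage API
# is an assumption about where quota-override changes land; verify in Logs Explorer.
log_filter = (
    'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Factivity" '
    'AND protoPayload.serviceName="serviceusage.googleapis.com"'
)

for entry in client.list_entries(
    filter_=log_filter, order_by=logging.DESCENDING, max_results=20
):
    print(entry.timestamp, entry.payload)
```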
Why the Change Matters: According to industry observers, this adjustment is a major step toward protecting users from runaway costs caused by unintentionally heavy queries. One analysis noted that, without such defaults, a single query can rapidly incur a bill exceeding $1,300.
With the new default limit, a project’s on-demand spending has a theoretical daily ceiling of roughly $1,250 at the current US list price of $6.25 per TiB, although real costs vary by region and pricing model. A recent blog post declared that the “era of unpredictable BigQuery on-demand costs is over,” framing the move as part of the broader FinOps (financial operations) strategies organizations should adopt to govern cloud spending effectively.
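The ceiling is simple arithmetic: the daily quota multiplied by the per-TiB rate. A quick sketch, assuming the US list price (rates differ by region and discount arrangements):

```python
# Rough ceiling on daily on-demand spend under the default quota.
DAILY_QUOTA_TIB = 200
PRICE_PER_TIB_USD = 6.25  # assumed US list price; check your region's rate

print(f"Maximum daily on-demand spend: ${DAILY_QUOTA_TIB * PRICE_PER_TIB_USD:,.2f}")
# Maximum daily on-demand spend: $1,250.00
```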
Expert Recommendations: Engineers and observers suggest several steps for businesses to ease the transition:
- Review your current usage patterns: Use Cloud Monitoring, the Quotas page, and audit logs to assess project- and user-level on-demand consumption (a usage-review query is sketched after this list).
- Set custom quotas proactively: Rather than waiting for the new defaults, you can already configure both project-level and user-level custom limits to match budgetary goals (a complementary per-query guardrail is sketched below).
- Monitor and alert: Configure Cloud Monitoring alerts to notify you when usage nears defined thresholds (see the alerting sketch below).
- Consider Reservations (capacity-based pricing): Workloads that routinely exceed 200 TiB per day may be better served by dedicated slot capacity via BigQuery Reservations, which is not subject to on-demand query quotas.
- Optimize queries for efficiency: Reducing the data scanned, through partitioning, clustering, table previews, and well-tuned SQL, mitigates both cost and quota consumption (a partitioning example closes out the sketches below).
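For the usage review, the INFORMATION_SCHEMA jobs views already record how much each job bills. The sketch below, using the google-cloud-bigquery Python client, sums billed bytes per user per day over the last 30 days; the project ID is a placeholder, and the region qualifier should match where your jobs run.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

# Per-user, per-day on-demand scan volume for the last 30 days.
# Adjust "region-us" to the region where your jobs run.
sql = """
SELECT
  user_email,
  DATE(creation_time) AS usage_date,
  ROUND(SUM(total_bytes_billed) / POW(1024, 4), 2) AS tib_billed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND job_type = 'QUERY'
GROUP BY user_email, usage_date
ORDER BY tib_billed DESC
"""

for row in client.query(sql).result():
    print(row.user_email, row.usage_date, row.tib_billed)
```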
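Custom quotas themselves are set on the Quotas page in the console; a complementary guardrail you can apply per query today is the maximum bytes billed setting, which fails any job that would bill more than the configured amount. A minimal sketch with the Python client, using a placeholder project and a public sample table:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

# Fail any query that would bill more than 1 TiB, independent of project quotas.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=1 * 1024**4)

job = client.query(
    "SELECT corpus, COUNT(*) AS n "
    "FROM `bigquery-public-data.samples.shakespeare` GROUP BY corpus",
    job_config=job_config,
)
for row in job.result():
    print(row.corpus, row.n)
```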
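For alerting, one option is a Cloud Monitoring alert policy on BigQuery’s scanned-bytes metric. The sketch below uses the google-cloud-monitoring Python client; the metric name, the 150 TiB threshold, and the 24-hour alignment window are assumptions chosen to illustrate the shape of the policy, and notification channels are omitted.

```python
from google.cloud import monitoring_v3

PROJECT_ID = "my-project"  # placeholder project ID

client = monitoring_v3.AlertPolicyServiceClient()

# Alert when billed scan volume over a rolling 24h window exceeds 150 TiB
# (75% of the 200 TiB default quota). The metric name is an assumption;
# confirm it in Metrics Explorer before relying on this policy.
policy = monitoring_v3.AlertPolicy(
    display_name="BigQuery on-demand scans nearing daily quota",
    combiner=monitoring_v3.AlertPolicy.ConditionCombinerType.OR,
    conditions=[
        monitoring_v3.AlertPolicy.Condition(
            display_name="Billed bytes above 150 TiB per 24h",
            condition_threshold=monitoring_v3.AlertPolicy.Condition.MetricThreshold(
                filter='metric.type="bigquery.googleapis.com/query/scanned_bytes_billed"',
                comparison=monitoring_v3.ComparisonType.COMPARISON_GT,
                threshold_value=150 * 1024**4,
                duration={"seconds": 0},
                aggregations=[
                    monitoring_v3.Aggregation(
                        alignment_period={"seconds": 86400},
                        per_series_aligner=monitoring_v3.Aggregation.Aligner.ALIGN_SUM,
                    )
                ],
            ),
        )
    ],
)

created = client.create_alert_policy(name=f"projects/{PROJECT_ID}", alert_policy=policy)
print(created.name)
```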
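Finally, on the optimization side, partitioning and clustering are among the highest-leverage ways to cut scanned bytes. A sketch with hypothetical table and column names, again via the Python client:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

# Rebuild a hypothetical events table so that queries filtering on event_date
# and customer_id scan only the relevant partitions and clustered blocks.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.events_partitioned
PARTITION BY event_date
CLUSTER BY customer_id AS
SELECT * FROM analytics.events_raw
"""
client.query(ddl).result()
```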
Broader context and industry response: The quota shift reflects growing concern about runaway cloud analytics costs. Analysts note that flexible pricing models like on-demand are powerful but financially unpredictable without guardrails.
Quotas, reserved capacity, and FinOps best practices are becoming integral to responsible cloud usage. Community voices on platforms like Reddit have underscored the utility of daily query volume caps as a safety net—particularly for new users or teams experimenting with large datasets.
This update marks a pivotal shift in BigQuery’s cost governance philosophy—moving away from “pay-as-you-go without limits” toward a more structured, predictable model. Organizations are encouraged to act now to tailor quotas that suit their data strategy and budget objectives.