Google BigQuery - TooManyRequests 429 POST Quota exceeded error - google-bigquery

I am working with big data via Google BigQuery; my datasets often have 300 million rows. I am not entirely sure which quota is being exceeded, or what I can do about it. Any help will be appreciated.

You should be able to find which BigQuery quota is being hit in Monitoring. View the instructions here.
Once you find the quota you are hitting, you can request an increase. Check the process here.
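In the meantime, if the 429s come from streaming inserts, you can retry them client-side with backoff. Below is a minimal sketch, assuming a Python workload using the google-cloud-bigquery client; the table ID and row payload are placeholders, not details from your project.

```python
import random
import time

from google.api_core.exceptions import TooManyRequests
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.my_dataset.my_table"  # hypothetical table, replace with yours

def insert_with_backoff(rows, max_attempts=5):
    """Stream rows into BigQuery, backing off exponentially on 429 responses."""
    for attempt in range(max_attempts):
        try:
            errors = client.insert_rows_json(table_id, rows)
            if not errors:
                return
            raise RuntimeError(f"Row-level insert errors: {errors}")
        except TooManyRequests:
            # Rate/quota limit hit: sleep 2^attempt seconds plus jitter, then retry.
            time.sleep(2 ** attempt + random.random())
    raise RuntimeError("Insert still rate-limited after retries")

insert_with_backoff([{"user_id": 1, "event": "click"}])
```

Backoff only helps with transient rate limits; if a daily quota is exhausted, you still need the quota increase described above.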

Related

Quota exceeded: Your project exceeded quota for free storage for projects

I am using the BigQuery Sandbox (Free Plan) with the Google Analytics integration. Two weeks ago I got the error "Quota exceeded". I removed all tables, but it doesn't help; the load job details still show "Quota exceeded: Your project exceeded quota for free storage for projects."
Should I wait one month for the quota to be refreshed? Or is the only solution to set up billing and use a paid plan? I want to keep the Free plan.

Google Pub/Sub + Cloud Run scalability

I have a Python application writing Pub/Sub messages into BigQuery. The Python code uses the google-cloud-bigquery library, and the TableData.insertAll() method quota is 10,000 requests per second per table (Quotas documentation).
Cloud Run container autoscaling is set to 100 instances with 1,000 requests per container. So technically, I should be able to reach 10,000 requests/sec, right? With the BQ insert API being the biggest bottleneck.
I only get a few hundred requests per second at the moment, with multiple services running at the same time.
CPU and RAM are at 50%.
Having confirmed your project structure and a few details given in the comments, I would review the Pub/Sub quotas and limits, especially the Quota and Resource limits tables, where you can check this information depending on message size; the Throughput quota units section tells you how to calculate quota usage.
I would answer your question with a yes: you should be able to reach 10,000 req/sec. And, as in this question, depending on the byte size you can put up to 10,000 rows in a single insert, although the recommendation is 500.
Cloud Run concurrency can be modified in case you need to change it.
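For reference, here is a rough sketch of that batching pattern with the google-cloud-bigquery client; the table ID and row shape are hypothetical, and 500 rows per call simply follows the recommendation above.

```python
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.my_dataset.events"  # hypothetical table

def insert_in_batches(rows, batch_size=500):
    """Split rows into ~500-row chunks and stream each chunk in one insertAll call."""
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        errors = client.insert_rows_json(table_id, batch)
        if errors:
            # Failed rows come back with their index and error details.
            print(f"Batch starting at row {start} had errors: {errors}")

insert_in_batches([{"user_id": i, "event": "click"} for i in range(2000)])
```

Batching this way reduces the number of insertAll requests per second, which is usually what matters against the per-table request quota.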

Google big query API returns “too many free query bytes scanned for this project”

I am having the exact same issue as this person here:
Google big query API returns "too many free query bytes scanned for this project"
But I don't understand how it applies to me. I have a Power BI Dataflow getting data from BigQuery about 8 times a day (it's Firebase Analytics data, so it's quite a lot of data). It was working fine until the start of today. I get the following error message:
Quota exceeded: Your project exceeded quota for free query bytes
scanned. For more information, see
https://cloud.google.com/bigquery/troubleshooting-errors
However, when I try to find out which quota I'm exceeding, I really can't seem to find it in the billing section of the Cloud Console. Also, I'm on the Blaze plan, which I thought would just charge me extra when I use more.
Any help is appreciated.
Thanks!
Jaap
If you don't enable billing, the quota for your project usage will not show up in the console.
You need to enable billing; without it you have a free limit of 1 TB of query data scanned per month.
With billing enabled you can query more than that free allotment.
For a complete overview of the free tier quotas, see Free operations.
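If you want to see how much a query would count against that free tier before running it, a dry run reports the bytes it would scan. A minimal sketch with the Python client follows; the query and table name are made up for illustration.

```python
from google.cloud import bigquery

client = bigquery.Client()

# dry_run=True asks BigQuery to estimate the query without executing it;
# disabling the cache shows the worst-case bytes scanned.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

query = "SELECT event_name FROM `my-project.analytics_123456.events_*`"  # hypothetical table
job = client.query(query, job_config=job_config)

print(f"This query would scan {job.total_bytes_processed / 1e9:.2f} GB")
```

Dry runs themselves are free, so they are a cheap way to spot the queries that are eating the monthly allotment.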

Using the BigQuery sandbox, how to check free Quota usage?

I started using the new free BigQuery sandbox. I know the limitation is 1 TB/month, but how do I check how much I have used already? My project has no billing.
Thanks
It is a bug in the system; quota usage doesn't show up if you don't enable billing.
"Exceeded quota: too many free query bytes scanned for this project" in Google BigQuery
You can still use the Cloud Console to reach the Quotas details:
https://console.cloud.google.com/iam-admin/quotas?project=your_project
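Another option (an assumption on my part, not something the answer above mentions) is to add up the bytes your query jobs have scanned this month from INFORMATION_SCHEMA; adjust the region qualifier to match your datasets' location.

```python
from google.cloud import bigquery

client = bigquery.Client()

# JOBS_BY_PROJECT lists recent jobs (roughly the last 180 days) for the project.
sql = """
SELECT
  SUM(total_bytes_processed) / POW(1024, 4) AS tib_scanned_this_month
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'QUERY'
  AND creation_time >= TIMESTAMP_TRUNC(CURRENT_TIMESTAMP(), MONTH)
"""

for row in client.query(sql):
    print(f"Scanned this month: {row.tib_scanned_this_month or 0:.3f} TiB")
```

The total won't match Google's quota accounting exactly (cached queries and minimum billing sizes differ), but it gives a usable month-to-date estimate without billing enabled.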

'TRANSACTION_ROLLBACK' and 'INSUFFICIENT_TOKENS' errors API transfers from adwords to bigquery

I'm using the BigQuery Data Transfer Service for Google AdWords to get AdWords data into my data warehouse built in google-bigquery.
For some of the transfers I get the 'TRANSACTION_ROLLBACK' and 'INSUFFICIENT_TOKENS' errors, with the message "Will retry later".
After waiting a couple of weeks, the errors still persist, though.
I've got 10+ really large AdWords accounts, so the amount of data is probably larger than for most AdWords users.
I can't find any information on how to increase any potential limit on the amount of data being transferred, or any information on the errors. Has anyone had the same problem and found a solution?