Does anyone know what the rate limit is for the AdWords API Targeting Ideas service?
They mention some rate limits in the documentation, but there don't seem to be any details about queries per second.
I've not been able to get an official answer on this, but support have told me that the rate limit on the Targeting Ideas service varies with how busy the service is. My experience confirms this.
Also: the recommended backoff of 30 seconds returned in the rate-limit error message is almost never enough. I've ended up defaulting to a 2 minute backoff for this service.
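For what it's worth, here is a minimal retry sketch along those lines. The RateExceededError placeholder and the fetch_ideas callable stand in for whatever your AdWords client library actually exposes; only the 2-minute backoff comes from my experience above.

```python
import time

BACKOFF_SECONDS = 120  # the suggested 30s is rarely enough; default to 2 minutes
MAX_ATTEMPTS = 5

class RateExceededError(Exception):
    """Placeholder; substitute the rate-limit exception your AdWords client raises."""

def with_backoff(fetch_ideas, selector):
    """Run a Targeting Ideas request, sleeping 2 minutes on each rate-limit error."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        try:
            return fetch_ideas(selector)  # your actual service call goes here
        except RateExceededError:
            if attempt == MAX_ATTEMPTS:
                raise
            time.sleep(BACKOFF_SECONDS)
```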
During our implementation, our implementation partner said that Workday can only handle 10 calls per second and drops any beyond that. Has anyone run into this issue, and is there a way around it? We have a number of systems that want to use API integrations, but we have avoided them because of this problem.
If you have access to the documentation, you can find the limits for RaaS/REST/SOAP here: Reference: Integrations and Web Service Limits
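If the 10-calls-per-second figure holds for your tenant, one workaround is to throttle on the client side so you never exceed it. A minimal sketch, assuming the limit is as your partner described; the real value should come from the reference above, and do_request stands in for your actual REST/SOAP call.

```python
import threading
import time

class RateLimiter:
    """Block callers so that at most max_calls start within any period seconds."""
    def __init__(self, max_calls=10, period=1.0):
        self.max_calls = max_calls
        self.period = period
        self.lock = threading.Lock()
        self.calls = []  # timestamps of recent calls

    def acquire(self):
        while True:
            with self.lock:
                now = time.monotonic()
                # drop timestamps that have aged out of the window
                self.calls = [t for t in self.calls if now - t < self.period]
                if len(self.calls) < self.max_calls:
                    self.calls.append(now)
                    return
                wait = self.period - (now - self.calls[0])
            time.sleep(max(wait, 0.01))

limiter = RateLimiter(max_calls=10, period=1.0)

def call_workday(do_request):
    """Wrap any Workday call so the client never exceeds the advertised rate."""
    limiter.acquire()
    return do_request()
```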
We use the REST API to check the last 20 transactions for a specific user.
What is the maximum number of requests per second we can make using the Elrond REST API?
The rate limits for the official API aren't publicly known, as far as I'm aware.
If you plan to make a lot of requests each second, you might want to consider setting up your own observer squad and API so you can be independent of the Elrond infrastructure. This not only gives you greater control over response times (and downtime), but also reduces the load on the official servers so that others aren't affected by the volume of requests you make.
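Switching between the two is essentially a base-URL change in your client. A minimal sketch, assuming the transactions endpoint shape of the public api.elrond.com service; the self-hosted URL is a placeholder for wherever you deploy your own observer-backed API.

```python
import requests

OFFICIAL_API = "https://api.elrond.com"
SELF_HOSTED_API = "http://my-observer-api.internal:3001"  # placeholder for your deployment

BASE_URL = SELF_HOSTED_API  # flip this once your own infrastructure is up

def last_transactions(address, count=20):
    """Fetch the most recent transactions for an address."""
    resp = requests.get(
        f"{BASE_URL}/accounts/{address}/transactions",
        params={"size": count},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```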
We are using the Google Calendar API to keep our app in sync with events in our users' Google calendars.
We have started regularly getting rate-limiting errors (403).
However, our usage according to the APIs & Services page of the Google Cloud console is well below the stated limits (10,000 queries per minute and 600 per user per minute). We are also using the batch API to send our requests, so we cannot implement exponential backoff.
Does anyone have any advice on avoiding these rate-limiting errors?
Rate-limiting errors from Google are basically flood protection: you are going too fast. Don't put too much stock in what the Google developer console shows; the numbers in those graphs are guesstimates at best, and they are not real-time.
The main cause of rate-limit errors is that when you send a request there is no way of knowing which server it will run on, and no way of knowing what other requests are running on that same server. Your request may therefore run faster or slower than you expect, which makes it hard to pin down what "10,000 queries per minute and 600 per user per minute" actually means in practice.
10,000 requests run against an overloaded server may take 2 minutes, while on a server that is not overloaded they could complete in 30 seconds, meaning the next requests you send will blow out the quota.
Since there is really no way of avoiding it, you should just ensure that your application is capable of responding by sending the request again. I wrote an article a number of years ago about how I track my requests locally in my application to keep things at the right speed: flood buster.
Really, as long as your application responds by resending the request, you should be OK.
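A minimal retry sketch in that spirit, assuming execute_batch is your own wrapper that sends the whole batch and returns the HTTP status and parsed body; it retries 403/429 responses with exponential backoff plus jitter, which is the pattern Google generally recommends.

```python
import random
import time

def send_with_backoff(execute_batch, max_retries=5):
    """Execute a batch request, retrying rate-limited responses with backoff + jitter."""
    for attempt in range(max_retries + 1):
        status, body = execute_batch()  # your call; returns (HTTP status, parsed body)
        if status not in (403, 429):
            return body
        if attempt == max_retries:
            raise RuntimeError("still rate limited after %d retries" % max_retries)
        time.sleep(2 ** attempt + random.random())  # 1s, 2s, 4s... plus jitter
```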
Does anybody know whether there are any API call limits per second, hour, or day for the Tumblr API? It seems to me that limits do exist, as I hit errors when I make a lot of API calls in a short period via OAuth. However, I couldn't find any documentation about this on the Tumblr API website or via Google. Many thanks.
I have been using the Tumblr API for about 2 years now, and I must admit that the "Rate Limit Exceeded" issue has no deterministic and, more importantly, no officially confirmed answer.
In Tumblr's API Agreement you can find some reference to limitations under the section "Respect for Limitations", which says:
In addition, you will comply with any limitations imposed by Tumblr on the frequency of access, calls and use of the Tumblr API and Tumblr Firehose
We ask that you respect these limitations, as well as any rate limits that we may place on actions, which are designed to protect our systems
Notes:
There is a special Tumblr tagged blog, "rate-limit-exceeded", dedicated to this. However, it does not say much about the number of requests per period of time that affected people were making when they hit the problem.
For example, here you can find an average of 1,000 requests per minute reported as the limit.
As for my application, the request rate is approximately 1 request per second (a simple way to hold a client to such a rate is sketched below). The application has been running 24/7 for about a year already. This issue still occurred several times even at this relatively low rate, but I consider the failure rate insignificant.
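A minimal pacing sketch matching the ~1 request per second rate described above; the call argument stands in for your actual OAuth-signed Tumblr request.

```python
import time

MIN_INTERVAL = 1.0  # seconds between requests, matching the ~1 req/s rate above
_last_call = 0.0

def paced(call):
    """Run a Tumblr API call, never starting sooner than MIN_INTERVAL after the last."""
    global _last_call
    wait = MIN_INTERVAL - (time.monotonic() - _last_call)
    if wait > 0:
        time.sleep(wait)
    _last_call = time.monotonic()
    return call()
```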
From: https://www.tumblr.com/oauth/apps
Newly registered consumers are rate limited to 1,000 requests per hour, and 5,000 requests per day.
If you go to that link, it looks like you can get the rate limit removed if you ask nicely! :)
I have searched for answers on this for 2 days now with very little luck.
I'm developing a Drupal 7 site that has a Geofield field auto-populated from an address field using the Google Geocoder API, but as of a couple of days ago this stopped working:
Exception: Google API returned bad status.\nStatus: OVER_QUERY_LIMIT in geocoder_google() (line 52 of /home/.../modules/geocoder/plugins/geocoder_handler/google.inc).
I can remove the proximity search filter that is sending too many requests to the Google API, but I can't make progress because I run into the above error every time I try to add a new record to the database (which does just a single lookup to geocode an address field, yet still fails). Is there any way to unblock my site from Google's API or reset my usage? I've added an API key, but to no avail. This was all working fine until very recently, which I guess is when I unknowingly exceeded the usage limit.
I have limited API experience and am a Drupal/PHP beginner, so please be gentle! Happy to provide more info, code, error messages, etc. if needed. The relevant Drupal 7 modules in use are OpenLayers, OpenLayers Proximity, Geofield, GeoPHP, and Geocoder. Thanks for any help anyone can offer.
From the Google Geocoding API documentation:
Use of the Google Geocoding API is subject to a query limit of 2,500 geolocation requests per day. (Users of Google Maps API for Business may perform up to 100,000 requests per day.) This limit is enforced to prevent abuse and/or repurposing of the Geocoding API, and this limit may be changed in the future without notice. Additionally, we enforce a request rate limit to prevent abuse of the service. If you exceed the 24-hour limit or otherwise abuse the service, the Geocoding API may stop working for you temporarily. If you continue to exceed this limit, your access to the Geocoding API may be blocked.
So I guess you have to wait 24 hours, or upgrade to the Business version.
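Going forward, it also helps to make the lookup fail softly when the quota is hit instead of throwing. A minimal sketch against the Geocoding web service (the OVER_QUERY_LIMIT status comes from the documentation quoted above; the API key is a placeholder, and the same pattern ports directly to the PHP in the geocoder module):

```python
import time
import requests

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

def geocode(address, api_key, retries=3):
    """Geocode an address, backing off briefly on OVER_QUERY_LIMIT."""
    for attempt in range(retries):
        data = requests.get(
            GEOCODE_URL, params={"address": address, "key": api_key}, timeout=10
        ).json()
        status = data["status"]
        if status == "OK":
            return data["results"][0]["geometry"]["location"]  # {"lat": ..., "lng": ...}
        if status == "OVER_QUERY_LIMIT":
            # Could be the short-term rate limit or the daily cap; a short sleep
            # only helps with the former, so give up after a few tries.
            time.sleep(2 ** attempt)
            continue
        raise RuntimeError("Geocoding failed: " + status)
    return None  # daily quota likely exhausted; handle gracefully upstream
```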