Google Sheets v3 API usage limits - google-sheets-api

Although usage limits for v4 of the Google Sheets API are clearly documented, I cannot seem to find the equivalent information for v3.
To clarify the context, I have inherited a system that relies on a sheet being published to the web, with its data then retrieved client-side in the browser via an endpoint like this: https://spreadsheets.google.com/feeds/list/{spreadsheet ID}/1/public/values?alt=json (no API key is used).
The system is expected to come under moderate load for a short period (e.g. 100,000+ hits in 24 hours) in the near future, and I'm trying to work out whether I need to be wary of access to the API being denied.
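For illustration, here is a minimal sketch of the same retrieval done in Python with the requests library (the real system does the equivalent fetch client-side in the browser); the spreadsheet ID is a placeholder:

```python
# Minimal sketch: fetch the public v3 "list feed" of the first worksheet as JSON.
# SPREADSHEET_ID is a placeholder; the real system uses its own published sheet ID.
import requests

SPREADSHEET_ID = "your-spreadsheet-id"
FEED_URL = (
    "https://spreadsheets.google.com/feeds/list/"
    f"{SPREADSHEET_ID}/1/public/values?alt=json"
)

resp = requests.get(FEED_URL)
resp.raise_for_status()
feed = resp.json()["feed"]

# In the v3 list feed, each sheet row is an "entry" and cell values typically
# appear under keys prefixed with "gsx$", each holding its text in "$t".
for entry in feed.get("entry", []):
    row = {k[len("gsx$"):]: v["$t"] for k, v in entry.items() if k.startswith("gsx$")}
    print(row)
```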
Thanks in advance!

Related

How to get third-party API up-to-date?

I ran into this problem once myself. I had delivered a website that used the SoundCloud API. Everything worked properly: content was extracted from the JSON and placed in the layout of the website. However, one day I received an email from the owner of the website saying that the site no longer worked properly. I investigated and concluded that the "problem" was not on my side but on SoundCloud's. Looking at SoundCloud's API page, I found that the API had received a major update, which broke the link between SoundCloud and the site.
Lately I've been trying many new APIs too, including those from Instagram and Dribbble. I'm therefore wondering whether it is at all possible to reduce such problems in the future, or whether it would make sense to monitor the API pages of these third-party services?
There's no "right" answer. After many years of using and maintaining many APIs here are some of the conclusions I've come to:
The best providers let you work with a specific version of their API whose interface and expected behavior never changes. They might release bug fixes and new endpoints, but you can be confident that as long as the API is supported it will not break your system.
A good provider will provide an end-of-life date for each version of their API. It's up to you to keep track of when you need to update.
Paid services will often be supported longer than free services. Plus, the contract/SLA will guarantee the API remains available for a specific amount of time.
The most popular APIs often have mailing lists and/or blogs. For those that offer it, sign up to be notified of updates. For those that don't you'll have to monitor their blogs or news posts. And I suggest not using any service that would drop support for an API version without warning.
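As a concrete illustration of the "pin a specific version" advice, here is a minimal sketch in Python, assuming a hypothetical provider that versions its API in the URL path and may announce end-of-life via a Sunset header (RFC 8594); the endpoint and headers are illustrative assumptions, not any particular provider's actual API:

```python
# Hypothetical sketch: pin the API version explicitly and watch for deprecation signals.
import requests

API_VERSION = "v2"                                     # pinned explicitly, never "latest"
BASE_URL = f"https://api.example.com/{API_VERSION}"    # placeholder provider

resp = requests.get(f"{BASE_URL}/tracks", params={"q": "ambient"})
resp.raise_for_status()

# Some providers signal upcoming end-of-life via a Sunset (RFC 8594) or
# Deprecation header; logging it gives you an early warning to plan an upgrade.
notice = resp.headers.get("Sunset") or resp.headers.get("Deprecation")
if notice:
    print(f"Warning: API version {API_VERSION} is scheduled for retirement: {notice}")
```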

Google Adwords Keyword Tool API to automatically extract data onto a website

I'm looking to use Google AdWords Keyword Tool data on a website. I've been looking around in the API and I can't find much to match what I need. I noticed a lot of keyword research tool websites use Google as the main source for their information. How would I go about extracting the data and having it run on a website automatically, so it wouldn't need to be updated manually each month?
You can use the Traffic Estimator service in the AdWords API:
https://developers.google.com/adwords/api/docs/reference/v201409/TrafficEstimatorService
Be warned that this is notoriously inaccurate (which is odd, given that you would think Google had its own data to call upon!).
I use the TargetingIdea service in the AdWords API to generate lists of keywords to use for building AdWords campaigns. (https://developers.google.com/adwords/api/docs/reference/v201409/TargetingIdeaService.TargetingIdea)
First off, you need an API key. They're not that easy to get, and your app needs to offer a whole lot of features to meet the required minimum functionality; take a look here: https://developers.google.com/adwords/api/docs/requirements
Once you've jumped over that hurdle, you get the data from Google by sending a request to the service. That request includes some targeting criteria, like location and language, and a "seed" keyword. You can also specify whether you want closely related results or broadly related results.
For example, if you sold tractors, you'd put 'tractors' in as a seed keyword, and the API would return either closely related terms like 'tractors for sale' and 'used tractor spares', or more broadly related terms like 'agricultural machinery'.
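For reference, here is a rough sketch of such a request using the googleads Python client library; the version string, attribute names, and response handling are illustrative and will vary with the client library and API version you end up using:

```python
# Rough sketch, not production code: ask TargetingIdeaService for keyword ideas
# related to a "seed" keyword. Field names follow the AdWords API docs, but the
# exact selector shape and response objects depend on the API/client version.
from googleads import adwords

client = adwords.AdWordsClient.LoadFromStorage("googleads.yaml")
service = client.GetService("TargetingIdeaService", version="v201409")

selector = {
    "ideaType": "KEYWORD",
    "requestType": "IDEAS",
    "requestedAttributeTypes": ["KEYWORD_TEXT", "SEARCH_VOLUME"],
    "searchParameters": [
        {
            "xsi_type": "RelatedToQuerySearchParameter",
            "queries": ["tractors"],          # the seed keyword
        },
    ],
    "paging": {"startIndex": "0", "numberResults": "50"},
}

page = service.get(selector)
for entry in page["entries"]:
    # Each entry carries a list of attribute key/value pairs (keyword text, volume, ...);
    # how you unpack it depends on the SOAP bindings of your client library version.
    print(entry)
```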

How to get unblocked after exceeding the Google Geocode API usage limit?

Have searched for answers on this for 2 days now with very little luck.
I'm developing a Drupal 7 site which has a Geofield field being autopopulated from an address field using the Google Geocoder API, but as of a couple of days ago this stopped working:
Exception: Google API returned bad status.\nStatus: OVER_QUERY_LIMIT in geocoder_google() (line 52 of /home/.../modules/geocoder/plugins/geocoder_handler/google.inc).
I can remove the proximity search filter that is sending too many requests to the Google API, but I can't progress because I run into the above error every time I try to add a new record to the database (which does just one lookup to get a geocode from an address field, but fails). Is there any way to unblock my site from Google's API or reset my usage? I've added an API key, but to no avail. This was all working fine until very recently, which I guess is when I unknowingly exceeded the usage limit.
I have limited API experience and am a Drupal/PHP beginner so please be gentle! Happy to provide more info, code, error messages etc if needed. Relevant Drupal 7 modules being used are OpenLayers, OpenLayers Proximity, Geofield, GeoPHP and Geocoder. Thanks for any help anyone can offer.
From the Google Geocoding API documentation:
Use of the Google Geocoding API is subject to a query limit of 2,500 geolocation requests per day. (Users of Google Maps API for Business may perform up to 100,000 requests per day.) This limit is enforced to prevent abuse and/or repurposing of the Geocoding API, and this limit may be changed in the future without notice. Additionally, we enforce a request rate limit to prevent abuse of the service. If you exceed the 24-hour limit or otherwise abuse the service, the Geocoding API may stop working for you temporarily. If you continue to exceed this limit, your access to the Geocoding API may be blocked.
So, I guess you have to wait 24 hours, or upgrade to the business version.
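While you wait for the quota to reset (or upgrade), it is also worth handling OVER_QUERY_LIMIT defensively so that a burst of lookups doesn't get the site throttled again. A minimal sketch, assuming the standard Geocoding web-service JSON endpoint and a simple exponential backoff:

```python
# Minimal sketch: geocode an address and back off when the daily/rate quota is hit.
import time
import requests

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

def geocode(address, api_key, max_retries=3):
    """Return (lat, lng) for an address, retrying with backoff on OVER_QUERY_LIMIT."""
    for attempt in range(max_retries):
        resp = requests.get(GEOCODE_URL, params={"address": address, "key": api_key})
        data = resp.json()
        if data["status"] == "OK":
            loc = data["results"][0]["geometry"]["location"]
            return loc["lat"], loc["lng"]
        if data["status"] == "OVER_QUERY_LIMIT":
            time.sleep(2 ** attempt)   # wait 1s, 2s, 4s, ... before retrying
            continue
        raise RuntimeError("Geocoding failed with status: " + data["status"])
    raise RuntimeError("Still over the query limit after %d attempts" % max_retries)
```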

Alternative to the deprecated Google REST web search API

I have been using the Google Web Search API for over a year now. The service was deprecated in Nov 2010 but continues to provide results to date. More recently, Google has started to enforce the 1,000 queries (?) per day limit on this deprecated service. I swear, last month I made over 10,000 API calls in one day without any errors from the service (same IP, same API key).
So I guess my question is: has anyone found an alternative yet? I know Yahoo BOSS is pretty good, but I am working exclusively with Google for my projects. I do not mind spending money for this service either, as long as I can get 64 results from Google.
On that thought, how are services like Zoomrank able to bypass all Google limits? I have a subscription with Zoomrank and I can get daily rankings for all my keywords. Do they have a tie-up with Google, or are they just accessing some secret service I don't know about?
Some people have suggested the new Google Custom Search, but I don't know how that helps me search the web. Google CS is limited to the CSE you create and searches within those engines. If I am looking for web results for 'pizza', Google CS doesn't help me.
Thanks for your input. Much appreciated
UPDATE: #ggez44 points to some official Google documentation of the solution described below here: http://support.google.com/customsearch/bin/answer.py?hl=en&answer=1210656
You can use the Google Custom Search Engine to search the entire web.
In brief:
Create a CSE that searches a single site (e.g. google.com)
In the CSE control panel's Basics section, set to "Search the entire web but emphasize certain sites"
In the Sites section, delete the single site that you added when you created the CSE
Full details here:
http://www.google.com/support/forum/p/customsearch/thread?tid=56c0bd92dda351b7&hl=en&fid=56c0bd92dda351b7000495e3f500d83f
Once that's implemented, you can enable billing in the Google API Console at a CPM of $5 (i.e. $5 per 1,000 queries), up to a total of 10,000 queries.
Google API Console: https://code.google.com/apis/console/
Pricing: https://code.google.com/apis/customsearch/v1/overview.html#Pricing
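Once the CSE is configured to search the whole web, queries can be sent to the Custom Search JSON API. A minimal sketch is below; note that the API returns at most 10 results per request, so collecting 64 results for a query means paging with the start parameter (the key and cx values are placeholders):

```python
# Minimal sketch: page through Custom Search JSON API results, 10 at a time.
import requests

API_KEY = "your-api-key"   # placeholder
CSE_ID = "your-cse-id"     # placeholder: the "cx" of the engine created above
ENDPOINT = "https://www.googleapis.com/customsearch/v1"

def web_search(query, wanted=64):
    results = []
    start = 1
    while len(results) < wanted:
        resp = requests.get(ENDPOINT, params={
            "key": API_KEY,
            "cx": CSE_ID,
            "q": query,
            "start": start,
            "num": 10,      # 10 is the maximum number of results per request
        })
        resp.raise_for_status()
        items = resp.json().get("items", [])
        if not items:
            break
        results.extend(items)
        start += len(items)
    return results[:wanted]

for item in web_search("pizza"):
    print(item["title"], item["link"])
```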

Google API Request Limit

Does anyone know where I can find Google API Request Limits for their different services?
On simulating 500+ concurrent users, it seems to fail silently fairly often (maybe 1 in 10 loads).
Any ideas?
The information is in their support resources. I am not aware of a central place, but it's all there. Searching the docs for "request limit" should usually do the trick.
The limits for the Geocoding API, for example, can be found here.
Google Maps API Web Services and Google Static Maps API limits were cut effective a few days ago. Starting October 1st, 2011, commercial web sites and apps using the Google Maps API for free receive:
a maximum of 2,500 calls/day if modified using the Styled Maps feature
a maximum of 25,000 calls/day in total
Fusion Tables are preferable to the Google Maps API alone, particularly with respect to rate limits:
Applications using the Google Fusion Tables API can send a maximum of 5 requests per second to the Google Fusion Tables server.
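To stay under a per-second cap like that from your own code, a simple client-side throttle is usually enough. A generic sketch, not tied to any particular Google client library:

```python
# Generic sketch: space out calls so no more than max_per_second are sent.
import time

class RateLimiter:
    def __init__(self, max_per_second=5):
        self.min_interval = 1.0 / max_per_second
        self.last_call = 0.0

    def wait(self):
        """Block just long enough to respect the configured request rate."""
        elapsed = time.monotonic() - self.last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()

limiter = RateLimiter(max_per_second=5)
# Call limiter.wait() immediately before each Fusion Tables request.
```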
I think they removed the limit recently: I can't even find a mention of it on documentation pages where I know for sure it used to be stated, and I read about the limit removal somewhere this summer.
Even their new EULA states that the service is not limited, but they remain free to limit it however they want at any moment.
500 concurrent users doesn't seem to be that much, though, even if limitations were in place; are you sure it's Google that's failing?