Google Places API: OVER_QUERY_LIMIT

I am getting an error telling me I have exceeded the API quota. However, I have a quota of 150,000 requests and have only used 12,000 of them. What is causing this error?
Example code:

from googleplaces import GooglePlaces

api_key = ''
google_places = GooglePlaces(api_key)
query_result = google_places.nearby_search(location="Vancouver, Canada", keyword="Subway")
for place in query_result.places:
    print(place.name)
Error Message:
googleplaces.GooglePlacesError: Request to URL https://maps.googleapis.com/maps/api/geocode/json?sensor=false&address=Vancouver%2C+Canada failed with response code: OVER_QUERY_LIMIT

The request in the error message is not a Places API request; it is a Geocoding API request:
https://maps.googleapis.com/maps/api/geocode/json?sensor=false&address=Vancouver%2C+Canada
It doesn't include any API key, which means you are limited to 2,500 geocoding requests per day. Geocoding requests also have a queries-per-second (QPS) limit of 50 queries per second, and you might be exceeding that limit as well.
https://developers.google.com/maps/documentation/geocoding/usage-limits
I'm not sure why a library that is supposed to call the Places API web service actually calls the Geocoding API web service. Maybe this is some kind of fallback for when the Places API doesn't return any results.
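Either way, if geocoding of the location string is what triggers the error, one possible workaround is to pass coordinates directly so no geocoding request is needed. This is only a sketch: it assumes the python-google-places nearby_search method accepts a lat_lng parameter, and the Vancouver coordinates below are approximate.

from googleplaces import GooglePlaces

api_key = 'YOUR_API_KEY'  # same key as in the question
google_places = GooglePlaces(api_key)

# Passing coordinates directly should skip geocoding the "Vancouver, Canada" string
query_result = google_places.nearby_search(
    lat_lng={'lat': 49.2827, 'lng': -123.1207},  # approximate centre of Vancouver
    keyword="Subway",
    radius=5000,
)
for place in query_result.places:
    print(place.name)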

For anyone looking at this post more recently: Google has made it necessary to have an API key to use its Maps APIs, so please ensure you have an API key in your program. Also keep in mind the different throttling limits.
See below for more info:
https://developers.google.com/maps/documentation/geocoding/usage-and-billing#:~:text=While%20there%20is%20no%20maximum,side%20and%20server%2Dside%20queries.
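As a minimal sketch, the same geocoding call made directly with a key attached (YOUR_API_KEY is a placeholder) would look like this:

import requests

resp = requests.get(
    "https://maps.googleapis.com/maps/api/geocode/json",
    params={"address": "Vancouver, Canada", "key": "YOUR_API_KEY"},
)
print(resp.json()["status"])  # should be "OK" rather than "OVER_QUERY_LIMIT"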

Related

How to test an API with filters

I've received documentation from a developer for an API I have to test.
I have the API URL: https://something/get-list
As per the documentation, the accepted parameters are:
token (required)
filters[is_true] – 0 or 1
I'm trying to figure out how to pass filters[is_true] as a query param in Postman. No matter what I do, the response is not filtered, and I want to test the API.
I've tried the key filters[is_true], $filter=filters[is_true], etc., but it doesn't work.
I don't have much experience with APIs and I haven't tested with filters before, so I'm not sure what to do.
Thanks!
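In case it helps to see the raw request, here is a hedged sketch of sending the bracketed key literally as a query parameter (the URL and token are placeholders, and this assumes the API expects the filters[is_true] name verbatim, as the documentation suggests):

import requests

url = "https://something/get-list"      # placeholder endpoint from the docs
params = {
    "token": "YOUR_TOKEN",               # placeholder token
    "filters[is_true]": 1,               # send the bracketed name literally, not $filter=...
}

resp = requests.get(url, params=params)
print(resp.url)     # shows the encoded query string, e.g. ...?token=...&filters%5Bis_true%5D=1
print(resp.json())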

Main difference between GET and POST api calls?

I am getting confused about the difference between GET and POST.
Can you provide a good resource or an explanation with examples?
I'm just getting started with this.
Thank you.
GET is for retrieving data via the URL - for example, example.com?tab=settings, where 'tab' is the parameter you would use to 'GET' the data.
POST sends the data in the request body rather than in the URL, which makes it better suited to sensitive data and lets you send data directly to the server (see the sketch after the list below).
Other commonly mentioned differences are:
GET requests can be cached
GET requests remain in the browser history
GET requests can be bookmarked
GET requests should never be used when dealing with sensitive data
GET requests have length restrictions
GET requests are only used to request data (not modify)
POST requests are never cached
POST requests do not remain in the browser history
POST requests cannot be bookmarked
POST requests have no restrictions on data length
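A minimal sketch contrasting the two with Python's requests library (the URL and payload are placeholders, not a real service):

import requests

# GET: parameters travel in the URL query string (visible, cacheable, bookmarkable)
r = requests.get("https://example.com/api", params={"tab": "settings"})
print(r.url)  # https://example.com/api?tab=settings

# POST: data travels in the request body, not in the URL
r = requests.post("https://example.com/api", data={"tab": "settings"})
print(r.status_code)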

How do batchUpdate calls count towards usage limits?

Calls like spreadsheets.batchUpdate and spreadsheets.values.batchUpdate can take multiple update actions in a single call.
I read about the Google Sheets API usage limits at https://developers.google.com/sheets/api/limits; however, it is not clear whether these calls count as one request or as multiple requests. Could you explain?
Thanks
For spreadsheets.batchUpdate and spreadsheets.values.batchUpdate, even when multiple requests are included in one batch request, the call counts as only one API request.
For example, when 10 requests are included in the request body of batchUpdate and that body is sent in a single call, only one API request is used.
As for the maximum number of requests in one batchUpdate, I have never investigated this, but in my experience, when I included 100,000 requests in one batchUpdate, the script worked fine.
If I misunderstood your question, I apologize.
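For illustration, a minimal sketch with the Python client (the spreadsheet ID is a placeholder and the credentials object is assumed to be obtained elsewhere); the two update actions below travel in one batchUpdate call and so count as a single request:

from googleapiclient.discovery import build

SPREADSHEET_ID = "YOUR_SPREADSHEET_ID"          # placeholder
service = build("sheets", "v4", credentials=creds)  # creds: an authorized credentials object

body = {
    "requests": [
        {"updateSpreadsheetProperties": {"properties": {"title": "Renamed"}, "fields": "title"}},
        {"addSheet": {"properties": {"title": "NewSheet"}}},
    ]
}
# One HTTP call, two update actions, one request against the quota
service.spreadsheets().batchUpdate(spreadsheetId=SPREADSHEET_ID, body=body).execute()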

Blogger API Gives Error 500 when Requesting List of Scheduled Posts

I am using Blogger API v3. When requesting a list of scheduled status posts, the API always returns error 500. First I thought it might just be my blog or my app. However, I've tested on the API's own website (try it out) on a newly created blog and it still happens. Does anyone else have this same problem? Thanks in advance.
EDIT: This assumes you already have scheduled posts in your blog, of course.
The 500 Internal Server Error is a very general HTTP status code that means something has gone wrong on the web site's server, but the server could not be more specific about what the exact problem is.
Reference: https://www.westhost.com/knowledgebase/display/WES/What+Is+A+500+error
Q: Does anyone else have this same problem?
A: Yes, I have, several times.
In fact, the 500 error does not only happen when requesting a list of posts; it can happen on every request we make with the Blogger API. AFAIK, when I make a request with the Blogger API and it returns a 500 error, it is always because multiple requests were made at almost the same time (usually because of a loop I forgot to break).
I've also encountered this error when testing straight from the Blogger API examples page. The first time I made the request it returned a 500 error, but the second time the request returned the data I asked for.
For the sample on the Blogger API site, it may just be an authentication error. As for the error from your own request, I suggest you check your code again; the request may be placed inside a loop, or you may be sending a request BEFORE the previous request has returned a response.
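For example, here is a hedged sketch that requests the scheduled posts once and backs off before retrying on a 500, instead of firing requests back-to-back (the blog ID and access token are placeholders, and the status=scheduled parameter is assumed from the Blogger v3 docs):

import time
import requests

BLOG_ID = "YOUR_BLOG_ID"                      # placeholder
ACCESS_TOKEN = "YOUR_OAUTH_ACCESS_TOKEN"      # placeholder

url = f"https://www.googleapis.com/blogger/v3/blogs/{BLOG_ID}/posts"
params = {"status": "scheduled"}
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

for attempt in range(3):
    resp = requests.get(url, params=params, headers=headers)
    if resp.status_code != 500:
        break
    time.sleep(2 ** attempt)  # wait before retrying instead of hammering the API

if resp.ok:
    print([post.get("title") for post in resp.json().get("items", [])])
else:
    print("Request failed with status", resp.status_code)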

Official way to fetch the number of tweets for a URL

Twitter has private endpoints like this one:
http://urls.api.twitter.com/1/urls/count.json
Tweet counts can be fetched from there, but this is not recommended by Twitter. Besides, they keep saying they're going to shut down these endpoints in the near future.
The Site Streams API is now in closed beta; they don't accept applications.
https://dev.twitter.com/streaming/sitestreams
So that leaves us with only one option, the REST API, but I don't see any endpoint there which could return the number of tweets for a given URL.
What's the best way to get this data? Is there an "official" endpoint for this?
Or is the only way to use something like the public Streaming API or the REST API search endpoints and filter the results?
The private endpoint will be shut down by 20 Nov and there'll be nothing to replace it. This blog post from Twitter explains the background: apparently it's to do with their move to their new "real-time, multi-tenant distributed database" system codenamed Manhattan.
The REST API will be of limited use for this purpose. You'd have to do a search for your URL, collect each page of results, and add up the total number of tweets yourself. For example, this request
https://api.twitter.com/1.1/search/tweets.json?q=metro.co.uk&count=100
will get tweets associated with http://metro.co.uk. (It won't work if you just paste this into your browser - you have to authenticate first. You can try this on the Twitter API console tool.) But the Search API returns a max of 100 tweets per page of results, and it only returns tweets from the last 7 days.
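A rough sketch of that counting loop, paging backwards with max_id until the results run out (the bearer token is a placeholder, and only roughly the last 7 days are searchable):

import requests

BEARER_TOKEN = "YOUR_BEARER_TOKEN"   # placeholder app-only token

url = "https://api.twitter.com/1.1/search/tweets.json"
headers = {"Authorization": f"Bearer {BEARER_TOKEN}"}
params = {"q": "metro.co.uk", "count": 100}
total = 0

while True:
    statuses = requests.get(url, headers=headers, params=params).json().get("statuses", [])
    if not statuses:
        break
    total += len(statuses)
    # Ask for tweets strictly older than the last one we saw
    params["max_id"] = statuses[-1]["id"] - 1

print("Tweets found in the last ~7 days:", total)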
It seems the only solution (explained here) is an elaborate one using a Twitter Streaming API. Basically you'd have to create your own app to count relevant tweets. It would open a connection to stream.twitter.com passing your URL as a track parameter. Twitter will return a tweet every time anyone tweets the address, and your app will have to count them. The example given in that post is:
curl -u user:password "https://stream.twitter.com/1/statuses/filter.json" -d "track=https%3A%2F%2Fdev.twitter.com%2Fdiscussions%2F5653"
I'm not sure how you would deal with shortened URLs in this scenario.
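For what it's worth, here is a hedged Python equivalent of that curl command, counting matching tweets as they arrive on the stream (the OAuth1 credentials are placeholders and the requests_oauthlib package is assumed):

import json
import requests
from requests_oauthlib import OAuth1

# Placeholder credentials for your own Twitter app
auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "ACCESS_SECRET")
count = 0

resp = requests.post(
    "https://stream.twitter.com/1.1/statuses/filter.json",
    data={"track": "https://dev.twitter.com/discussions/5653"},
    auth=auth,
    stream=True,
)
for line in resp.iter_lines():
    if line:  # skip keep-alive newlines
        count += 1
        tweet = json.loads(line)
        print(count, tweet.get("text", ""))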
This change has meant that third-party services like SharedCount that report a count of Twitter shares are having to stop offering that data. Sorry to give you bad news - I'm really disappointed with this situation myself. It seems crazy that we can't just get a total of tweets for a given URL.
You can find a little bit more about this in this thread.