Foursquare API limitation and design advice

Foursquare encourages developers to cache aggressively instead of making repetitive calls to the Foursquare API, so that they don't exceed the hourly usage limit (5,000 requests per hour).
So does that mean it's a bad idea to access the Venues API directly from a mobile app?
Should we make our mobile app retrieve results from our own server instead of calling Foursquare directly?
Thanks

Accessing Foursquare through your own servers as a proxy is generally encouraged. As you stated, it's better for caching, but it also has the advantages of
being able to make changes quickly without waiting for an app submission (so changes propagate to all your users, even ones that don't actively upgrade)
the ability to collect and aggregate information about call volume, errors, etc. more easily
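As a rough illustration, here is a minimal sketch of such a proxy, assuming a small Flask service with an in-memory TTL cache in front of the venues/search endpoint (credentials, cache policy, and the /venues route are placeholders to adapt):
```python
import time
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

CACHE = {}        # naive in-memory cache: key -> (expires_at, payload)
CACHE_TTL = 300   # seconds; tune to how fresh venue data needs to be

CLIENT_ID = "YOUR_CLIENT_ID"          # placeholder credentials
CLIENT_SECRET = "YOUR_CLIENT_SECRET"
VENUES_SEARCH = "https://api.foursquare.com/v2/venues/search"

@app.route("/venues")
def venues():
    ll = request.args.get("ll", "")   # "lat,lng" sent by the mobile client
    now = time.time()
    hit = CACHE.get(ll)
    if hit and hit[0] > now:
        return jsonify(hit[1])        # served from cache: no Foursquare call

    resp = requests.get(VENUES_SEARCH, params={
        "ll": ll,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "v": "20140101",              # version date expected by the v2 API
    })
    payload = resp.json()
    CACHE[ll] = (now + CACHE_TTL, payload)
    return jsonify(payload)
```
The mobile app then calls /venues on your server, and changes to parameters or caching policy ship to every user without an app-store release.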

Related

Podio API limit

I am working on a product that fetches all of a customer's organization, workspace, and app details. The customer can refresh them at any time.
So let's say I have one customer with 100 applications across multiple workspaces; that comes to around 110 calls to get each application's details plus the workspace and organization details.
Now if that customer refreshes the applications multiple times, say 10 times in an hour, that alone is on the order of 1,000 API calls. If I have 50 such active users doing this, it will be something like 50,000.
AFAIK I cannot make that many API calls in an hour, so how should I handle this scenario? I know a lot of applications do this kind of thing, so I want to understand how everyone handles it.
If you need a higher rate limit, I would encourage you to contact Podio support and ask specifically for what you need. We have internal guidelines for evaluating these kinds of requests and may increase the limit for your user and client ID if appropriate.
In general, though, I would expect your app to implement some kind of batching, transient storage, and/or caching layers, especially if your customers are interacting with Podio exclusively or primarily through your system.
Please see our official statement here: https://developers.podio.com/index/limits
Summary:
The general limit is 5,000 API calls per hour, but if the API call is marked as "Rate limited" in the API reference the call is deemed resource intensive and a lower rate of 1000 calls per hour is enforced. If you hit the rate limits the API will begin returning 420 HTTP error codes for all API calls. Rate limits are per user per API key.
Contacting support:
If you have a project that requires a higher rate limit, contact support@podio.com with a brief description of your project, your estimated usage and the client_id of the API key you are using.
Usage tips:
Tips for reducing API usage
Avoid making API requests inside loops. Instead of fetching individual objects inside a loop, fetch a collection of objects in one API operation. E.g. filter items
Cache results whenever possible. This is especially true when you are displaying data to the public (i.e. everyone sees the same output).
Don't poll for changes. Instead of polling Podio to see if your content has changed use webhooks or push to receive a notification. This might save you thousands of requests: https://developers.podio.com/doc/hooks
Use logging to see how many requests you're making
Bundle responses with the "fields" parameter
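To illustrate the "fetch a collection instead of looping" tip, here is a sketch against Podio's filter-items endpoint, with a simple exponential back-off when a 420 comes back (the access token, app ID, page size, and retry policy are placeholders):
```python
import time
import requests

PODIO_TOKEN = "YOUR_ACCESS_TOKEN"   # placeholder OAuth2 access token
HEADERS = {"Authorization": f"OAuth2 {PODIO_TOKEN}"}

def filter_items(app_id, limit=100, offset=0, max_retries=5):
    """One bulk call per page instead of one call per item."""
    url = f"https://api.podio.com/item/app/{app_id}/filter/"
    body = {"limit": limit, "offset": offset}
    for attempt in range(max_retries):
        resp = requests.post(url, json=body, headers=HEADERS)
        if resp.status_code == 420:          # rate limited: back off and retry
            time.sleep(2 ** attempt)
            continue
        resp.raise_for_status()
        return resp.json()["items"]
    raise RuntimeError("still rate limited after retries")
```
Pair this with a local cache of the results so a customer hitting "refresh" repeatedly reuses recent data instead of replaying all 110 calls.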
You might want to build an API proxy app; you would need a messaging queue and a rate limiter. This would let you keep track of API call consumption across apps and users.
Also worth noting: some API routes are more expensive than others because they are more resource-intensive on the Podio side. The term used is "rate limited": rate-limited API routes are bound to 1,000 calls an hour, so in effect they cost five times as much as regular routes.
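A sketch of the rate-limiter half of that proxy idea, assuming a simple in-process token bucket keyed per user and API key (matching Podio's per-user-per-key limits); a production version would more likely sit in the queue worker or a shared store such as Redis, and rate-limited routes would get a smaller 1,000/hour bucket:
```python
import threading
import time

class TokenBucket:
    """Allow at most `rate` calls per `per` seconds (default: 5,000 per hour)."""
    def __init__(self, rate=5000, per=3600):
        self.rate, self.per = rate, per
        self.allowance = float(rate)
        self.last_check = time.monotonic()
        self.lock = threading.Lock()

    def allow(self):
        with self.lock:
            now = time.monotonic()
            # Refill proportionally to the time elapsed since the last check.
            self.allowance = min(
                self.rate,
                self.allowance + (now - self.last_check) * (self.rate / self.per),
            )
            self.last_check = now
            if self.allowance < 1.0:
                return False        # caller should queue the request or back off
            self.allowance -= 1.0
            return True

# One bucket per (user, client_id), since the limits are per user per API key.
buckets = {}

def may_call(user_id, client_id):
    bucket = buckets.setdefault((user_id, client_id), TokenBucket())
    return bucket.allow()
```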
Hope this helps!

API call request limit

I have been looking into various APIs that can provide the weather data I need in JSON format. A lot of these APIs have limits such as: in order to get more requests per minute, you need to pay more per month so that your app can make more API requests.
However, a lot of these APIs also have free accounts which give you limited access.
So what I was thinking is, wouldn't it be possible for a developer to just create lots of different developer accounts with an API provider and then generate lots of different API keys?
That way, they wouldn't have to pay anything, as they could stick with the free accounts. Whenever one of the API keys reached its maximum daily request count, the developer could just put a switch statement in their code that makes their software use a different API key.
I see no reason why this wouldn't work from a technical point of view... but, is such a thing allowed?
Thanks, Dan.
This would technically be possible, and it happens.
It is also probably against the service's terms, a good reason for the service to ban all your sock puppet accounts, and perhaps even illegal.
If the service that offers the API has spent time and money implementing a per-developer limit for their API, they have almost certainly enforced that in their terms of service, and you would be wise to respect those.
(relevant xkcd)

how many apps can I build with one twitter account

I'm using Twitter API to crawl some data.
I'm just wondering how many apps I can build with one account to let me bypass the rate limit?
Thanks in advance!
Deliberately creating multiple applications for the purpose of bypassing the rate limit is against the Twitter API Terms of Service and will get your applications banned.
If you need more data than the REST API rate limits allow, then you may need to use the Streaming API.
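If you do go the streaming route, a minimal sketch with the tweepy library looks roughly like the following (this assumes the older, pre-4.0 tweepy streaming interface; the credentials and track terms are placeholders):
```python
import tweepy

# Placeholder credentials from your Twitter app settings
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")

class CrawlListener(tweepy.StreamListener):
    def on_status(self, status):
        # Tweets are pushed to you as they arrive, so there is no
        # per-request REST rate limit to run into.
        print(status.id, status.text)

    def on_error(self, status_code):
        if status_code == 420:      # being throttled: disconnect and back off
            return False

stream = tweepy.Stream(auth=auth, listener=CrawlListener())
stream.filter(track=["keyword"], languages=["en"])
```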
The rate limits exist to ensure the best quality of service for as many people as possible. Don't abuse them.

Google plus determine changes in network

I am trying to determine changes in the Google+ network in an efficient manner (profile changes). My first idea was to use the eTags of the People.List and People.Get. My assumption was that the eTag in the List (person) would be the same as the one in the Get. This is not the case!
I'd rather not fetch the details of all the people in the network and check the eTag for each of them; I would run out of daily API calls very quickly with that approach.
Are there any other ways of determining the changes in the network?
Thanks!
I'm not aware of a way to notify your service when changes occur on a user's profile. I don't think that etags will work for what you are trying to do and the client libraries should already be using the etags to manage any query caching. You can perform a few tricks to make queries lighter on your backend though:
Batch API calls
Use a fields filter to get only the data that matters for your application
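A rough sketch of both points using the google-api-python-client (the API key, user IDs, and field list here are just examples; batching cuts round-trips, though each inner call still counts toward quota):
```python
from googleapiclient.discovery import build

# Build the Google+ service with an API key from the APIs console (placeholder).
service = build("plus", "v1", developerKey="YOUR_API_KEY")

# Fields filter: request only the data the app actually compares.
person = service.people().get(
    userId="someUserId",
    fields="id,displayName,image/url",
).execute()

# Batching: several people.get calls travel in a single HTTP request.
def on_person(request_id, response, exception):
    if exception is None:
        print(response["id"], response["displayName"])

batch = service.new_batch_http_request(callback=on_person)
for uid in ["userId1", "userId2", "userId3"]:
    batch.add(service.people().get(userId=uid, fields="id,displayName"))
batch.execute()
```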
If you are running out of quota, you can also request to have your limits raised from the Google APIs console by clicking the Quotas link on the left. The developer relations team from Google+ checks the request regularly and will raise your quota limits if your usage justifies it.

Rails 3: How To Keep Track of My API Usage?

My app has an API which I have no control over. I don't know which users use it, how many times, etc.
What's the best way to build an analytics system to keep track of my API usage? Does it need to be a Rack app that filters traffic, or is there a better solution?
Thanks!
You can build this yourself. All you have to do is trace each request made to your app and record it against the API key that identifies the caller; every request to your API then counts as usage for that key, and so on.
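The usual shape of this is a piece of middleware in front of the app that logs one record per request, keyed by API key. In Rails that would be a Rack middleware added with config.middleware.use; the sketch below shows the same pattern in WSGI-style Python, with the storage layer and the API-key header being assumptions to adapt:
```python
import time

class ApiUsageMiddleware:
    """Record one usage row per request, keyed by the caller's API key."""

    def __init__(self, app, store):
        self.app = app
        self.store = store  # hypothetical storage layer, e.g. a usages table

    def __call__(self, environ, start_response):
        api_key = environ.get("HTTP_X_API_KEY", "anonymous")
        path = environ.get("PATH_INFO", "")
        started = time.time()

        def recording_start_response(status, headers, exc_info=None):
            # One row per request: who called, what, the result, and how long.
            self.store.record(
                api_key=api_key,
                path=path,
                status=status.split(" ")[0],
                duration=time.time() - started,
            )
            return start_response(status, headers, exc_info)

        return self.app(environ, recording_start_response)
```
Aggregating those rows then gives you per-key call counts over whatever time window you care about.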