Creating Multiple Google/YouTube Data API Keys

Is it possible to create multiple API keys for the YouTube Data API?
The majority of Live YouTube Subscriber Counters use loads of different API keys for their counters (as can be seen in their JavaScript code).
The aim is to avoid exceeding the daily quota limit of 1,000,000 units; sending requests every few seconds for each page visited would otherwise hit that limit very quickly.
How are they able to get away with this?

Here is a SO post to answer your question.
Technically you can run your application using different API keys, and it should work fine. There is nothing wrong with creating additional projects in the Google Developers Console; you don't need to go as far as creating another Google account.
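
Here is a minimal sketch of that rotation idea in Python, assuming a handful of keys created under separate Developers Console projects (the key strings are placeholders, and the round-robin interval is an assumption, not anything Google prescribes):

    import itertools
    import requests

    # Hypothetical keys, one per Developers Console project.
    API_KEYS = ["AIza...key1", "AIza...key2", "AIza...key3"]
    key_cycle = itertools.cycle(API_KEYS)

    def get_subscriber_count(channel_id):
        """Fetch a channel's subscriber count, charging a different project's quota each call."""
        resp = requests.get(
            "https://www.googleapis.com/youtube/v3/channels",
            params={
                "part": "statistics",
                "id": channel_id,
                "key": next(key_cycle),  # rotate to the next key on every request
            },
            timeout=10,
        )
        resp.raise_for_status()
        return int(resp.json()["items"][0]["statistics"]["subscriberCount"])

Each counter page could call this every few seconds; because consecutive calls are billed to different projects, each project's daily quota drains at only a fraction of the overall request rate.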

Related

How to add pagination in API

I want to know how to add pagination to API requests for more efficient data retrieval, specifically with the YouTube API.
I haven't tried anything so far, as this concept is new to me.
What I personally do is usually one of two things:
(my preferred way) I create more than one API token, and every X requests I dynamically switch which token executes the request; this avoids throttling.
Alternatively, when sending a large number of requests, you can pause dynamically every X requests.
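
As for the pagination part of the question: the YouTube Data API pages its list responses through a nextPageToken field, which you pass back as the pageToken parameter on the next request. A minimal sketch, with a placeholder API key:

    import requests

    API_KEY = "AIza...placeholder"

    def iter_playlist_items(playlist_id):
        """Yield every item in a playlist, following nextPageToken until it runs out."""
        page_token = None
        while True:
            resp = requests.get(
                "https://www.googleapis.com/youtube/v3/playlistItems",
                params={
                    "part": "snippet",
                    "playlistId": playlist_id,
                    "maxResults": 50,         # the API's per-page maximum
                    "pageToken": page_token,  # omitted (None) on the first request
                    "key": API_KEY,
                },
                timeout=10,
            )
            resp.raise_for_status()
            data = resp.json()
            yield from data.get("items", [])
            page_token = data.get("nextPageToken")
            if not page_token:  # no token means this was the last page
                break

The same token pattern applies to the other list endpoints (search.list, commentThreads.list, and so on).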

Podio API limit

I am working on a product which fetches all of a customer's organization, workspace, and app details. The customer can refresh them at any time.
So let's say I have one customer with 100 applications across multiple workspaces; it makes around 110 calls to get each application's details plus the workspace and organization details.
Now if that customer refreshes the applications 10 times in an hour, that alone is over 1,000 API calls. If I have 50 such active users doing this, it comes to something like 50,000.
AFAIK I cannot make that many API calls in an hour, so how do I handle this scenario? I know a lot of applications do such things, so I want to understand how everyone handles it.
If you need a higher rate limit, I would encourage you to contact Podio support and ask specifically for what you need. We have internal guidelines for evaluating these kinds of requests and may increase the limit for your user and client ID if appropriate.
In general, though, I would expect your app to implement some kind of batching, transient storage, and/or caching layers, especially if your customers are interacting with Podio exclusively or primarily through your system.
Please see our official statement here: https://developers.podio.com/index/limits
Summary:
The general limit is 5,000 API calls per hour, but if an API call is marked as "Rate limited" in the API reference, the call is deemed resource-intensive and a lower rate of 1,000 calls per hour is enforced. If you hit the rate limits, the API will begin returning HTTP 420 error codes for all API calls. Rate limits are per user per API key.
Contacting support:
If you have a project that requires a higher rate limit, contact support@podio.com with a brief description of your project, your estimated usage, and the client_id of the API key you are using.
Usage tips:
Avoid making API requests inside loops. Instead of fetching individual objects inside a loop, fetch a collection of objects in one API operation. E.g. filter items
Cache results whenever possible. This is especially true when you are displaying data to the public (i.e. everyone sees the same output); see the caching sketch after this list.
Don't poll for changes. Instead of polling Podio to see if your content has changed use webhooks or push to receive a notification. This might save you thousands of requests: https://developers.podio.com/doc/hooks
Use logging to see how many requests you're making
Bundle responses with the "fields" parameter
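
To illustrate the caching tip, here is a minimal TTL-cache sketch; fetch_app_details stands in for whatever Podio client call you actually use, and the 300-second TTL is an assumption to tune:

    import time

    CACHE_TTL = 300  # seconds; tune to how stale your customers can tolerate
    _cache = {}      # app_id -> (timestamp, payload)

    def get_app_details(app_id, fetch_app_details):
        """Return cached app details, spending an API call only when the entry is stale."""
        entry = _cache.get(app_id)
        if entry and time.time() - entry[0] < CACHE_TTL:
            return entry[1]                   # fresh enough: no API call spent
        payload = fetch_app_details(app_id)   # one rate-limited call
        _cache[app_id] = (time.time(), payload)
        return payload

With this in place, the customer who refreshes 10 times an hour costs you one round of API calls, not ten.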
You might want to build an API proxy app; you would need a messaging queue and a rate limiter. This would let you keep track of API call consumption across apps and users.
Also worth noting: some API routes are more expensive than others because they are more resource-intensive on the Podio side. The term in use is "rate limited": rate-limited API routes are bound to 1,000 calls an hour, so in effect they cost five times as much as regular routes.
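
For the rate-limiter half of that proxy idea, a token bucket sized to the 5,000-calls-per-hour general limit is one common choice. A minimal sketch (the capacity and refill rate are assumptions; in practice you would keep one bucket per user per API key):

    import threading
    import time

    class TokenBucket:
        """Allow at most `rate` calls per `per` seconds, shared across threads."""

        def __init__(self, rate=5000, per=3600.0):
            self.capacity = float(rate)
            self.tokens = float(rate)
            self.refill_per_sec = rate / per
            self.last = time.monotonic()
            self.lock = threading.Lock()

        def acquire(self):
            """Take a token if one is available; return False so the caller can queue the call."""
            with self.lock:
                now = time.monotonic()
                elapsed = now - self.last
                self.tokens = min(self.capacity, self.tokens + elapsed * self.refill_per_sec)
                self.last = now
                if self.tokens >= 1:
                    self.tokens -= 1
                    return True
                return False

    bucket = TokenBucket()
    if bucket.acquire():
        print("forward the call to Podio")
    else:
        print("enqueue the call, or return 429 to the caller")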
Hope this helps!

Google plus determine changes in network

I am trying to determine changes in the Google+ network in an efficient manner (profile changes). My first idea was to use the eTags of the People.List and People.Get. My assumption was that the eTag in the List (person) would be the same as the one in the Get. This is not the case!
I would rather not fetch the details of all the people in the network and check the eTag for each of them; I would run out of daily API calls very quickly in that scenario.
Are there any other ways of determining the changes in the network?
Thanks!
I'm not aware of a way to notify your service when changes occur on a user's profile. I don't think that etags will work for what you are trying to do and the client libraries should already be using the etags to manage any query caching. You can perform a few tricks to make queries lighter on your backend though:
Batch API calls
Use a fields filter to get only the data that matters for your application
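
For example, against the (since shut down) Google+ REST API, a fields filter looked like this; the profile ID and key below are placeholders:

    import requests

    # Historical sketch: the Google+ API has been discontinued.
    resp = requests.get(
        "https://www.googleapis.com/plus/v1/people/1234567890",  # placeholder profile ID
        params={
            # Ask only for the fields your app actually renders;
            # the response payload shrinks accordingly.
            "fields": "id,etag,displayName,image/url",
            "key": "AIza...placeholder",
        },
        timeout=10,
    )
    print(resp.json())

Partial responses like this cut payload size and parsing work, though each request still costs quota.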
If you are running out of quota, you can also request to have your limits raised from the Google APIs console by clicking the Quotas link on the left. The developer relations team from Google+ checks these requests regularly and will raise your quota limits if your usage justifies it.

Google API Request Limit

Does anyone know where I can find Google API Request Limits for their different services?
When simulating 500+ concurrent users, it seems to fail silently fairly often (maybe 1 in 10 loads).
Any ideas?
The information is in their support resources. I am not aware of a central place, but it's all there. Searching the docs for "request limit" should usually do the trick.
The Geocoding API's limits, for example, can be found here.
Google Maps API Web Services and Google Static Maps API limits were cut effective a few days ago. Starting October 1st, 2011, commercial web sites and apps using the Google Maps API for free receive:
a max of 2,500 calls/day if modified using the Styled Maps feature
a max of 25,000 calls/day in total
Fusion tables are preferable to the Google Maps API alone, particularly with respect to rate limits:
Applications using the Google Fusion Tables API can send a maximum of 5 requests per second to the Google Fusion Tables server.
I think they removed the limit recently: I can't find any mention of it in documentation pages where I know for sure it used to appear, and I read about the limit's removal somewhere this summer.
Even their new EULA states that the service is not limited, though they remain free to limit it however they want at any moment.
500 concurrent users doesn't seem like that much, though, even if limitations were in place; are you sure it's Google that's failing?
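
Whatever the cause, silent failures under heavy load are worth wrapping in retries. A generic exponential-backoff sketch (the retry count and base delay are assumptions, not documented Google values):

    import random
    import time
    import requests

    def fetch_with_backoff(url, params, max_tries=5):
        """Retry transient failures with exponential backoff plus jitter."""
        for attempt in range(max_tries):
            try:
                resp = requests.get(url, params=params, timeout=10)
                if resp.status_code == 200:
                    return resp.json()
            except requests.RequestException:
                pass  # treat network errors like any other transient failure
            # Sleep 1s, 2s, 4s, 8s... plus jitter so concurrent clients desynchronize.
            time.sleep(2 ** attempt + random.random())
        raise RuntimeError("gave up on %s after %d attempts" % (url, max_tries))

The jitter matters at 500 concurrent users: without it, failed clients retry in lockstep and hammer the service again at the same instant.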

How often to run the cron, to mine twitter public timeline?

How often do the webapps that depend on Twitter's public timeline collect their data? There must be hundreds of thousands of messages every minute, correct? How do they manage to collect all the tweets without missing any of them?
Some services (Friendfeed is a good example) are granted access to the Twitter Streaming API, aka the 'firehose'. It requires approval and a written agreement.
The publictimeline is not a great place to mine data anymore. Twitter now uses its Streaming APIs to output tweets like crazy. The closest comparison to the publictimeline would be the spritzer method, but that only includes a small sample. If you need to gather all tweets (or more than the spritzer method provides), you'll need to sign a written agreement to get access to other Streaming API (HTTP push) feeds, such as the firehose feed, which returns all public tweets.
The Twitter API is rate limited, as has been said. The public timeline (twitter.com/public_timeline) is not rate limited in the same sense, but it is only updated every 5 seconds, so most tweets never appear there.
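
Here is a sketch of polling on that 5-second cadence while tracking the highest tweet ID seen, so repeated polls neither re-process nor miss the tweets that do surface there (this uses the unauthenticated JSON endpoint of that era):

    import time
    import requests

    TIMELINE_URL = "http://twitter.com/statuses/public_timeline.json"  # historical endpoint

    def poll_public_timeline():
        since_id = 0
        while True:
            tweets = requests.get(TIMELINE_URL, timeout=10).json()
            # Keep only tweets newer than the last one we processed.
            fresh = [t for t in tweets if t["id"] > since_id]
            for tweet in sorted(fresh, key=lambda t: t["id"]):
                print(tweet["id"], tweet["text"])
                since_id = tweet["id"]
            time.sleep(5)  # the timeline only refreshes every 5 seconds anyway

Even so, this only ever sees the small slice that makes it onto the public timeline; for completeness you need the streaming feeds described above.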
There are I think three or four companies that have access to the firehose, as Twitter's full feed is called. FriendFeed is one of these. Another is Gnip. Gnip resells the feed to other companies. This is probably the only feasible way to get a full twitter feed.
Go here: http://twitter.com/help/request_whitelisting and get your account white-listed (allows 20,000 requests per hour) if 100 requests per hour isn't enough.
@ceejayoz it's not 100 GET requests, it's 100 requests in general, excluding a few requests like verify_credentials and rate_limit_status.
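
You can watch your remaining budget with the rate_limit_status call mentioned above; in the v1-era API it did not count against the limit itself. A quick sketch against that historical endpoint:

    import requests

    # Historical Twitter API v1 endpoint; the call was exempt from the rate limit.
    resp = requests.get("http://twitter.com/account/rate_limit_status.json", timeout=10)
    status = resp.json()
    print("remaining:", status["remaining_hits"],
          "of", status["hourly_limit"],
          "resets at:", status["reset_time"])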