New Instagram Rate Limits - api

We have created an app to display our own feed on our own website. However, we're now exceeding the new rate limit of 500 requests per hour. Instagram says our app does not need review and therefore cannot be upgraded to a "Live" app and take advantage of the higher rate limit. How do we get around this? On a side note, this is a lot of trouble to display our own feed on our own website. Sheesh.

If I understood correctly, your datafeed is seen by all users, and it's always the same datafeed. So instead of requesting it for each user, you could fetch the data on a fixed interval (say every 30 seconds, or less) and store the result in a DB. Then always serve the feed from the DB, and only request data from Instagram again once that interval has elapsed.
That would make it pretty transparent to your users.
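A minimal sketch of that idea in Python, assuming a hypothetical fetch_instagram_feed() wrapper around whatever Instagram client you already use; the cache here is an in-memory dict, but the same pattern works with a real DB table or Redis key:

```python
import time

CACHE_TTL_SECONDS = 30  # refresh interval; tune to your traffic and rate limit

# In-memory cache: swap for a DB table or Redis key in production.
_cache = {"fetched_at": 0.0, "feed": None}

def fetch_instagram_feed():
    """Hypothetical wrapper around your existing Instagram API call."""
    raise NotImplementedError

def get_feed():
    """Serve the cached feed; only hit Instagram when the cache is stale."""
    now = time.time()
    if _cache["feed"] is None or now - _cache["fetched_at"] > CACHE_TTL_SECONDS:
        try:
            _cache["feed"] = fetch_instagram_feed()
            _cache["fetched_at"] = now
        except Exception:
            # On failure (e.g. rate limited), keep serving the stale copy.
            if _cache["feed"] is None:
                raise
    return _cache["feed"]
```

With a 30-second interval the server makes at most about 120 Instagram requests per hour, no matter how many visitors hit the page.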

Related

Receiving 403 (rate limited) errors from Google Calendar API

We are using the Google Calendar API to keep a sync between our app and events in our users' google calendar.
We have started regularly getting rate limiting errors (403).
However, our usage according to the APIs and Services page of the Google Cloud console is well below the stated limits (10,000 queries per minute and 600 per user per minute). We are also using the batch API to send our requests, so we cannot implement exponential backoff per request.
Anyone got any advice on avoiding these rate limiting errors?
Rate limiting errors with Google are basically flood protection: you are going too fast. Don't put too much stock in what the Google developer console shows; the numbers in those graphs are guesstimates at best, and they are not real time.
The main cause of rate limiting is that when you send a request there is no way of knowing which server it will run on, or what other requests are being run on that same server. Your request may therefore run faster or slower than you would expect, which makes it hard to pin down exactly what 10,000 queries per minute and 600 per user per minute actually mean in practice.
10,000 requests run on an overloaded server may take 2 minutes, while on a server that is not overloaded they could finish in 30 seconds, meaning the next request you send will blow out the quota.
As there is really no way of avoiding it, you should just ensure that your application is capable of responding to it by sending the request again. I wrote an article a number of years ago about how I track my requests locally in my application to keep things at the right speed: flood buster.
Really, as long as your application responds by retrying the request, you should be OK.
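As an illustration of that retry-and-throttle pattern (not the Google client library's actual API), here is a sketch with a hypothetical send_batch() wrapper that raises RateLimitError when a batch comes back with a 403 rate-limit response:

```python
import random
import time

MIN_INTERVAL = 0.2   # seconds between batches; tune to stay under your quota
MAX_RETRIES = 5

class RateLimitError(Exception):
    """Raised by send_batch() when the API answers 403 rateLimitExceeded."""

def send_batch(requests):
    """Hypothetical wrapper around your Calendar batch call."""
    raise NotImplementedError

_last_sent = 0.0

def send_with_backoff(requests):
    global _last_sent
    for attempt in range(MAX_RETRIES):
        # Local throttle: keep a steady pace instead of bursting.
        wait = MIN_INTERVAL - (time.time() - _last_sent)
        if wait > 0:
            time.sleep(wait)
        _last_sent = time.time()
        try:
            return send_batch(requests)
        except RateLimitError:
            # Exponential backoff with jitter, then retry the whole batch.
            time.sleep((2 ** attempt) + random.random())
    raise RuntimeError("still rate limited after %d retries" % MAX_RETRIES)
```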

Twitter user_timeline, weird statuses_count and some late tweets

We use the twitter user_timeline api to get the last 200 tweets for a set of twitter accounts.
I noticed a couple of weird issues
A few tweets arrived in the system hours after their actual creation time. That is, a person tweets; an hour later we run the user_timeline API for that user and don't see the tweet; eight hours later we run it again and the tweet is there. Does this mean it can sometimes take Twitter hours to index a tweet and make it available to the timeline API?
Sometimes the user's statuses_count decreases with each new tweet for a specific account. For example, the first tweet has statuses_count = 100, and the next tweet, posted after the first, has statuses_count = 99. Is this because the user deleted some tweets? Is statuses_count reliable?
Thanks
The Twitter API is eventually consistent, so I would theorise that for the timelines call there is some data center synchronisation going on behind the scenes and that you might be hitting an older copy of the data at the time of the call. It could also be because of some local caching, but it's not clear from the question how you've built your system. In most cases where I've seen an issue like this, that would be my guess as to what is going on. If you want Tweets closer to real time, that's what the streaming API is optimized for; the REST API works differently.
On the second question, there's again a small chance that this is a consistency issue, or it could indeed be due to Tweet deletion. The different elements of the Tweet object (user object, media info, links etc.) are hydrated from different systems, so they may just be momentarily out of sync, or Tweets may have been deleted.
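One practical consequence for a polling setup: if you filter strictly by since_id, a tweet that is indexed late (with an ID older than your cursor) can be missed permanently. A sketch of a more tolerant approach, assuming a hypothetical fetch_user_timeline() wrapper around the user_timeline call:

```python
# Tweet IDs already stored (persist this set, e.g. in your DB).
seen_ids = set()

def fetch_user_timeline(screen_name, count=200):
    """Hypothetical wrapper around the user_timeline REST call."""
    raise NotImplementedError

def poll(screen_name):
    """Re-fetch the last 200 tweets each run and dedupe by tweet ID.

    Because the full window is pulled every time instead of cutting off
    at since_id, a tweet that surfaces hours late is still picked up on
    a later run; it just arrives in our system late as well.
    """
    new_tweets = []
    for tweet in fetch_user_timeline(screen_name, count=200):
        if tweet["id"] not in seen_ids:
            seen_ids.add(tweet["id"])
            new_tweets.append(tweet)
    return new_tweets
```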

Youtube Data API quota exceeded

I am developing an app locally and I just integrated the Youtube data api v3 to query videos.
Last night I received a 403 error saying my daily quota had been exceeded. If I look at the chart under quotas in the developer console, it says there were 10,000 requests yesterday. This is totally impossible, as I am only using this locally.
Here is the quota chart
If I click on credentials in the left hand menu and select the API key page, it says only 309 requests for that API key in the last 30 days. That is the only API key I have activated. It can't be API theft, as it only says 309 requests for that key.
I am totally confused. What is happening here? Is there a way to see the IP address where these requests are originating from?
Those two stats measure different things. One request can have a quota cost of anywhere from one unit to over a hundred, and it is that quota cost that counts toward the queries-per-day stat. So it's not surprising to hit 10,000 units with around 300 requests.
To get around this, optimize your API requests to retrieve only the resources you need. If the default quota (10,000 units per day) really isn't enough, you'll need to request a quota increase through the console or through the direct apply-for-higher-quota link.
Complete info can be found in the YouTube documentation. You can check how quota costs are calculated at the link.
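As a rough illustration of why ~300 requests can burn through 10,000 units, here is a back-of-the-envelope calculator. The per-method costs below are the commonly cited approximate values for the YouTube Data API v3 (search.list is by far the most expensive); verify them against the official quota calculator before relying on them:

```python
# Approximate quota costs per call (units); these can change, so check
# the official quota calculator in the YouTube Data API docs.
QUOTA_COST = {
    "search.list": 100,
    "videos.list": 1,
    "playlistItems.list": 1,
    "videos.insert": 1600,
}

def estimate_daily_units(call_counts):
    """call_counts: e.g. {"search.list": 90, "videos.list": 200}."""
    return sum(QUOTA_COST[method] * n for method, n in call_counts.items())

# ~90 searches plus 200 cheap lookups already use 9,200 of the
# 10,000-unit daily default, with fewer than 300 total requests.
print(estimate_daily_units({"search.list": 90, "videos.list": 200}))  # 9200
```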

Is it possible to increase the Google Sheets API quota limit beyond 2500 per account and 500 per user?

The problem: Running into Google Sheets API read/write quota limits. Specifically, the read/write requests per 100 seconds and read/write requests per 100 seconds per user quotas.
Some background:
For the past few months I've been developing a web app for students and staff in our school district which uses a Google spreadsheet as the database. Each school in our district was assigned a different Google spreadsheet, and a service account was created to make read and write calls to these spreadsheets on behalf of the web app.
We started with one school of approximately 1000 students, but it has now expanded to two other schools with a total user load of around 4000. Due to the nature of a school day schedule, we started hitting our quota limit (per 100 sec & per 100 sec per user) since almost everyone uses the app at the same time.
I found the usage limits guide for the Google sheets API, and as per the instructions I created a billing account, and linked the associated service account project to it. I then went to the quotas section in the developers console and applied for a higher quota. This involved filling out a Google form which asked "How much quota do you need? Express in number of API queries per day." Again, queries per day is not the problem, rather it's the number of queries per 100 seconds and per user (service account). After a couple of weeks our limit was increased to 2500 read/write requests per 100 seconds and 500 read/write requests per 100 seconds per user. The billing account was not charged, and after a little searching, I realized this was a free increase. This bump in our quota limit helped, but it's still going to be an issue because our district wants to add more schools in the future.
Here's what I need to know:
1) [ESSENTIAL QUESTION] Does Google have an upper limit or maximum to the number of read/write requests a single service account/user/IP can make within the 100 second time frame, and if so what is it?
2) If it is possible to go beyond our current quota limit (2500/500), is there another way of requesting/applying for the increase? Once again, we have a billing account established for the project and are willing to pay for the service.
I've been pulling (what's left of) my hair out trying to find definitive answers to my questions. This post came close to what I was looking for, and I even did some of the things the OP suggested, but I just need a direct answer to my "essential" question.
Couple more things.
I understand that Google Charts Visualization doesn't have a quota limitation, and I'd consider using it; however, for privacy reasons I can't have the spreadsheet keys exposed in plain JavaScript. Are there other options here?
Also, one might suggest creating multiple service accounts, but I'd rather avoid this if possible.
Thank you for your help. I'm very much a novice and I greatly appreciate your time and expertise.
To answer your questions:
1) [ESSENTIAL QUESTION] Does Google have an upper limit or maximum to the number of read/write requests a single service account/user/IP can make within the 100 second time frame, and if so what is it?
The provided documentation only states that the Google Sheets API has a limit of 500 requests per 100 seconds per project, and 100 requests per 100 seconds per user. Check this post for additional information.
2) If it is possible to go beyond our current quota limit (2500/500), is there another way of requesting/applying for the increase. Once again we have a billing account established for the project and are willing to pay for the service.
AFAIK, you can request a higher quota limit, and the Google engineers may grant it as long as you are making a reasonable request.
Also, you may check this thread for additional tips:
You can use spreadsheets.get to read the entire spreadsheet in a single call, rather than 1 call per request. Alternately, you can use spreadsheets.values.batchGet to read multiple different ranges in a single call, if all you need are the values.
The Drive API offers "push notifications", so you can get notified when changes occur and react to those, instead of polling for them. The latency of the notifications is a little on the slow side, but it gets the job done.
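For the batchGet tip, here's a minimal sketch using the google-api-python-client with a service account; the spreadsheet ID, key file path, and ranges are placeholders for whatever your app actually uses:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/spreadsheets.readonly"]
SPREADSHEET_ID = "your-spreadsheet-id"                  # placeholder
RANGES = ["Students!A1:F1000", "Staff!A1:D200"]         # placeholder ranges

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)              # placeholder key file
sheets = build("sheets", "v4", credentials=creds)

# One API call returns every requested range, instead of one call per range.
result = sheets.spreadsheets().values().batchGet(
    spreadsheetId=SPREADSHEET_ID, ranges=RANGES).execute()

for value_range in result.get("valueRanges", []):
    print(value_range["range"], len(value_range.get("values", [])), "rows")
```

Each batchGet call counts as a single read request against the per-100-seconds quotas, no matter how many ranges it returns.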

Twitter Streaming API limits?

I understand the Twitter REST API has strict request limits (a few hundred requests per 15 minutes), and that the streaming API is sometimes better for retrieving live data.
My question is, what exactly are the streaming API limits? Twitter references a percentage on their docs, but not a specific amount. Any insight is greatly appreciated.
What I'm trying to do:
Simple page for me to view the latest tweet (& date / time it was posted) from ~1000 twitter users. It seems I would rapidly hit the limit using the REST API, so would the streaming API be required for this application?
You should be fine using the Streaming API, unless those ~1000 users combined are tweeting more than (very) roughly 60 tweets per second at any moment.
Using the Streaming API endpoint statuses/filter with the follow parameter, you are allowed up to 5000 users. There is no rate limit except when the stream returns more than about 1% of all tweets being tweeted at that moment. (60 tweets per second is 1% of the average rate of tweets, which is always fluctuating, so don't rely on that number.)
If your stream does go above the 1% threshold, you can detect this. (See the LIMIT notice.) Then you would use the REST API to find missed tweets.
Twitter simply will not allow multiple streams from one registered app/account. Doing so will result in the older one being closed.
Also, too many connection attempts are not allowed and will result in the user being blocked.
Reference docs: Public Streaming API (outdated)
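For reference, here is a minimal sketch of what consuming that (now retired) v1.1 statuses/filter endpoint with the follow parameter looked like, including handling the LIMIT notice mentioned above; the credentials and user IDs are placeholders:

```python
import json

import requests
from requests_oauthlib import OAuth1

# Placeholder credentials for your registered app and user.
auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET",
              "ACCESS_TOKEN", "ACCESS_SECRET")

FOLLOW_IDS = "12,34,56"  # comma-separated user IDs, up to 5000

resp = requests.post(
    "https://stream.twitter.com/1.1/statuses/filter.json",
    data={"follow": FOLLOW_IDS},
    auth=auth,
    stream=True,
)

for line in resp.iter_lines():
    if not line:
        continue  # keep-alive newline
    message = json.loads(line)
    if "limit" in message:
        # LIMIT notice: this many matching tweets were dropped because the
        # stream crossed the ~1% threshold; backfill them via the REST API.
        print("dropped tweets:", message["limit"]["track"])
    elif "text" in message:
        print(message["user"]["screen_name"], message["text"])
```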