How can I extend the lifetime of my token pickle in console.cloud.google for the Gmail API? It currently expires every 7 days

How can I extend the lifetime of my token pickle in console.cloud.google for the Gmail API? It currently expires every 7 days.
I tried searching Google and the help pages, but I did not find anything.
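For context, "pickle" here presumably refers to the token.pickle file from the Gmail API Python quickstart, which caches OAuth credentials between runs. A minimal sketch of that pattern, with file names as in the quickstart:

```python
import os.path
import pickle

from google.auth.transport.requests import Request
from google_auth_oauthlib.flow import InstalledAppFlow

SCOPES = ['https://www.googleapis.com/auth/gmail.readonly']

creds = None
# token.pickle caches the user's access and refresh tokens between runs.
if os.path.exists('token.pickle'):
    with open('token.pickle', 'rb') as token:
        creds = pickle.load(token)

# If the cached credentials are missing or no longer valid, refresh them
# when possible, otherwise run the full OAuth flow again.
if not creds or not creds.valid:
    if creds and creds.expired and creds.refresh_token:
        creds.refresh(Request())
    else:
        flow = InstalledAppFlow.from_client_secrets_file('credentials.json', SCOPES)
        creds = flow.run_local_server(port=0)
    with open('token.pickle', 'wb') as token:
        pickle.dump(creds, token)
```

Note that whether the refresh keeps succeeding indefinitely depends on the project's OAuth configuration in the console, not on this code.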

Related

Google Vision API giving sporadic 403 errors

I have a very basic Python app that calls the Google Vision API and asks for OCR on an image.
It was working fine a few days ago using a basic API key. I have since created a modified version that also uses a service account, which worked as well.
All my images are ~500 kB.
However, today about 80% of all calls return "403 reauthorized" when I try to run OCR on an image. The remainder run as they always have done.
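For illustration, a minimal sketch of the kind of call involved, assuming the google-cloud-vision Python client with credentials picked up from the environment (the file name is a placeholder):

```python
from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key,
# or that default credentials are otherwise configured.
client = vision.ImageAnnotatorClient()

with open('scan.png', 'rb') as f:  # hypothetical ~500 kB input image
    content = f.read()

image = vision.Image(content=content)
response = client.text_detection(image=image)  # OCR request

# API errors such as 403s can surface on the response object.
if response.error.message:
    raise RuntimeError(response.error.message)
print(response.full_text_annotation.text)
```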
The Google quota limits page lists:
MB per image: 4 MB
MB per request: 8 MB
Requests per second: 10
Requests per feature per day: 700,000
Requests per feature per month: 20,000,000
Images per second: 8
Images per request: 16
And I am way below any of these limits (by orders of magnitude) - any idea what might be going on?
It seems strange that simply running the same code, with the same input images, will sometimes give a 403 and sometimes not... perhaps the error is indicative of the API struggling with demand?
Thanks Dave - yes, I have. After much debugging at my end, it seems it was the API that was up the spout :-) All fine now...

Google Drive Push notifications limitations

I've created an application which receives push notifications (watches) for changes made to a spreadsheet in Google Drive. Everything is working fine! But when I make another change (within 30 seconds or so) to the spreadsheet, Google does not send another notification to my application. After a few minutes it works again, but in the meantime Google sends no notifications.
Is there a limit on the number of changes within a specific time range?
Any help / ideas are welcome!
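For reference, a minimal sketch of how such a watch channel is registered via the Drive v3 files.watch method, assuming the google-api-python-client library, existing OAuth credentials in creds, and a hypothetical HTTPS callback URL:

```python
import uuid

from googleapiclient.discovery import build

# Assumes `creds` holds valid OAuth credentials with a Drive scope.
service = build('drive', 'v3', credentials=creds)

channel = {
    'id': str(uuid.uuid4()),  # unique id for this notification channel
    'type': 'web_hook',
    'address': 'https://example.com/notifications',  # hypothetical HTTPS endpoint
}

# FILE_ID is a placeholder for the spreadsheet's Drive file id.
response = service.files().watch(fileId='FILE_ID', body=channel).execute()
print(response)  # includes resourceId and the channel's expiration time
```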
@Niek Jansen I have tried these, and what I have noticed is that for Excel docs it takes up to 2 minutes before a notification is sent again when anything changes. For Word it's ~30 s, and for PPT it's around 2-3 minutes.

Not allowed to exceed free API quota for Time Zone API on GCP

I have an issue in my project where the Time Zone API won't let me increase beyond the free quota. Billing is enabled for this project, but I still can't raise the limit above 2,500 req/day.
According to the Usage Limits, since billing is enabled I should be able to increase my limit to 100k/day.
I have several other projects running that will let me increase up to 100k/day. I just created a new project a few minutes ago, and sure enough, after enabling billing, I'm able to increase to 100k/day.
It seems broken only on my one project. I’ve disabled/enabled it, etc. to no avail.
I started using an API key from another project just to get around it, but billing just got disabled for that project, so I’m back to having issues again.
I've tried leaving feedback via the cloud console and asking in a couple of other places, but I'm being told those aren't the right places to ask, and leaving feedback multiple times has yielded no results after about a month.
I’m kind of at a loss now on where to go. Any suggestions on how to get this working would be very welcome at this point.
[Screenshot: what I see in my project with billing enabled]
[Screenshot: what I should see in my project]

Measure how hot a topic is on Twitter

What kind of service should I use to measure how hot a topic is on Twitter, and how hot it has been in the past?
I thought about:
The Twitter API (https://dev.twitter.com/rest/reference/get/search/tweets), which returns up to 100 tweets per search request. In that case I would have to make multiple calls to determine how many tweets there are. Is that correct?
TweetReach, which gives reports like this: https://tweetreach.com/reports/16000571, but the cheapest plan is $300/month.
With the Twitter API, you have a few options, but none of them may be exactly what you want, and none of them can go back very far into the past. You would have to either compile that information yourself, or use an external service like the one you mentioned.
Using the search API, you can only get results from the past 7 days, and are limited to 100 tweets per request. You can also set result_type to popular to get the most popular tweets for that search term. Twitter does have rate limits, but the ones for search are relatively high: 180 requests every 15 minutes for each user you have authenticated, plus 450 requests every 15 minutes for the app itself (completely separate from the user requests). So if you only use app requests, at 100 tweets each, you can get 45,000 tweets every 15 minutes.
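For illustration, a minimal sketch of an app-auth search call against the v1.1 search/tweets endpoint (the bearer token is a placeholder you would obtain via the oauth2/token exchange):

```python
import requests

BEARER_TOKEN = 'YOUR_APP_BEARER_TOKEN'  # placeholder

resp = requests.get(
    'https://api.twitter.com/1.1/search/tweets.json',
    headers={'Authorization': 'Bearer ' + BEARER_TOKEN},
    params={'q': 'some topic', 'count': 100, 'result_type': 'popular'},
)
resp.raise_for_status()

tweets = resp.json()['statuses']
print(len(tweets), 'tweets returned')
```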
If you don't need to search for specific terms, you can get trending topics in different areas using the trends endpoints. The available areas can be retrieved via trends/available. Fetching trends also gives you the tweet_volume of each trend over the past 24 hours. If you check the trends every 24 hours and save the volumes, you can build up histories of trending topics.
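A sketch of that daily polling idea, assuming an app-auth bearer token and the v1.1 trends/place endpoint (the WOEID and output file are illustrative choices):

```python
import json
import time

import requests

BEARER_TOKEN = 'YOUR_APP_BEARER_TOKEN'  # placeholder
WOEID = 1  # 1 = worldwide; other ids come from trends/available

resp = requests.get(
    'https://api.twitter.com/1.1/trends/place.json',
    headers={'Authorization': 'Bearer ' + BEARER_TOKEN},
    params={'id': WOEID},
)
resp.raise_for_status()

trends = resp.json()[0]['trends']
# tweet_volume is the 24-hour volume; it is None for low-volume trends.
volumes = {t['name']: t['tweet_volume'] for t in trends}

# Append one snapshot per run; schedule this every 24 hours to build a history.
with open('trend_history.jsonl', 'a') as f:
    f.write(json.dumps({'ts': time.time(), 'volumes': volumes}) + '\n')
```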
Another option is the streaming API. This only gives you tweets as they happen, but you can use track to restrict the stream to a set of terms, which you can then analyze.
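A minimal sketch of consuming the v1.1 statuses/filter stream with track, assuming user-context OAuth 1.0a credentials (all four values are placeholders):

```python
import requests
from requests_oauthlib import OAuth1

auth = OAuth1('CONSUMER_KEY', 'CONSUMER_SECRET',
              'ACCESS_TOKEN', 'ACCESS_TOKEN_SECRET')  # placeholders

resp = requests.post(
    'https://stream.twitter.com/1.1/statuses/filter.json',
    auth=auth,
    data={'track': 'some topic'},
    stream=True,  # keep the HTTP connection open for the live stream
)

for line in resp.iter_lines():
    if line:  # skip keep-alive newlines
        print(line.decode('utf-8')[:120])  # each non-empty line is one tweet as JSON
```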
Any external service, like TweetReach, will probably either cost you money or strictly limit the amount you can do with it unless you pay.
I'm the Social Media Manager for Union Metrics (we make TweetReach and lots of other things) and I just wanted to let you know that our free snapshots are built on the Search API, which gives them the restrictions you've already discussed above, while our full snapshot reports can grab up to 1,500 tweets for $20.
We do have more comprehensive Twitter analytics, which I think you've already looked at, and those backfill 30 days of data before tracking forward. However, you might have missed our new product Echo, which allows a full, interactive search of the entire Twitter archive (you can see it in action here: https://unionmetrics.com/product/echo-twitter-archive-search/) and is available through our Social Suite.
I understand if you don't have a large budget, and I completely understand the dilemma of weighing the cost of your time to build what you need against budget restrictions. Hope this helps at least let you know what else we offer!
Sarah A. Parker
Social Media Manager | Union Metrics
Fine Makers of TweetReach, The Union Metrics Social Suite, and more

How to get unblocked after exceeding the Google Geocode API usage limit?

I have searched for answers on this for two days now with very little luck.
I'm developing a Drupal 7 site which has a Geofield field auto-populated from an address field using the Google Geocoding API, but as of a couple of days ago this stopped working:
Exception: Google API returned bad status.\nStatus: OVER_QUERY_LIMIT in geocoder_google() (line 52 of /home/.../modules/geocoder/plugins/geocoder_handler/google.inc).
I can remove the proximity search filter that is sending too many requests to the Google API, but I can't make progress because I run into the above error every time I try to add a new record to the database (which performs just one lookup to get a geocode from an address field, but fails). Is there any way to unblock my site from Google's API or reset my usage? I've added an API key, but to no avail. This was all working fine until very recently, which I guess is when I unknowingly exceeded the usage limit.
I have limited API experience and am a Drupal/PHP beginner so please be gentle! Happy to provide more info, code, error messages etc if needed. Relevant Drupal 7 modules being used are OpenLayers, OpenLayers Proximity, Geofield, GeoPHP and Geocoder. Thanks for any help anyone can offer.
From Google Geocode Documentation:
Use of the Google Geocoding API is subject to a query limit of 2,500 geolocation requests per day. (Users of Google Maps API for Business may perform up to 100,000 requests per day.) This limit is enforced to prevent abuse and/or repurposing of the Geocoding API, and this limit may be changed in the future without notice. Additionally, we enforce a request rate limit to prevent abuse of the service. If you exceed the 24-hour limit or otherwise abuse the service, the Geocoding API may stop working for you temporarily. If you continue to exceed this limit, your access to the Geocoding API may be blocked.
So, I guess you have to wait 24 hours, or upgrade to the business version.
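If the block is temporary, one workaround in code is to detect OVER_QUERY_LIMIT and retry with exponential backoff. A minimal Python sketch against the Geocoding web service (the Drupal module would need the PHP equivalent):

```python
import time

import requests

def geocode(address, api_key, retries=5):
    """Geocode an address, backing off when OVER_QUERY_LIMIT is returned."""
    for attempt in range(retries):
        resp = requests.get(
            'https://maps.googleapis.com/maps/api/geocode/json',
            params={'address': address, 'key': api_key},
        )
        data = resp.json()
        if data['status'] == 'OVER_QUERY_LIMIT':
            time.sleep(2 ** attempt)  # back off: 1 s, 2 s, 4 s, ...
            continue
        return data
    raise RuntimeError('Still over the query limit after %d retries' % retries)
```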