Not allowed to exceed free API quota for Time Zone API on GCP

I have an issue in one of my projects where the Time Zone API won’t let me increase the quota beyond the free tier. Billing is enabled for this project, but the limit is stuck at 2,500 requests/day.
According to the Usage Limits documentation, since billing is enabled I should be able to increase my limit to 100k/day.
I have several other projects running that will let me increase up to 100k/day. I just created a new project a few minutes ago, and sure enough, after enabling billing, I’m able to increase to 100k/day.
It seems broken only on this one project. I’ve disabled and re-enabled the API, etc., to no avail.
I started using an API key from another project just to get around it, but billing just got disabled for that project, so I’m back to having issues again.
I’ve tried leaving feedback via the Cloud Console and asking in a couple of other places, but I’m being told those aren’t the right places to ask, and leaving feedback multiple times has yielded no results after about a month.
I’m kind of at a loss now on where to go. Any suggestions on how to get this working would be very welcome at this point.
This is what I see in my project (with billing enabled): [screenshot in original post]
This is what I SHOULD see in my project: [screenshot in original post]

Related

Google Workspace Migrate Gmail API Limit

I'm using Google's Workspace Migrate tool to move Gmail data for users from one Workspace domain to another. After about 10 minutes, Gmail message migrations stop with an error saying "Quota exceeded for quota metric 'Queries' and limit 'Queries per minute per user' of service 'gmail.googleapis.com' for [Google Cloud project number]."
I don't see in Google Cloud that I'm actually hitting any limits. I don't have the ability to throttle API requests as I'm using a tool provided by Google. Do "free" Google Cloud projects have different limits than "paid" projects?
I'm expecting Google's tool to work as advertised. I have a case open with their support, but I'm not getting anywhere fast when it comes to a solution.
Hello 👋 Not sure if you already got an answer to this, but I recommend checking the Quotas page on GCP to see your current usage. You can access that here. Search for the "Queries per minute per user" metric for the Gmail API and look at your "Seven-day peak usage percentage". One note, though: since your question was posted more than seven days ago, you might need to rerun a migration to see your current usage.
Regarding your question about the limit for "free projects", I can't really help there. I can tell you that on our project (which has a billing account attached), we have the limit set to 15,000.
You can always ask Google to increase your quota if you're not getting enough for your use case.
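If you'd rather check this programmatically than through the console UI, the Service Usage API exposes the same per-consumer quota data. Here's a rough sketch; the project number and access token are placeholders, and the exact response field names are worth double-checking against the Service Usage API reference:

```javascript
// Sketch: list the Gmail API quota metrics for a project via the Service Usage API (v1beta1).
// The project number and access token are placeholders; field names should be verified
// against the Service Usage API reference.
const PROJECT = 'projects/123456789012'; // your project number
const ACCESS_TOKEN = 'ya29....';         // e.g. from `gcloud auth print-access-token`

async function listGmailQuotaMetrics() {
  const url = `https://serviceusage.googleapis.com/v1beta1/${PROJECT}/services/gmail.googleapis.com/consumerQuotaMetrics`;
  const res = await fetch(url, { headers: { Authorization: `Bearer ${ACCESS_TOKEN}` } });
  if (!res.ok) throw new Error(`Service Usage API error: ${res.status}`);

  const body = await res.json();
  for (const metric of body.metrics ?? []) {
    console.log(metric.displayName);
    for (const limit of metric.consumerQuotaLimits ?? []) {
      for (const bucket of limit.quotaBuckets ?? []) {
        console.log(`  ${limit.unit}: effective limit ${bucket.effectiveLimit}`);
      }
    }
  }
}

listGmailQuotaMetrics().catch(console.error);
```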

Calculating front-end performance metrics via Web APIs (Navigation Timing API and Performance Timeline API)

In order to calculate First Contentful Paint, I used the command below in my browser console:
window.performance.getEntriesByType('paint') — from that, I fetched the start time of the first-contentful-paint entry, which is startTime: 710.1449999972829 ms.
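For reference, this is roughly the snippet I ran; the 'first-contentful-paint' entry name is what Chromium-based browsers report through the Paint Timing API:

```javascript
// Run in the browser console after the page has painted.
const paintEntries = performance.getEntriesByType('paint');
const fcp = paintEntries.find(entry => entry.name === 'first-contentful-paint');

if (fcp) {
  // startTime is in milliseconds, relative to the start of navigation.
  console.log(`First Contentful Paint: ${fcp.startTime.toFixed(1)} ms`);
} else {
  console.log('No first-contentful-paint entry yet (the page may not have painted).');
}
```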
Reference
But if I audit the same page via Lighthouse (from Chrome DevTools), the First Contentful Paint calculated by Lighthouse is 1.5 s.
I am trying to understand why there is such a wide difference between the two numbers. I tried running the audit a couple of times via Lighthouse, and the data still hardly matches the Web API data.
Can anyone explain why there is such a huge difference? Should I go ahead with the data from the Web APIs, or should I consider the Lighthouse data the valid one?
Thank you for the great question, I learned something today because of it.
It appears that even in desktop view there is some throttling applied to the CPU; this didn't use to be the case as far as I am aware.
I found this article that explains the current throttling policy.
The key part here is as follows:
Starting with Chrome 80, the Audits panel is simplifying the throttling configuration:
1. Simulated throttling remains the default setting. This matches the setup of PageSpeed Insights and the Lighthouse CLI default, so this provides cross-tool consistency.
2. No throttling is removed as it leads to inaccurate scoring and misleading metric results.
3. Within the Audits panel settings, you can uncheck the Simulated throttling checkbox to use Applied throttling. For the moment, we are keeping this Applied throttling option available for users of the View Trace button. Under applied throttling, the trace matches the metrics values, whereas under Simulated things do not currently match up.
Point 3 is the main part. Basically the throttling is still applied to the CPU on desktop. Also note they say "for the moment" so this is obviously something they are considering removing in the future.
This is actually a really good idea: most developers are running powerful hardware, while most consumers are running cheap off-the-shelf laptops with i3 processors (or equivalent... or worse!).
As Google spends a lot of time refining Lighthouse, I would leave Simulated throttling ON and use their results, as they will be more indicative of what an end user might see.
Switching off simulated throttling
If you want your trace results (or console Performance API results) to match, uncheck "Simulated throttling" at the top of the page.

YouTube API Services Compliance Review

I have a project where I need to have the API quota increased significantly beyond the default 10,000 units per day, and I think this is being processed by Google as part of a YouTube API Services Compliance Review.
However, I have not had any response in over a week and the delay is putting the project at risk of a delayed launch and additional costs.
Does anyone know if this is normal and if there is a way to expedite the review, or speak to someone? Even pay for a higher tier of support?
Thanks in advance.
If you’ve filled in the audit form https://support.google.com/youtube/contact/yt_api_form?hl=en properly, you should get a response within two weeks. Google reviews thousands of these and, among other things to prevent abuse, this is one of the processes that isn’t fully automated.
I recommend, if you’re in a rush: since you’re paying for credits, you might as well open a second account and load-balance between two or even three accounts; in your code you can keep counters and swap keys before hitting the 24-hour cap. I’m not sure what data you’re looking to extract, but depending on what it is, you may even be able to use other services to supplement.
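A rough sketch of that counter-and-swap idea (the keys, the daily cap, and the per-call cost here are placeholders; real quota cost varies by endpoint, and the counters would also need to reset every 24 hours, which is omitted for brevity):

```javascript
// Sketch: rotate between API keys from separate projects before a key's daily quota is spent.
const keys = [
  { key: 'API_KEY_PROJECT_A', used: 0 },
  { key: 'API_KEY_PROJECT_B', used: 0 },
];
const DAILY_QUOTA = 10000; // default YouTube Data API quota, in units per day

function pickKey(cost) {
  // Use the first key that still has room for this request's quota cost.
  const entry = keys.find(k => k.used + cost <= DAILY_QUOTA);
  if (!entry) throw new Error('All keys exhausted for this 24-hour window');
  entry.used += cost;
  return entry.key;
}

async function getVideo(videoId) {
  const key = pickKey(1); // videos.list is a low-cost call; search.list costs far more units
  const url = `https://www.googleapis.com/youtube/v3/videos?part=snippet&id=${videoId}&key=${key}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`YouTube API error: ${res.status}`);
  return res.json();
}
```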
They will get back to you about your application; it just requires massive patience.

"reasonable" use of web APIs to sync data

My goal is to synchronize a web-application with an internal database. The web-application has a public API, but in order to fully synchronize the two sources I would need to make around 2000 separate API calls every time. My instinct tells me that this is excessive and possibly irresponsible, but I lack the experience to know for sure.
In this particular case the web-application is Asana, but I've encountered similar situations before with other services. Is there any way to know if you're abusing a service through excessive API calls? I know I'm not going to DOS a company like Asana, but I can't shake the feeling that there must be a better way than making ~150k requests per day.
The only other option I can think of is to update the web-service only when I know there's been a change in the database, but I'll lose a lot of capability that way.
I apologize for the subjectivity of this question, but I'm really hoping that someone can explain if there's any kind of etiquette that's expected when using public APIs.
(I work at Asana)
This is an excellent question, or rather set of questions.
You are designing a system that will repeatedly make requests for every object. What will happen as the number of objects grows? Even if your initial request rate were reasonable, this would suffer problems with scalability. A more scalable solution is one that scales with the number of changes in the system. This will also grow over time, but much more slowly - the number of changes a single user can make per day is relatively constant, but the total number of objects they've created over time grows and grows. So my first piece of advice would be to avoid doing things this way, and instead find a way to detect changes and just act on those. It would be interesting to know why you feel you'll lose capability by taking this approach.
Now, I happen to know that the Asana API does not currently provide you with any friendly mechanism to just detect changes in the system. This is a commonly requested feature and we are looking into it, though I unfortunately cannot promise a delivery date. So you might be left with no choice but to poll our system for now.
As for being polite to the API, many service providers set limits on their API usage to prevent accidental or malicious use of the API from impacting the service to their other customers -- Asana is no exception. Sometimes these limits are published, other times not, and there is no standard limit: it all depends on the service. But it is very thoughtful of you to be curious about service limitations.
That said, 150k requests per day is, for the Asana API, kind of a lot. If all of our API users gave us that much traffic, we might be serving more requests per day than Google Web Search, and we're not quite that scalable yet. :) Technically, we might sometimes handle requests at that volume from a single user.
If you must poll, try to poll at intervals of something like 15 minutes. But please do not poll your entire workspace at that frequency; it's likely to be too much traffic/data. We're working on providing you with a better solution.
If you do happen to make too many requests of the Asana API, you will get back HTTP status code 429 instead of your desired response; you can read more about that here (https://asana.com/developers/documentation/getting-started/errors).
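To make the polite-polling part concrete, here's a rough sketch of a 15-minute poll loop that backs off when it receives a 429. It's written against the current Asana REST API, which postdates this answer, so the endpoint, the opt_fields parameter, and the Retry-After header are assumptions worth verifying against our developer docs:

```javascript
// Sketch: poll a project's tasks every 15 minutes, backing off when the API returns 429.
// The token and project gid are placeholders; Asana documents a Retry-After header
// (in seconds) on rate-limited responses.
const ASANA_TOKEN = 'personal-access-token';
const PROJECT_GID = '1234567890';
const POLL_INTERVAL_MS = 15 * 60 * 1000;

async function fetchTasks() {
  const res = await fetch(
    `https://app.asana.com/api/1.0/projects/${PROJECT_GID}/tasks?opt_fields=name,modified_at`,
    { headers: { Authorization: `Bearer ${ASANA_TOKEN}` } }
  );

  if (res.status === 429) {
    // Respect the server's requested wait time before trying again.
    const waitSeconds = Number(res.headers.get('Retry-After') ?? 60);
    console.warn(`Rate limited; retrying in ${waitSeconds}s`);
    await new Promise(resolve => setTimeout(resolve, waitSeconds * 1000));
    return fetchTasks();
  }

  if (!res.ok) throw new Error(`Asana API error: ${res.status}`);
  const { data } = await res.json();
  return data;
}

setInterval(() => fetchTasks().then(tasks => {
  // Compare modified_at against the last sync time and act only on changed tasks.
  console.log(`Fetched ${tasks.length} tasks`);
}).catch(console.error), POLL_INTERVAL_MS);
```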

How to make a tag cloud app that posts to a website?

I want to make an app where the users can post messages that will be displayed on a website. The users would need to create a username and password to be able to post.
The app would be like Twitter, but users would only be able to post through the app and read the last few posts, and would not be able to write private messages.
The website would function like a huge cloud of thoughts where everyone could go and read what others have written. Once posts hit the cloud, they can't be deleted; only I could delete posts.
All posts would have a different color and font size, so it would look like a huge tag cloud on the website.
How do I make an app and a website like this?
David H
The tutorial application for Google App Engine is an unstyled version of what you describe. They'll even host it for you for free (up to a non-trivial level of usage).
The tag cloud creation is not so very hard but without knowing your preferred language it is hard to point you to helpful libraries (there are plenty out there).
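To give a flavor of the sizing logic those libraries implement, here's a tiny sketch in JavaScript (all names here are made up for illustration): scale each tag's font size between a minimum and a maximum based on how often it appears.

```javascript
// Sketch: map each tag's count to a font size between MIN_PX and MAX_PX.
const MIN_PX = 12;
const MAX_PX = 48;

function tagCloudSizes(counts) {
  // counts: { word: numberOfPosts, ... }
  const values = Object.values(counts);
  const min = Math.min(...values);
  const max = Math.max(...values);

  return Object.fromEntries(
    Object.entries(counts).map(([word, count]) => {
      // Linear scale; many clouds use a log scale so rare tags stay readable.
      const t = max === min ? 1 : (count - min) / (max - min);
      return [word, Math.round(MIN_PX + t * (MAX_PX - MIN_PX))];
    })
  );
}

// Example output: { hello: 12, world: 48, thoughts: 21 }
console.log(tagCloudSizes({ hello: 3, world: 40, thoughts: 12 }));
```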
Getting people to use it will be the hard part.
added in response to comment:
Good luck on your endeavor. I would be surprised if you weren't able to learn everything you need to know and have a working web app by the time school starts. I found a simple stand-alone tag cloud creation library that explains what it does and will run on GAE. So now even that part is in place for you.
I'm tempted to make some pathetic reference to the sorts of computing that I did prior to high school, but I expect that you probably have SD data cards with more computational power than I had available to me. Kids these days! ;)