How to implement IoT with GCP: What are the limits of both cloud projects and service accounts per project? To what number can they be increased?

In short: What are the limits of both cloud projects and service accounts per project? How can they be increased? Is the architecture a good idea at all?
I am developing an IoT application with tens of thousands of devices planned for the field: a few devices per customer across hundreds of customers. Every device will continuously (24/7) stream measurement data directly to BigQuery, with one dataset (or table) per device, at sample rates of at least 100 Hz.
Of course, every device needs to be authenticated and authorized to gain tightly restricted access to its cloud dataset. As stated in the Auth Guide, API keys are not very secure. Therefore, the most appropriate solution seems to be one service account per customer with one account key per device (as suggested in this GCP article). However, the FAQs of Cloud IAM state that the number of service accounts is limited to 100 per project.
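To make the data path concrete, here is a minimal sketch of what a single device would do under this per-device-key assumption: load its own service-account key and stream rows into its own BigQuery table. Project, dataset, table, and field names are placeholders, and the batching/retry logic a 100 Hz stream needs is omitted.

```python
# Sketch only: device-side streaming insert with a per-device service-account key.
# Project, dataset, table, and field names below are placeholders.
from google.cloud import bigquery
from google.oauth2 import service_account

creds = service_account.Credentials.from_service_account_file(
    "device-key.json")                                 # one key file per device (assumption)
client = bigquery.Client(project="iot-project", credentials=creds)

table_id = "iot-project.customer_0042.device_0017"    # one table per device (placeholder)

rows = [
    {"ts": "2024-01-01T00:00:00.000Z", "channel": "accel_x", "value": 0.012},
    {"ts": "2024-01-01T00:00:00.010Z", "channel": "accel_x", "value": 0.013},
]

# Streaming insert; at 100 Hz you would buffer samples and send them in batches.
errors = client.insert_rows_json(table_id, rows)
if errors:
    raise RuntimeError(f"BigQuery insert errors: {errors}")
```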
That limit of 100 service accounts could be reached quickly. If so, how easy or costly is it to increase the limit to thousands or tens of thousands of service accounts per project?
In such a scenario, the number of projects needed could also easily grow to hundreds or thousands. Would this be feasible?
Is this overall concept practical or are there better approaches within GCP?

Related

Allowed to use multiple YouTube API keys for 1 project?

I am quickly reaching quota limits while using YouTube Data API v3 for searches only using 1 API key.
I have applied for Quota increase but I hear it can take some time.
However, I landed on the article below, which states that a maximum of 300 API keys can be created for one project. Am I really allowed to use multiple YouTube Data API v3 keys and switch between them each time the quota limit is reached?
https://cloud.google.com/docs/authentication/api-keys
I have been scrambling for solutions. I hope I read it correctly!
Keys and credentials within a project share that project's quota. Creating additional API keys within the same project in the Google developer console is not going to get you additional quota; all the credentials in a project share the same quota.
You would need to create additional projects and create a key within each project.
Each of these projects has its own credentials with its own quota.
You should wait for the extension. These days it shouldn't take more than a couple of weeks to hear back about a quota increase.
The answer is yes and no (but probably more no than yes in your case).
YES, you are allowed to use multiple YouTube Data API v3 keys and switch between them, but NO, you can't switch between them just because you have reached the quota limit.
By doing so, you violate YouTube's Developer Policies and expose yourself to sanctions. The only legitimate reason to switch between keys is to separate your environments.
From YouTube's Developer Policies:
Don’t create multiple Google Cloud projects for the same API service or use case in an attempt to deceptively acquire an API quota that is higher than the one your project was assigned.
It is acceptable to have a separate API Project for each different use case of your API service. Examples include:
One API project for your iOS app, a separate API Project for your Android app.
One API project for a production server, one for a development server.
One API project for your user-facing API service, one API project for internal system analytics.
Source: https://developers.google.com/youtube/terms/developer-policies-guide#don%E2%80%99t_spread_api_access_across_multiple_or_unknown_projects

Is it possible to increase the Google Sheets API quota limit beyond 2500 per account and 500 per user?

The problem: Running into Google Sheets API read/write quota limits. Specifically, the read/write requests per 100 seconds and read/write requests per 100 seconds per user quotas.
Some background:
For the past few months I've been developing a web app for students and staff in our school district which uses a Google spreadsheet as the database. Each school in our district was assigned a different Google spreadsheet, and a service account was created to make read and write calls to these spreadsheets on behalf of the web app.
We started with one school of approximately 1000 students, but it has now expanded to two other schools with a total user load of around 4000. Due to the nature of a school day schedule, we started hitting our quota limit (per 100 sec & per 100 sec per user) since almost everyone uses the app at the same time.
I found the usage limits guide for the Google sheets API, and as per the instructions I created a billing account, and linked the associated service account project to it. I then went to the quotas section in the developers console and applied for a higher quota. This involved filling out a Google form which asked "How much quota do you need? Express in number of API queries per day." Again, queries per day is not the problem, rather it's the number of queries per 100 seconds and per user (service account). After a couple of weeks our limit was increased to 2500 read/write requests per 100 seconds and 500 read/write requests per 100 seconds per user. The billing account was not charged, and after a little searching, I realized this was a free increase. This bump in our quota limit helped, but it's still going to be an issue because our district wants to add more schools in the future.
Here's what I need to know:
1) [ESSENTIAL QUESTION] Does Google have an upper limit or maximum to the number of read/write requests a single service account/user/IP can make within the 100 second time frame, and if so what is it?
2) If it is possible to go beyond our current quota limit (2500/500), is there another way of requesting/applying for the increase. Once again we have a billing account established for the project and are willing to pay for the service.
I've been pulling (what's left of) my hair out trying to find definitive answers to my questions. This post came close to what I was looking for, and I even did some of the things the OP suggested, but I just need a direct answer to my "essential" question.
Couple more things.
I understand that Google Charts Visualization doesn't have a quota limitation, and I'd consider using it; however, for privacy reasons I can't have the spreadsheet keys exposed in plain JavaScript. Are there other options here?
Also, one might suggest creating multiple service accounts, but I'd rather avoid this if possible.
Thank you for your help. I'm very much a novice and I greatly appreciate your time and expertise.
To answer your questions:
1) [ESSENTIAL QUESTION] Does Google have an upper limit or maximum to the number of read/write requests a single service account/user/IP can make within the 100 second time frame, and if so what is it?
The documentation only states that the Google Sheets API has a limit of 500 requests per 100 seconds per project and 100 requests per 100 seconds per user. Check this post for additional information.
2) If it is possible to go beyond our current quota limit (2500/500), is there another way of requesting/applying for the increase. Once again we have a billing account established for the project and are willing to pay for the service.
AFAIK, you can request a higher quota limit, and the Google engineers may grant the request as long as it is reasonable.
Also, you may check this thread for additional tips:
You can use spreadsheets.get to read the entire spreadsheet in a single call, rather than 1 call per request. Alternately, you can use spreadsheets.values.batchGet to read multiple different ranges in a single call, if all you need are the values.
The Drive API offers "push notifications", so you can get notified when changes occur and react to those, instead of polling for them. The latency of the notifications is a little on the slow side, but it gets the job done.
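For illustration, here is a minimal sketch of the batchGet approach with google-api-python-client; the key file path, spreadsheet ID, and ranges are placeholders:

```python
# Sketch: read several ranges of one spreadsheet in a single API call.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/spreadsheets.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)     # placeholder key file

sheets = build("sheets", "v4", credentials=creds)

# One batchGet call replaces several individual values().get() calls,
# which is what eats into the per-100-seconds quota.
result = sheets.spreadsheets().values().batchGet(
    spreadsheetId="YOUR_SPREADSHEET_ID",        # placeholder
    ranges=["Students!A2:F", "Staff!A2:D"],     # hypothetical ranges
).execute()

for value_range in result.get("valueRanges", []):
    print(value_range["range"], len(value_range.get("values", [])), "rows")
```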

Youtube API's maximum number of video uploads per day

We are building an app with video upload functionality. We were wondering if we could use a YouTube account to upload all of our user videos. They should only be accessible via our app... we don't mind if ads show up while viewing them.
If the app grows, we're looking at potentially thousands of uploads per day.
Does YouTube support this? If a few videos get flagged, will the "master" account be shut down?
Finally, if YouTube is not the right choice, do you have any recommendations? We would like to avoid hosting the videos ourselves as much as possible, since streaming large amounts of video is an enormous challenge for a start-up.
Thank you!
Some information on the video uploads:
https://developers.google.com/youtube/v3/docs/videos/insert
This method supports media upload. Uploaded files must conform to these constraints: Maximum file size: 128GB. Accepted Media MIME types: video/*, application/octet-stream.
You can get the quota information here: https://developers.google.com/youtube/v3/getting-started#quota
Projects that enable the YouTube Data API have a default quota allocation of 1 million units per day, an amount sufficient for the overwhelming majority of our API users.
...
Different types of operations have different quota costs. A simple read operation that only retrieves the ID of each returned resource has a cost of approximately 1 unit. A write operation has a cost of approximately 50 units. A video upload has a cost of approximately 1600 units.
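At roughly 1,600 units per upload, the default allocation of 1,000,000 units per day covers only about 625 uploads per day, so "thousands of uploads per day" would require a quota increase. Note also that videos.insert needs OAuth user credentials rather than a bare API key. A minimal upload sketch with google-api-python-client, where the credentials, file path, and metadata are assumed to be supplied by the caller:

```python
# Sketch: resumable upload of one video with the YouTube Data API v3.
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

def upload_video(creds, path, title, description):
    # `creds` must be OAuth 2.0 user credentials carrying the youtube.upload
    # scope, obtained elsewhere (e.g. via google-auth-oauthlib).
    youtube = build("youtube", "v3", credentials=creds)
    request = youtube.videos().insert(
        part="snippet,status",
        body={
            "snippet": {"title": title, "description": description},
            "status": {"privacyStatus": "unlisted"},   # keep uploads out of public search
        },
        media_body=MediaFileUpload(path, chunksize=-1, resumable=True),
    )
    response = None
    while response is None:
        # next_chunk() uploads the file piece by piece and can be retried.
        status, response = request.next_chunk()
    return response["id"]
```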
Yes, YouTube can block API access, not only over flagged videos but at any time, as described here: https://developers.google.com/youtube/terms/api-services-terms-of-service#termination
24.2 Termination by YouTube. Notwithstanding anything to the contrary, YouTube reserves the right to (i) suspend or terminate access to, or use of, any aspects of the YouTube API Services by you, your API Client(s) and those acting on your behalf), and (ii) terminate the Agreement (or any portion thereof), as applied to any specific user or API Client, category of users or API Clients, or all users or API Clients at any time. For example, we may need to exercise such rights in instances of your breach of this Agreement, court order, when we believe there to have been misconduct or conduct which may create potential liability for YouTube or its Affiliates. Although we will try to give you reasonable notice, we have no obligation to do so.

Geocoding API usage limits at project level or account level?

Would someone be kind enough to tell me whether the Google API usage limits specified here: https://developers.google.com/maps/documentation/geocoding/usage-limits are set at the project level or at the account level, please?
I'm using one API key for several maps on our website. Total calls per day limit is no problem at all. We're occasionally clocking more than 50 requests per second in peak times though.
If I create a new project, and get a new API key in the same account, will that mean we can hit 50 requests per second on one API key, and 50 requests per second separately on another API key...or are they calculated at the account level?
Many thanks everyone!
The documentation states the following:
Most of the Google Maps APIs have a complimentary per-day quota that can be set in the Google API Console. The daily default and maximum query limits vary by API. You can increase the complimentary daily limits by enabling billing, or purchasing a Google Maps APIs Premium Plan license. Quota limits are enforced on a unique project basis, and you may not take any action to circumvent quota limits. For example, you may not create multiple projects to compound and exceed quota limits.
https://developers.google.com/maps/faq#usage-limits
So, as you can see, the usage quota is calculated on a per-project basis. If you use two API keys from different projects, each one will have its own usage limits. Also, you cannot create an unlimited number of projects for one account; as far as I know, you can create approximately 16 projects within one account.
I hope this clarifies your doubt.
The usage limits are calculated at the account level, not the project or key level. They do this to prevent people from just creating unlimited projects to get around the acceptable usage limits that they are providing.
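Whichever level the quota is actually enforced at, a simple client-side throttle keeps a single key at or below the 50-requests-per-second ceiling. A minimal sketch against the Geocoding web service; the API key and addresses are placeholders:

```python
# Sketch: cap outgoing geocoding requests at MAX_QPS per second.
import time
import requests

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"
API_KEY = "YOUR_API_KEY"   # placeholder
MAX_QPS = 50               # stay at or below the documented per-second limit

def geocode_all(addresses):
    results = []
    interval = 1.0 / MAX_QPS
    for address in addresses:
        start = time.monotonic()
        resp = requests.get(
            GEOCODE_URL,
            params={"address": address, "key": API_KEY},
            timeout=10,
        )
        resp.raise_for_status()
        results.append(resp.json())
        # Sleep off the rest of this request's time budget to cap throughput.
        elapsed = time.monotonic() - start
        if elapsed < interval:
            time.sleep(interval - elapsed)
    return results
```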

PUT /d2l/api/le/1.0/{courseId}/grades/{gradeId}

The problem is that my application sends a large amount of user data, say around 1000+ entries. The API works in a linear fashion where one user's grade data is inserted at a time, which results in my product's session timing out. While we can always increase the session timeout on our end, I just wanted to check whether D2L provides any API that PUTs/pushes multiple user grades at once. Any alternative approach would also be appreciated.
Currently, the Valence Learning Framework API does not provide a way to do bulk-creation of grade objects or values. This is a feature that several clients have asked for and it is on the development roadmap for the platform; however, D2L does not yet have a firm estimate on delivery for this functionality.
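Until a bulk endpoint exists, one workaround is to keep the one-grade-per-request PUT calls but issue them from a bounded worker pool so the whole batch finishes before the product session times out. The sketch below is an assumption-laden illustration: the host, the route (taken from the question title), and the sign_url helper standing in for Valence ID-key URL signing are all placeholders to replace with your own code.

```python
# Sketch: push many single-grade PUTs concurrently with a bounded thread pool.
from concurrent.futures import ThreadPoolExecutor, as_completed

import requests

HOST = "https://your-d2l-host"   # placeholder

def sign_url(path):
    # Hypothetical stand-in for Valence app/user ID-key URL signing;
    # replace with your real signing helper.
    return path

def put_grade(course_id, grade_id, user_id, payload):
    # Route taken from the question title; adjust to the exact route you call today.
    path = f"/d2l/api/le/1.0/{course_id}/grades/{grade_id}"
    resp = requests.put(HOST + sign_url(path), json=payload, timeout=30)
    resp.raise_for_status()
    return user_id

def put_grades(course_id, grade_id, grade_values, max_workers=8):
    # grade_values: dict mapping user_id -> grade payload for that user.
    failures = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {
            pool.submit(put_grade, course_id, grade_id, uid, body): uid
            for uid, body in grade_values.items()
        }
        for fut in as_completed(futures):
            uid = futures[fut]
            try:
                fut.result()
            except requests.RequestException as exc:
                failures[uid] = str(exc)   # collect failures for a retry pass
    return failures
```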