We are building an app with video upload functionality. We were wondering if we could use a YouTube account to upload all of our user videos. They should only be accessible via our app... we don't mind if ads show up while viewing them.
If the app grows, we're looking at potential thousands of uploads per day.
Does YouTube support this? If a few videos get flagged, will the "master" account be shut down?
Finally, if YouTube is not the right choice, do you have any recommendation? We would like to avoid hosting the videos ourselves as much as possible... since streaming large amounts of video is an enormous challenge for a startup.
Thank you!
Some information on the video uploads:
https://developers.google.com/youtube/v3/docs/videos/insert
This method supports media upload. Uploaded files must conform to these constraints:
Maximum file size: 128GB
Accepted Media MIME types: video/*, application/octet-stream
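For a sense of what this looks like in practice, here is a minimal upload sketch using the official google-api-python-client; the credentials setup, file path, and metadata are placeholders, and note that "unlisted" is the closest privacy setting to "only accessible via our app":

```python
# Minimal videos.insert sketch using google-api-python-client.
# Assumes OAuth2 credentials with the youtube.upload scope already
# exist; the file path and metadata below are placeholders.
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

def upload_video(credentials, path="my_video.mp4"):
    youtube = build("youtube", "v3", credentials=credentials)
    request = youtube.videos().insert(
        part="snippet,status",
        body={
            "snippet": {"title": "App upload", "description": "Uploaded via API"},
            # "unlisted" hides the video from search, but anyone with
            # the link can still watch it -- there is no "app only" privacy.
            "status": {"privacyStatus": "unlisted"},
        },
        # Resumable upload handles large files (up to the 128GB limit).
        media_body=MediaFileUpload(path, chunksize=-1, resumable=True),
    )
    response = None
    while response is None:
        status, response = request.next_chunk()  # uploads one chunk at a time
    return response["id"]
```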
You can get the quota information here: https://developers.google.com/youtube/v3/getting-started#quota
Projects that enable the YouTube Data API have a default quota
allocation of 1 million units per day, an amount sufficient for the
overwhelming majority of our API users.
...
Different types of operations have different quota costs.
A simple read operation that only retrieves the ID of each returned
resource has a cost of approximately 1 unit. A write operation has a
cost of approximately 50 units. A video upload has a cost of
approximately 1600 units.
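To put those costs in perspective: at roughly 1,600 units per upload, the old 1,000,000-unit default allowed about 625 uploads per day, while the current 10,000-unit default (quoted further below) allows only about 6. Thousands of uploads per day would therefore require a substantially increased quota allocation.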
Yes, YouTube can block API access, not only over flagged videos but at any time, as described here: https://developers.google.com/youtube/terms/api-services-terms-of-service#termination
24.2 Termination by YouTube. Notwithstanding anything to the contrary, YouTube reserves the right to (i) suspend or terminate access to, or
use of, any aspects of the YouTube API Services by you, your API
Client(s) and those acting on your behalf), and (ii) terminate the
Agreement (or any portion thereof), as applied to any specific user or
API Client, category of users or API Clients, or all users or API
Clients at any time. For example, we may need to exercise such rights
in instances of your breach of this Agreement, court order, when we
believe there to have been misconduct or conduct which may create
potential liability for YouTube or its Affiliates. Although we will
try to give you reasonable notice, we have no obligation to do so.
Related
YouTube Data API v3: queries per minute have a 1,800,000 limit, but queries per day have a 10,000 limit (see attachment). In our opinion this is quite unreasonable; is it a bug?
Not a bug. https://developers.google.com/youtube/v3/guides/quota_and_compliance_audits
Projects that enable the YouTube Data API have a default quota allocation of 10,000 units per day, an amount sufficient for the majority of our API users. You can see your quota usage on the Quotas page in the API Console.
If you would like to request additional quota beyond the default allocation, you must first complete an audit to show that your project is in compliance with the YouTube API Services Terms of Service. This gives YouTube visibility into the intended use cases of large projects and ensures that YouTube's API services are being used in a manner that is free from abuse. Visit this link for additional details on complying with YouTube’s Developer Policies.
I have a project where I need to have the API quota increased significantly from the 10,000 daily hits, and I think this is being processed by Google as part of a YouTube API Services Compliance Review.
However, I have not had any response in over a week and the delay is putting the project at risk of a delayed launch and additional costs.
Does anyone know if this is normal and if there is a way to expedite the review, or speak to someone? Even pay for a higher tier of support?
Thanks in advance.
If you've filled out the audit form https://support.google.com/youtube/contact/yt_api_form?hl=en properly, you should get a response within two weeks (Google reviews thousands of these; to prevent abuse, among other things, this is one of the processes that isn't fully automated).
If you're in a rush, since you're paying for credits anyway, I recommend opening a second account and load-balancing between two or even three accounts; in your code you can keep counters and swap accounts before capping out the 24-hour quota. I'm not sure what data you're looking to extract, but depending on what it is, you may even be able to use other services to supplement.
They will get back to you about your application; it just requires massive patience.
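A minimal sketch of the counter-and-swap idea, assuming one prebuilt API client per account; the daily budget and the rolling 24-hour window are assumptions, since YouTube's own quota accounting is authoritative:

```python
# Sketch of client-side quota counters that rotate between API projects
# before a daily budget is exhausted. DAILY_BUDGET, the rolling window,
# and the list of prebuilt clients are assumptions; YouTube's own
# accounting is authoritative, this only approximates it locally.
import time

DAILY_BUDGET = 10_000   # default units/day per project
UPLOAD_COST = 1_600     # approximate units per videos.insert

class QuotaRotator:
    def __init__(self, clients):
        self.clients = clients                 # one client per project/account
        self.used = [0] * len(clients)
        self.window_start = time.time()

    def pick(self, cost):
        # Reset counters once the 24-hour window rolls over.
        if time.time() - self.window_start >= 86_400:
            self.used = [0] * len(self.clients)
            self.window_start = time.time()
        for i, client in enumerate(self.clients):
            if self.used[i] + cost <= DAILY_BUDGET:
                self.used[i] += cost
                return client
        raise RuntimeError("all projects exhausted for this window")

# usage: client = rotator.pick(UPLOAD_COST), then call client.videos().insert(...)
```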
We're currently in the process of recording 1 fps time-lapsed 4K 360° photos of every island in the Bahamas, with embedded GPS EXIF data. An average hour of filming tends to produce around 600 image frames, which can easily expand to 2,000-10,000 images per day on bigger routes. 2,000 or so are approved on Google Maps already, but we're hitting a larger brick wall.
The Street View app is obviously the best way to upload when you have 50-100 image files, but it struggles once a batch hits 500+ uploads (publishing doesn't start, or the app crashes), so we're left manually submitting collections. Add that to the standard 4,000/day quota, and it's quite a challenge.
Having looked at the Publish API, it's rather tricky to leave a CLI tool running, since the API is designed with an OAuth flow and 1-hour access tokens in mind. The service account route seems to be the way to go, but the PHP API client seems to have scant documentation for Street View Publishing. Connecting photos is also tricky with that many images.
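The workaround we can see is refreshing the short-lived token in code; a rough sketch with google-auth, where the client ID, secret, and stored refresh token are placeholders from a one-time consent flow:

```python
# Rough sketch: keep a long-running CLI authorized by refreshing the
# short-lived access token from a stored refresh token (google-auth).
# The three constants are placeholders from a one-time OAuth consent flow.
from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request

CLIENT_ID = "your-client-id.apps.googleusercontent.com"  # placeholder
CLIENT_SECRET = "your-client-secret"                     # placeholder
REFRESH_TOKEN = "stored-refresh-token"                   # placeholder

creds = Credentials(
    token=None,
    refresh_token=REFRESH_TOKEN,
    token_uri="https://oauth2.googleapis.com/token",
    client_id=CLIENT_ID,
    client_secret=CLIENT_SECRET,
    scopes=["https://www.googleapis.com/auth/streetviewpublish"],
)

def authorized_headers():
    # Force a refresh whenever the current access token has expired,
    # so the 1-hour lifetime never interrupts a long batch run.
    if not creds.valid:
        creds.refresh(Request())
    return {"Authorization": f"Bearer {creds.token}"}
```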
We ideally need a desktop uploader (like the backup tool), or a way to directly import from folders in Google Drive. The first seems discontinued, and there's no data on the second.
Can anyone explain or elucidate on the best practice for this kind of volume upload with the Street View publishing service?
Since you are only capturing 4K images, it will be much simpler to use video mode, provided your camera supports it at the desired framerate. You may check this documentation for more information.
Additional information:
You may also check out some of the desktop utilities at the bottom of the Street View website which may be able to help.
You may want to consider one of the Street View ready cameras (also listed on the above webpage) that is capable of recording and uploading 360 videos to Street View. Upon publication, the 360 photo frames are extracted from the video and used to create an automatically connected Street View experience on Google Maps.
Check these pages to learn more about this option:
Set up and connect 360 cameras
Capture and publish in Video mode with the Street View app
Tips for capturing 360 videos for Street View
I am working on a product that fetches all of a customer's organization/workspace and app details. The customer can refresh them at any time.
So let's say I have one customer with 100 applications across multiple workspaces: one refresh makes around 110 calls to get each application's details, the workspace details, and the organizations.
Now if that customer refreshes the applications multiple times, say 10 times in an hour, that action alone is over 1,000 API calls. If I have 50 such active users doing this, it will be something like 50,000.
AFAIK I cannot make that many API calls in an hour, so how do I handle this scenario? I know a lot of applications do this kind of thing, so I want to understand how everyone else is handling it.
If you need a higher rate limit, I would encourage you to contact Podio support and ask specifically for what you need. We have internal guidelines for evaluating these kinds of requests and may increase the limit for your user and client ID if appropriate.
In general, though, I would expect your app to implement some kind of batching, transient storage, and/or caching layers, especially if your customers are interacting with Podio exclusively or primarily through your system.
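To illustrate the caching layer, a minimal TTL cache sketch; the ten-minute TTL and the fetch_app_details helper are assumptions, not Podio specifics:

```python
# Illustrative TTL cache in front of Podio fetches, so repeated
# "refresh" clicks within the TTL don't cost API calls. The ten-minute
# TTL and fetch_app_details() are assumptions, not Podio specifics.
import time

_cache = {}        # key -> (expires_at, value)
TTL_SECONDS = 600

def cached(key, fetch):
    now = time.time()
    hit = _cache.get(key)
    if hit and hit[0] > now:
        return hit[1]
    value = fetch()                  # only hits the Podio API on a miss
    _cache[key] = (now + TTL_SECONDS, value)
    return value

# usage: details = cached(("app", app_id), lambda: fetch_app_details(app_id))
```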
Please see our official statement here: https://developers.podio.com/index/limits
Summary:
The general limit is 5,000 API calls per hour, but if the API call is marked as "Rate limited" in the API reference the call is deemed resource intensive and a lower rate of 1000 calls per hour is enforced. If you hit the rate limits the API will begin returning 420 HTTP error codes for all API calls. Rate limits are per user per API key.
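As an illustration of handling those 420 responses, a small retry sketch using the requests library; the backoff schedule is an assumption, and since the limits reset hourly a real client may need to wait much longer or queue the work:

```python
# Sketch: retry with a pause when Podio returns 420 (rate limited).
# The wait schedule is an assumption; limits reset hourly, so a real
# client may need to wait considerably longer or queue the work.
import time
import requests

def podio_get(url, headers, retries=3):
    for attempt in range(retries):
        resp = requests.get(url, headers=headers)
        if resp.status_code != 420:
            resp.raise_for_status()
            return resp.json()
        time.sleep(60 * (attempt + 1))  # back off 1, 2, then 3 minutes
    raise RuntimeError("still rate limited after retries")
```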
Contacting support:
If you have a project that requires a higher rate limit, contact support@podio.com with a brief description of your project, your estimated usage, and the client_id of the API key you are using.
Usage tips:
Tips for reducing API usage
Avoid making API requests inside loops. Instead of fetching individual objects inside a loop, fetch a collection of objects in one API operation. E.g. filter items
Cache results whenever possible. This is especially true when you are displaying data to the public (i.e. everyone sees the same output).
Don't poll for changes. Instead of polling Podio to see if your content has changed, use webhooks or push to receive a notification (see the sketch after this list). This might save you thousands of requests: https://developers.podio.com/doc/hooks
Use logging to see how many requests you're making
Bundle responses with "fields" parameter
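As a sketch of the webhook tip, a minimal Flask receiver; per https://developers.podio.com/doc/hooks, Podio POSTs form-encoded events and a new hook must answer a hook.verify challenge, while the handler functions here are placeholders:

```python
# Minimal webhook receiver as an alternative to polling. Per
# https://developers.podio.com/doc/hooks Podio POSTs form-encoded
# events; a new hook must answer a hook.verify challenge by posting
# the code back to /hook/{hook_id}/verify/validate.
from flask import Flask, request

app = Flask(__name__)

def validate_hook(hook_id, code):
    # Placeholder: POST {"code": code} to
    # https://api.podio.com/hook/{hook_id}/verify/validate (authenticated).
    pass

def on_item_changed(item_id):
    pass  # placeholder: invalidate or refresh your cached data here

@app.route("/podio-hook", methods=["POST"])
def podio_hook():
    event = request.form
    if event.get("type") == "hook.verify":
        validate_hook(event.get("hook_id"), event.get("code"))
    elif event.get("type") in ("item.create", "item.update", "item.delete"):
        on_item_changed(event.get("item_id"))
    return "", 200
```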
You might want to build an API proxy app; you would need a messaging queue and a rate limiter. This would let you keep track of API call consumption across apps and users.
Also worth noting: some API routes are more expensive than others because they are more resource intensive on the Podio side… The term in use is "rate limited": rate-limited API routes are bound to 1,000 calls an hour, so in effect they cost five times as much as regular routes.
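For the rate-limiter half, a token-bucket sketch that spends from per-user budgets matching the limits above; keying buckets by user and route tier is an assumption about how your proxy would work:

```python
# Token-bucket sketch for a proxy that spends from per-user budgets:
# 5,000/hour for regular routes, 1,000/hour for "rate limited" ones.
# Keying buckets by (user, tier) is an assumption about your proxy.
import time

RATES = {"regular": 5000 / 3600, "rate_limited": 1000 / 3600}  # tokens/second
CAPACITY = {"regular": 5000, "rate_limited": 1000}

_buckets = {}  # (user_id, tier) -> (tokens, last_refill_timestamp)

def allow(user_id, tier="regular"):
    now = time.time()
    tokens, last = _buckets.get((user_id, tier), (CAPACITY[tier], now))
    # Refill proportionally to the time elapsed, up to the hourly cap.
    tokens = min(CAPACITY[tier], tokens + (now - last) * RATES[tier])
    if tokens < 1:
        _buckets[(user_id, tier)] = (tokens, now)
        return False  # queue or reject this call
    _buckets[(user_id, tier)] = (tokens - 1, now)
    return True
```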
Hope this helps!
I am developing a web app for my university where users can create an account and upload images. Images are private and can only be seen by the person who uploaded them. In essence, it is like a cloud file system.
Each user has a free account with 500MB. I am using Amazon S3 to store the images, which is to say storage implies costs.
How can I prevent bots from uploading millions of MB? How can I prevent a bot from creating millions of new accounts and uploading 500MB per account, without affecting the user experience?
On the one hand, I definitely don't want to put a CAPTCHA in the registration form because it negatively affects the conversion rate. On the other, I don't want to pay thousands of dollars because a bot uploaded millions of dummy images.
Does anyone know whether Dropbox, Google Drive, etc. suffer from this (content uploaded by bots)? It seems not to be a problem, since I couldn't find anything about it. All the spam-related problems I could read about only covered spam in forums, which also makes sense: spam in forums can be read by other users, while spam in a service like Dropbox or Google Drive reaches no one. Nonetheless, I have to protect against it to avoid cost surprises.
As far as I can see, without using CAPTCHAs this can be done:
Set up monitoring systems that warn for specific abuse patterns (the same IP uploading lots of data and creating new accounts repeatedly).
Throttle users that follow those patterns; this will hopefully make them realize and make the process worthless. If this fails, then disable those accounts and have their owners mail/talk to you in order to explain what's happening.
Since you say it's a system for your university, make users provide proof of enrollment (e.g. a university e-mail address) in case of abuse.
Have this forbidden usage explicit in your terms of use.
Of course, a smart enough bot can work around all those problems.
For a more advanced solution, you might try some machine learning or AI that learns about normal and abnormal usage patterns, then applies that information to judge a possible abuser.
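A minimal sketch of that monitor-and-throttle idea, with a sliding window per IP; the window size and thresholds are arbitrary placeholders to tune against real traffic:

```python
# Minimal sliding-window throttle for the patterns above: too many
# signups, or too much upload volume, from one IP within a window.
# The window size and limits are placeholders to tune on real traffic.
import time
from collections import defaultdict, deque

WINDOW = 3600                              # seconds
LIMITS = {"signup": 5, "upload_mb": 1000}  # per IP per window

_events = defaultdict(deque)               # (ip, kind) -> deque of (ts, amount)

def allowed(ip, kind, amount=1):
    q = _events[(ip, kind)]
    now = time.time()
    while q and q[0][0] < now - WINDOW:    # evict events outside the window
        q.popleft()
    if sum(a for _, a in q) + amount > LIMITS[kind]:
        return False                       # throttle; alert/disable on repeats
    q.append((now, amount))
    return True
```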
I would recommend that you:
make users register using their email
don't allow multiple accounts for a single email
send them an email registration confirmation, and deactivate "unconfirmed" accounts after a short amount of time (e.g. 3 days)
AFAIK, Drupal embeds this kind of control out of the box or with little effort (and no programming).
This won't solve all your problems, but it will reduce the risk of bot exploits.
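For the confirmation-link part, a small sketch using signed, expiring tokens (itsdangerous is one common choice; the secret key is a placeholder and the 3-day expiry mirrors the suggestion above):

```python
# Sketch of signed email-confirmation tokens (itsdangerous is one
# common choice; the secret key is a placeholder). Unconfirmed
# accounts can be purged after the same 3-day window.
from itsdangerous import URLSafeTimedSerializer, SignatureExpired, BadSignature

serializer = URLSafeTimedSerializer("replace-with-a-real-secret-key")

def confirmation_token(email):
    return serializer.dumps(email)          # embed in the emailed link

def confirm(token, max_age=3 * 24 * 3600):  # 3 days, as suggested above
    try:
        return serializer.loads(token, max_age=max_age)  # returns the email
    except (SignatureExpired, BadSignature):
        return None                         # expired or tampered link
```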
As you said you need registration, there are two angles for tackling this problem: make sure no bots register, and/or limit the number of uploads.
I would personally use both. For the user signup, design a form where the user has to enter their email address, send them a mail with a link in it, and activate their account only after the link is clicked. Or have the user solve a simple math question on signup.
For the second point, you can store the number of uploaded bytes per user over time. You can then set a quota on allowed upload volume per unit of time, for example: you may not upload more than 10MB per hour. If a user hits this limit more than n times, you can deactivate their account; a rough sketch follows.
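Here is that quota check, with an in-memory dict standing in for a real datastore; the deactivation hook is a placeholder:

```python
# Rough sketch of the per-user quota: at most 10MB per hour, with a
# strike count that deactivates repeat offenders. The in-memory dict
# and deactivate() hook are placeholders for a real datastore/flow.
import time

LIMIT_BYTES = 10 * 1024 * 1024  # 10MB per hour, as in the example
MAX_STRIKES = 3                 # "n times" before deactivation

_usage = {}                     # user_id -> (window_start, bytes_used, strikes)

def deactivate(user_id):
    pass  # placeholder: disable the account, notify the owner

def may_upload(user_id, size):
    now = time.time()
    start, used, strikes = _usage.get(user_id, (now, 0, 0))
    if now - start >= 3600:     # new hourly window
        start, used = now, 0
    if used + size > LIMIT_BYTES:
        strikes += 1
        _usage[user_id] = (start, used, strikes)
        if strikes >= MAX_STRIKES:
            deactivate(user_id)
        return False
    _usage[user_id] = (start, used + size, strikes)
    return True
```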
And: set up an alerting and monitoring system. For example, monitor the number of non-activated users, monitor the amount of uploads, etc., and set up alerts if these exceed a certain threshold.
The methods mentioned above may not be perfect and probably won't block out all bots, but they will at least make it much harder for bots to upload unwanted data. They are also quite simple, so you can start off with your project and see whether this is really a problem. And if bots do upload data, you will at least receive alerts and can devise a better solution afterwards.