How many short URLs can be generated per day for Firebase Dynamic Links?

https://firebase.google.com/docs/dynamic-links/rest
The documentation says, "Requests are limited to 5 requests/IP address/second, and 200,000 requests/day."
So which is correct: 200,000 total requests per day, or 200,000 requests per IP address per day?

Look at it this way: 5 requests/IP address/second is a rate limit, so you can't exceed that number of requests per IP address in any given second. The 200,000 requests/day, on the other hand, is the total number of requests you can send per day.
So the answer you're looking for is 200,000 total requests per day.
If you need more, you can request a quota increase using this form.
Fill in the form's fields to request an increase to the Firebase Dynamic Links API quota. The defaults are a spread quota of 200,000 queries per day and a burst quota of 500 queries per 100 seconds. Consider spreading your load over a longer period of time before requesting an increase.
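If you're generating links in bulk from a single machine, it's worth throttling client-side so you stay under the 5 requests/second rate. Here's a minimal sketch in Python against the documented shortLinks REST endpoint; the API key and the long link are placeholders you'd supply yourself:

```python
import time
import requests

# Documented REST endpoint for creating short Dynamic Links.
ENDPOINT = "https://firebasedynamiclinks.googleapis.com/v1/shortLinks"
WEB_API_KEY = "YOUR_WEB_API_KEY"  # placeholder: your project's Web API key

def shorten(long_link, min_interval=0.2):
    """Create one short link, pacing calls so we stay under 5 requests/second."""
    start = time.monotonic()
    resp = requests.post(
        ENDPOINT,
        params={"key": WEB_API_KEY},
        json={"longDynamicLink": long_link},
    )
    resp.raise_for_status()
    # Sleep out the remainder of the 200 ms window (1 second / 5 requests).
    elapsed = time.monotonic() - start
    if elapsed < min_interval:
        time.sleep(min_interval - elapsed)
    return resp.json()["shortLink"]
```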

Related

How to increase quota beyond 1,000 requests per day for the Maps JavaScript API

I added a Visa card to enable billing, but I'm still limited to 1,000 requests per day. I need about 20,000 requests per day, but I can't raise the quota because the console only accepts values from 0 to 1,000.
Please help! Thank you.
There is no such 1,000-requests-per-day limit; see the documentation.
20,000 requests/day * 30 days = 600,000 requests per month.
For more than 500,000 requests per month, the pricing page says to contact sales for volume pricing.
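Just to make that arithmetic concrete, a trivial sketch (the 500,000/month threshold is the one quoted from the pricing page):

```python
requests_per_day = 20_000
requests_per_month = requests_per_day * 30   # 600,000
# Per the pricing page, volumes above 500,000/month mean contacting sales.
print(requests_per_month, requests_per_month > 500_000)  # 600000 True
```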

Configuring Maximum rows per request in Dataflow BigQuery

I am using this template:
https://cloud.google.com/dataflow/docs/templates/provided-templates#cloudpubsubtobigquery
Reading quota limits under Maximum rows per request here: https://cloud.google.com/bigquery/quotas#streaming_inserts
They recommend a maximum of 500 rows per request.
Where can I configure Maximum rows per request in BigQuery sink?
I've searched the whole documentation but did not find any relevant info.
The default value for "Maximum rows per request" for streaming inserts in BigQuery is 10,000 rows, as mentioned here. The recommended number of rows is 500 per request, but you can experiment with a lower or higher number. There is no configuration to limit or modify the number of rows to 500; the only hard limit is 10,000 rows per request. Quota increases can be requested through the Quotas page in your project, and the quota can be increased in increments of 50,000 rows.
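There's no knob in the Dataflow template itself, but if you're streaming into BigQuery directly from your own code, you can enforce the recommended batch size client-side. A sketch using the google-cloud-bigquery Python library; the table ID is a placeholder:

```python
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.my_dataset.my_table"  # placeholder table ID

def stream_rows(rows, batch_size=500):
    """Stream rows in chunks of 500, the recommended maximum per request."""
    for i in range(0, len(rows), batch_size):
        batch = rows[i:i + batch_size]
        errors = client.insert_rows_json(table_id, batch)
        if errors:
            raise RuntimeError(f"Streaming insert failed: {errors}")
```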

Do Google Custom Search JSON API purchased additional queries rollover?

The documentation for the Google Custom Search JSON/ATOM API is unclear regarding pricing for additional queries. It states:
The API provides 100 search queries per day for free. If you need more, you may sign up for billing in the API Console. Additional requests cost $5 per 1000 queries, up to 10k queries per day.
For those who use the API in excess of the initial free 100 queries, does the $5/1,000 additional-query allotment reset each day, or does it roll over to subsequent days?
For instance, if my query totals on three consecutive days are 110, 120, and 130, will the account be billed $5 each day for the 10, 20, and 30 extra queries? Or will I be billed $5 on the first day, leaving a bucket of 940 additional queries at the end of the third day to use for future overages?
In case anyone is still looking for an answer to this question (as I was a day ago): it turns out that for the Google Custom Search API, the $5/1,000-query billing is prorated.
Google Cloud Support emailed me the following:
With Google Custom Search, you have 100 free search queries per day, which reset daily. If you exceed the daily limit, you will be charged $5 per 1,000 queries. If you go over the 100 free queries but don't reach 1,000 paid queries, the charge is prorated from the $5. This resets daily as well. Please note that 10k queries is the daily maximum.
I then clarified with them the following example, which they agreed was correct.
An example of making an average of 180 JSON API queries/day:
100 queries/day are free
80 queries/day are charged at $5 * (80/1000) = $0.40/day
Monthly, it would be $12.00 ($0.40 * 30)
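The same proration, written out as a quick sketch:

```python
FREE_PER_DAY = 100
PRICE_PER_1000 = 5.00  # USD per 1,000 billable queries

def daily_cost(queries_today):
    billable = max(0, queries_today - FREE_PER_DAY)
    return PRICE_PER_1000 * billable / 1000

# 180 queries/day -> 80 billable -> $0.40/day -> $12.00 over 30 days
print(daily_cost(180))       # 0.4
print(daily_cost(180) * 30)  # 12.0
```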

BigQuery: I have reached the daily limit for Load Jobs. When does the quota reset back to 0?

I have exceeded the daily limit on the number of imports to a specific table.
(Max = 1,000 imports, according to the documentation here: https://developers.google.com/bigquery/quota-policy#import)
I would like to know exactly when the quota resets. Is it 24 hours after I exceeded the quota, or at a specific time of day?
As of July 18, 2014, all daily quotas are partially replenished every 10 minutes or so.
The first time you run a load job to a table (or if you haven't done so in a while) you'll get 1000 loads. Every few minutes, the quota will partially replenish, up to a maximum of 1000 available.
While this sounds complex, it means you never get into a situation where you run out of daily quota and have to wait up to 24 hours for it to reset. Instead, if you run out of quota, you can start running jobs again fairly soon thereafter (as long as you stay within the replenishment rate).
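In practice, this means a failed load can just wait and retry instead of aborting for the day. A rough sketch with the google-cloud-bigquery Python client; treating any 403 that mentions quota as retryable is my assumption, not a documented contract:

```python
import time

from google.api_core import exceptions
from google.cloud import bigquery

client = bigquery.Client()

def load_with_retry(uri, table_id, retries=5, wait_secs=600):
    """Retry a load job, waiting ~10 minutes for quota to replenish."""
    for attempt in range(retries):
        try:
            job = client.load_table_from_uri(uri, table_id)
            return job.result()  # raises if the job itself fails
        except exceptions.Forbidden as e:
            # Assumption: daily-quota errors surface as 403s mentioning "quota".
            if "quota" not in str(e).lower() or attempt == retries - 1:
                raise
            time.sleep(wait_secs)  # roughly one replenishment cycle
```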
Hope that is helpful.

Dealing with Amazon Product Advertising API Throttle limits

For those of you who use the Amazon Product Advertising API, what experience have you had with their throttling? Supposedly the limit is 1 request per second; is that your experience?
I want my site to grow nationwide, but I'm concerned about its ability to make all the Amazon API requests without getting throttled. We cache all responses for 24 hours, and we also throttle our own users who make too many searches in a short period.
Should I be concerned? Any suggestions?
I believe they have changed it. Per this link:
https://forums.aws.amazon.com/message.jspa?messageID=199771
Hourly request limit per account = 2,000 + 500 * [average associate revenue driven per day over the past 30-day period] / 24, up to a maximum of 25,000 requests per hour.
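That formula, written out (the revenue figure is average associate revenue per day over the trailing 30 days):

```python
def hourly_request_limit(avg_daily_revenue):
    """2,000 base + 500 * (average daily revenue) / 24, capped at 25,000/hour."""
    return min(2_000 + 500 * avg_daily_revenue / 24, 25_000)

print(hourly_request_limit(0))     # 2000.0  (base limit)
print(hourly_request_limit(1104))  # 25000.0 (cap reached)
```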
Here is the latest on request limits that I could find, effective Sept 3rd, 2012.
If your application is trying to submit requests that exceed the maximum request limit for your account, you may receive error messages from Product Advertising API. The request limit for each account is calculated based on revenue performance. Each account used to access the Product Advertising API is allowed an initial usage limit of 1 request per second. Each account will receive an additional 1 request per second (up to a maximum of 10 requests per second) for every $4,600 of shipped item revenue driven per hour in a trailing 30-day period.
https://affiliate-program.amazon.com/gp/advertising/api/detail/faq.html
They have updated their guidelines; you now get more requests when you sell more items.
Effective 23-Jan-2019, the request limit for each account is calculated based on revenue performance attributed to calls to the Product Advertising API (PA API) during the last 30 days.
Each account used for the Product Advertising API is allowed an initial usage limit of 8,640 requests per day (TPD), subject to a maximum of 1 request per second (TPS). Your account will receive an additional 1 TPD for every 5 cents, or 1 TPS (up to a maximum of 10) for every $4,320, of shipped item revenue generated via the use of the Product Advertising API for shipments in the last 30 days.
Source: https://docs.aws.amazon.com/AWSECommerceService/latest/DG/TroubleshootingApplications.html
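Reading those rules literally, the limits work out as below; this is my interpretation of the quoted text, not official calculator code:

```python
def pa_api_limits(shipped_revenue_30d):
    """8,640 base TPD + 1 per $0.05; 1 base TPS + 1 per $4,320, capped at 10."""
    tpd = 8_640 + int(shipped_revenue_30d / 0.05)
    tps = min(1 + int(shipped_revenue_30d / 4_320), 10)
    return tpd, tps

print(pa_api_limits(0))       # (8640, 1)
print(pa_api_limits(38_880))  # (786240, 10)  TPS cap reached at $38,880
```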
Amazon enforces limits on how many calls you can make per hour and per second.
You can increase the former by following the sanctioned route (increase commission revenue) or by privately petitioning Amazon with a valid reason. When whitelisted, your limit goes up to 25,000 calls per hour, which is more than enough for the vast majority of projects I can think of.
The latter limit is murkier and enforced depending on the type of query you make. My interpretation is that it is meant to keep serial crawlers who do batch item lookups in check. If you are simply doing keyword searches etc., I would not worry so much about it. Otherwise, the solution is to distribute your calls across multiple IPs.
One other point to keep in mind if you are querying multiple locales is to use separate accounts per locale. Some locales are grouped and will count to the same call quota. European Amazons, for instance, form such a pool.
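To tie this back to the original question: caching for 24 hours plus a simple client-side limiter covers the per-second limit for most sites. A minimal sketch; `fetch_from_amazon` is a hypothetical stand-in for your actual PA API call:

```python
import time

CACHE_TTL = 24 * 3600  # cache responses for 24 hours, as in the question
_cache = {}            # key -> (cached_at, response)
_last_call = 0.0

def fetch_from_amazon(key):
    """Hypothetical stand-in for the real PA API request."""
    raise NotImplementedError

def throttled_lookup(key, min_interval=1.0):
    """Serve from cache when fresh; otherwise call at most once per second."""
    global _last_call
    cached = _cache.get(key)
    if cached and time.time() - cached[0] < CACHE_TTL:
        return cached[1]
    # Enforce the 1 request/second limit before hitting the API.
    wait = _last_call + min_interval - time.monotonic()
    if wait > 0:
        time.sleep(wait)
    _last_call = time.monotonic()
    response = fetch_from_amazon(key)
    _cache[key] = (time.time(), response)
    return response
```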