Do purchased additional queries for the Google Custom Search JSON API roll over? - google-custom-search

The documentation for Google Custom Search JSON/ATOM API is unclear regarding pricing for additional queries. It states:
The API provides 100 search queries per day for free. If you need more, you may sign up for billing in the API Console. Additional requests cost $5 per 1000 queries, up to 10k queries per day.
For those who use the API in excess of the initial free 100: does the $5/1,000 block of additional queries reset each day, or does it roll over to subsequent days?
For instance, if I have a total number of queries on 3 consecutive days of 110, 120, and 130, will the account be billed $5 each day for the 10, 20, and 30 extra queries? Or will I be billed $5 the first day and by the end of the 3rd day I'll still have a bucket of 940 additional queries left to use for future overages?

In case anyone is still looking for an answer to this question (as I was a day ago): it turns out that for the Google Custom Search API, the $5/1,000-query billing is prorated.
Google Cloud Support emailed me the following:
With the Google Custom Search, you have a free 100 search queries per day which resets daily. But for example if you exceeded the daily limit, you will be charged $5 per 1,000 queries. In case that you go over the 100 queries per day and did not reach the 1,000 queries, it will be prorated from the $5. This resets daily as well. Please take note that 10k queries is the maximum daily.
I then clarified with them the following example, which they agreed was correct.
An example of making an average of 180 JSON API queries/day:
100 queries/day are free
80 queries/day are charged at $5 * (80/1000) = $0.40/day
Monthly it would be $12.00 ($0.40 * 30)
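To make the proration concrete, here is a quick sketch of the math, assuming the $5 per 1,000 rate, the 100 free queries, and the 10k/day cap quoted above:

```python
def daily_cost(queries, free=100, rate_usd=5.00, per=1000, max_daily=10000):
    """Cost in USD for one day's usage under prorated billing."""
    billable = max(min(queries, max_daily) - free, 0)
    return billable * rate_usd / per

print(daily_cost(180))       # 0.4  -> $0.40/day
print(daily_cost(180) * 30)  # 12.0 -> $12.00 over a 30-day month
```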

Related

How many short urls can be generated per day for Firebase Dynamic Links?

https://firebase.google.com/docs/dynamic-links/rest
The document says "Requests are limited to 5 requests/IP address/second, and 200,000 requests/day.".
So which is correct?
"200,000 total requests per day" or "200,000 requests per IP address per day"?
Look at it this way: 5 requests/IP address/second is a rate; you can't exceed that number of requests sent per IP address per second.
Whereas 200,000 requests/day is the total limit of requests you can send per day.
So the answer you're looking for is 200,000 total requests per day.
If you require it, you can request to increase your quota using this form.
Enter the following fields to request the increase in Firebase Dynamic Links API quota. We have default spread quota (200,000 queries per day) and burst (500 queries per 100 seconds). Consider spreading your load over a longer period of time before requesting.
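If you want to stay safely under both limits client-side, here is a minimal sketch assuming the documented defaults (5 requests/IP/second burst, 200,000 requests/day spread); the QuotaGate class is my own naming, not part of any Firebase SDK:

```python
import time

class QuotaGate:
    """Client-side pacing for a per-second burst limit and a daily cap."""

    def __init__(self, per_second=5, per_day=200_000):
        self.per_second = per_second
        self.per_day = per_day
        self.recent = []      # timestamps of calls within the last second
        self.sent_today = 0   # reset at midnight (omitted for brevity)

    def acquire(self):
        if self.sent_today >= self.per_day:
            raise RuntimeError("daily quota exhausted")
        now = time.monotonic()
        self.recent = [t for t in self.recent if now - t < 1.0]
        if len(self.recent) >= self.per_second:
            # Sleep until the oldest call in the window ages out.
            time.sleep(1.0 - (now - self.recent[0]))
        self.recent.append(time.monotonic())
        self.sent_today += 1

# gate = QuotaGate(); call gate.acquire() before each Dynamic Links request
```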

Calculate Conversion "Weight" based on Multiple Conversions

I'd like to estimate the value of each "conversion" starting from a free trial signup all the way to that user becoming a paid user.
Let's say I have an online coding course website that offers a free 30-day trial. After the trial period, the cost is $100 per month.
I bring in 100,000 users/mo to the signup page of my site via Google Ads paid search
I consistently get ~1,000 free trial signups per month (1% conversion rate). All signups are considered free trial users
Of the 1,000 trial users, 500 log in exactly 2x within the first week (let's call these L2W1 for Logged in 2x within Week 1)
Of the 1,000 trial users, 100 log in at least 3x within the first week (L3W1). These users are mutually exclusive with the L2W1 group above
Of the 500 L2W1 users, 50 users (10%) sign up for at least 1 course
Of the 100 L3W1 users, 50 users (or 50%) sign up for at least 1 course
On average, I get 35 paid users monthly
10 of the 50 L2W1 Course Signups become Paid Users
25 of the 50 L3W1 Course Signups become Paid Users; in other words, 50% of them convert
To recap, assume these are the only events that I am currently tracking:
100,000 Site Visitors ---> 1,000 Trial Signups
500 L2W1 ---> 50 Course Signups (CS) ---> 10 Paid Users = $1,000
100 L3W1 ---> 50 Course Signups (CS) ---> 25 Paid Users = $2,500
Total: 35 Paid Users (PU) ---> $3,500
To keep things simple, let's ignore life-time value (e.g. the average user subscribes for 3.5 months).
QUESTION: Is there a mathematical equation I could use to assign values (or percentages) to each conversion event? As I receive more information, I'd like to provide that to Google as a signal that each conversion type down the funnel is a sign of a more qualified user and therefore more valuable to my business.
I could simply divide the $3,500 by the 1,000 Trial Signups (ignoring all other conversion types), in which case each Trial Signup is worth $3.50. However, only 35/1,000 Trial Users convert to Paid Users, so that leaves valuable information on the table that I'm not feeding to Google for automated bidding purposes.
I'm thinking something like this is better:
1 Trial Signup = $1.00
1 L2W1 = $2.00
1 L3W1 = $3.00
1 CS = $4.00
1 PU = $90.00
.. so a Paid User who goes through several of the steps above will equate to about $100. Not sure if this is a good approach or if the math makes sense. Any pointers or tips would be greatly appreciated.
I've been reading up on Bayes' Theorem, and it seems like a good model for this case, but I'm not familiar enough with it to know whether it applies to this situation.
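One way to ground those numbers (a sketch of one possible approach, not something Google prescribes) is to value each event at its expected paid revenue, P(Paid User | event) × $100, computed straight from the funnel above:

```python
PRICE = 100  # $/month; first month only, since LTV is ignored as stated

funnel = {
    # event:         (event count, paid users downstream of it)
    "Trial Signup":  (1000, 35),
    "L2W1":          (500, 10),
    "L3W1":          (100, 25),
    "Course Signup": (100, 35),   # 50 + 50 signups -> 10 + 25 paid
    "Paid User":     (35, 35),
}

for event, (count, paid) in funnel.items():
    print(f"{event}: ${paid / count * PRICE:.2f}")
# Trial Signup: $3.50, L2W1: $2.00, L3W1: $25.00,
# Course Signup: $35.00, Paid User: $100.00
```

Note that this makes an L2W1 worth less than a raw Trial Signup (10/500 = 2% vs. 35/1,000 = 3.5% conversion to paid), because L3W1 users drive most of the paid conversions; that is exactly the kind of signal the flat $1/$2/$3 guesses above would hide.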

Is there a way to export more than 10000 records to AWS s3 from Google reporting analytics API daily?

I have a GA 360 view that gets a decent amount of traffic daily, and I want to export the hit-level data (using the GA client ID) to an AWS S3 bucket. The limitation here is that the GA API allows only 10,000 records a day. Someone suggested that if we put the GA client ID in a custom dimension, the limit would not apply. Is that true? Please let me know if there is another solution to export more than 10,000 records for a single view per day. Note that this will be a single query that auto-runs daily at a specific time.
Thank you so much in advance.
10000 records a day.
Correction: it's 10,000 requests per day per view, and a single request's response can include millions of records (rows).
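For what it's worth, paging through a large report and pushing it to S3 looks roughly like this. A sketch assuming the Reporting API v4 via google-api-python-client and boto3; the view ID, credentials file, custom-dimension index, and bucket name are all placeholders:

```python
import json

import boto3
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)
analytics = build("analyticsreporting", "v4", credentials=creds)

rows, page_token = [], None
while True:
    request = {
        "viewId": "XXXXXXX",  # placeholder GA 360 view ID
        "dateRanges": [{"startDate": "yesterday", "endDate": "yesterday"}],
        "dimensions": [{"name": "ga:dimension1"}],  # client ID custom dimension
        "metrics": [{"expression": "ga:hits"}],
        "pageSize": 100000,  # v4 maximum rows per page
    }
    if page_token:
        request["pageToken"] = page_token
    report = analytics.reports().batchGet(
        body={"reportRequests": [request]}).execute()["reports"][0]
    rows.extend(report["data"].get("rows", []))
    page_token = report.get("nextPageToken")
    if not page_token:
        break

boto3.client("s3").put_object(
    Bucket="my-bucket", Key="ga/export.json", Body=json.dumps(rows))
```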

flight-offers-search and flight-cheapest-date-search - limit by number of connections and layover duration

I am testing flight-offers-search and flight-cheapest-date-search.
Are there parameters available to limit results by number of connections and by layover duration? I didn't see them in the docs.
Also, is there functionality to fetch future prices for a given period, e.g. the average price for 2-week trips over the next month, 3 months, or 1 year?
Thank you.
Regarding your first point: as of today, the Flight Offers Search API doesn't offer a parameter to control layover time; you will have to check the response and filter on your side. For the number of connections, you can filter direct versus non-direct flights using the nonStop parameter. Then, if you want to limit the number of stops, you have to do it by filtering the response, looking at the number of segments inside the itineraries (sketched below).
Flight Cheapest Date Search has a similar parameter to control the direct and non-direct offers: nonStop.
Regarding your second point: not directly. You can approximate it by:
Using the Flight Offers Search API to run multiple searches and averaging the prices you find
Using the Flight Cheapest Date Search API to do the same (keep in mind that this API uses a pre-computed cache and supports a limited set of origin-destination pairs)
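For the client-side filtering, something like this works against the Flight Offers Search response. A sketch only: max_stops and max_layover are illustrative thresholds, and offers is the data array the API returns:

```python
from datetime import datetime, timedelta

def filter_offers(offers, max_stops=1, max_layover=timedelta(hours=3)):
    """Keep offers whose every itinerary respects the stop and layover limits."""
    def itinerary_ok(itinerary):
        segments = itinerary["segments"]
        if len(segments) - 1 > max_stops:   # stops = segments - 1
            return False
        for a, b in zip(segments, segments[1:]):
            arrive = datetime.fromisoformat(a["arrival"]["at"])
            depart = datetime.fromisoformat(b["departure"]["at"])
            if depart - arrive > max_layover:
                return False
        return True

    return [o for o in offers if all(itinerary_ok(i) for i in o["itineraries"])]
```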

Dealing with Amazon Product Advertising API Throttle limits

For those of you who use the Amazon Product Advertising API, what experience have you had with running into their throttle? Supposedly the limit is set at 1 request per second; is that your experience?
I want my site to grow nationwide, but I'm concerned about its ability to make all the Amazon API requests without getting throttled. We cache all responses for 24 hours, and we also throttle our own users who make too many searches within a short period.
Should I be concerned? Any suggestions?
I believe they have changed it. Per this link:
https://forums.aws.amazon.com/message.jspa?messageID=199771
Hourly request limit per account = 2,000 + 500 * [Average associate revenue driven per day over the past 30 days period]/24 to a maximum of 25,000 requests per hour.
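For example, a hypothetical account averaging $240 of associate revenue per day would get 2,000 + 500 × 240 / 24 = 7,000 requests per hour under that formula.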
Here is the latest on request limits that I could find, effective Sept 3rd, 2012.
If your application is trying to submit requests that exceed the maximum request limit for your account, you may receive error messages from Product Advertising API. The request limit for each account is calculated based on revenue performance. Each account used to access the Product Advertising API is allowed an initial usage limit of 1 request per second. Each account will receive an additional 1 request per second (up to a maximum of 10 requests per second) for every $4,600 of shipped item revenue driven per hour in a trailing 30-day period.
https://affiliate-program.amazon.com/gp/advertising/api/detail/faq.html
They have updated their guidelines; you now get more requests when you sell more items.
Effective 23-Jan-2019, the request limit for each account is calculated based on revenue performance attributed to calls to the Product Advertising API (PA API) during the last 30 days.
Each account used for Product Advertising API is allowed an initial usage limit of 8640 requests per day (TPD) subject to a maximum of 1 request per second (TPS). Your account will receive an additional 1 TPD for every 5 cents or 1 TPS (up to a maximum of 10) for every $4320 of shipped item revenue generated via the use of Product Advertising API for shipments in the last 30 days.
Source: https://docs.aws.amazon.com/AWSECommerceService/latest/DG/TroubleshootingApplications.html
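In practice, the simplest defense is to pace your calls to the 1 TPS baseline and back off when you do get throttled. A minimal sketch, where call_paapi and the Throttled exception are hypothetical stand-ins for however your client sends a request and surfaces a RequestThrottled error:

```python
import random
import time

class Throttled(Exception):
    """Raised by call_paapi (hypothetical) on a RequestThrottled response."""

MIN_INTERVAL = 1.0  # seconds between calls: the 1 TPS initial limit
_last_call = 0.0

def throttled_call(params, max_retries=5):
    global _last_call
    for attempt in range(max_retries):
        # Pace to at most one request per second.
        wait = MIN_INTERVAL - (time.monotonic() - _last_call)
        if wait > 0:
            time.sleep(wait)
        _last_call = time.monotonic()
        try:
            return call_paapi(params)  # hypothetical request function
        except Throttled:
            # Exponential backoff with jitter before retrying.
            time.sleep((2 ** attempt) + random.random())
    raise Throttled(f"gave up after {max_retries} retries")
```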
Amazon enforces limits on how many calls you can make per hour and per second.
You can increase the former by following the sanctioned route (increase commission revenue) or by privately petitioning Amazon with a valid reason. When whitelisted, your limit will go up to 25,000 calls per hour, which is more than good enough for the vast majority of projects I can think of.
The latter limit is murkier and enforced depending on the type of query you make. My interpretation is that it is meant to keep serial crawlers who do batch item lookups in check. If you are simply doing keyword searches etc., I would not worry so much about it. Otherwise, the solution is to distribute your calls across multiple IPs.
One other point to keep in mind if you are querying multiple locales: use separate accounts per locale. Some locales are grouped and count toward the same call quota. The European Amazons, for instance, form one such pool.