Google BigQuery for free?

I'm new here and was actually looking at this:
https://temboo.com/hardware/google-big-query-getting-started
It's about how to connect sensors to Google BigQuery,
but I don't know whether it is free or not.
My usage per month would be around 1 GB.
Please tell me what I can get for free there. I'm an absolute beginner and don't want to get a big bill.
Thanks,
Petr

BigQuery charges for query processing and storage.
Query processing will likely be free in your case, since the first 1 TB per month is free, and you're only using 1 GB per month.
You will likely have to pay a small amount to store the data you're querying, but at that scale we're talking pennies ($0.02/GB/month).
Full pricing details:
https://cloud.google.com/bigquery/pricing
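To put the answer's numbers together, here is a rough back-of-the-envelope estimate in Python. The rates (first 1 TB of queries per month free, $5/TB beyond that, roughly $0.02/GB/month for storage) are taken from the answers in this thread and may change, so always check the pricing page:

```python
# Rough monthly cost estimate; rates are assumptions from the answer
# above (first 1 TB of query processing free, $5/TB after that,
# ~$0.02 per GB of storage per month) and may be out of date.
FREE_QUERY_TB = 1.0
QUERY_RATE_PER_TB = 5.0
STORAGE_RATE_PER_GB = 0.02

def estimate_monthly_cost(query_gb, stored_gb):
    query_tb = query_gb / 1024
    billable_tb = max(0.0, query_tb - FREE_QUERY_TB)
    query_cost = billable_tb * QUERY_RATE_PER_TB
    storage_cost = stored_gb * STORAGE_RATE_PER_GB
    return query_cost + storage_cost

# 1 GB queried and 1 GB stored per month, as in the question:
print(estimate_monthly_cost(query_gb=1, stored_gb=1))  # → 0.02
```

At 1 GB/month the query side stays well inside the free tier, so only the storage pennies remain.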

Related

If I pull the data from BigQuery, will Google charge me or not for sending the data to Data Studio?

That depends. BigQuery uses a consumption-based model (unless you have purchased slots). That means any time you run a query you're using resources and getting charged at the defined rate, $5 per TB of data scanned.
There are a few caveats to that, however: the first TB of data scanned per month is free, and not every query issued will scan data, since it may be served from cache. If you are concerned about the associated cost, one option is the BigQuery sandbox, which will not charge you but has limited functionality.
https://cloud.google.com/bigquery/docs/quickstarts/quickstart-cloud-console
BigQuery runs queries that you pay for.
Data Studio runs queries on BigQuery that you pay for.
There is no cost of transfer between the two systems.

Google BigQuery for realtime call records data

I am thinking of using Google BigQuery to store realtime call records, with around 3 million rows inserted per day and never updated.
I have signed up for a trial account and run some tests.
I have a few concerns before I can go ahead with development:
When streaming data via PHP, it sometimes takes around 10-20 minutes for rows to appear in my tables, and this is a show-stopper for us because network support engineers need this data updated in real time to troubleshoot quality issues.
Partitions: we can store data in partitions divided per day, but a single day's partition would be 2.5 GB, which pushes my cost to query the data into the range of thousands per month. Is there any other way to bring the cost down? We could partition data per hour, but no such support is available.
If not BigQuery, what other solutions are out there on the market that can deliver similar performance and solve these problems?
You have the "streaming insert" option, which makes records searchable within a few seconds (it has its price).
See: streaming-data-into-bigquery
Check table-decorators for limiting the query scan.
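As a sketch of the partitioning idea, with a day-partitioned table the scan (and therefore the bill) can be limited to a single day in standard SQL. The project, dataset, table, and column names below are hypothetical:

```sql
-- Hypothetical day-partitioned call-records table.
-- The _PARTITIONTIME filter restricts the bytes scanned
-- (and billed) to the 2016-05-01 partition only.
SELECT caller_id, callee_id, duration_sec
FROM `my_project.telephony.call_records`
WHERE _PARTITIONTIME = TIMESTAMP('2016-05-01')
```

Without the `_PARTITIONTIME` filter, a query would scan every partition in the table and be billed accordingly.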

How can I see my accumulated query costs for BigQuery while in the free trial?

Google Cloud billing is not updating during the free trial (on monthly payments), and I cannot change it to a faster update cycle. As per https://cloud.google.com/free-trial/docs/billing-during-free-trial the bill should come every month.
It is therefore not easy to see how much of the $300 is left.
Is there any way to at least see how many TBs my queries have used? This should be by far the biggest item on the bill.
I am concerned that I might get 'stuck' in the middle of some important queries that I otherwise could have managed better, so as to have at least partial results available after the trial ends.
BigQuery analysis & storage costs should be listed under your GCP billing transactions:
https://console.cloud.google.com/billing/<INSERT_YOUR_BILLING_ID_HERE>/history
Another way to see how much you have queried is by enabling audit logging as described here.

List all the queries made to BigQuery with their processed bytes

I would like to know if there is a method in the BigQuery API, or any other way, to list all the queries made together with their processed bytes. Something like what is listed in the Activity page, but with the processedBytes field:
https://console.cloud.google.com/home/activity?project=coherent-server-125913
We are having a problem with billing. Suddenly our BigQuery analysis costs have increased a lot, and we think we are being charged around 20 times more than expected (we check all the responses from the BigQuery API and save the processedBytes field, taking into account that the minimum charge is 10 MB).
The only way we can resolve this discrepancy is to list all the requests and compare them to our numbers, to see whether we aren't measuring something or are doing something wrong. We have opened a billing support ticket and they redirected me to Stack Overflow, as they think it is a technical issue.
Thanks in advance!
Instead of checking totalBytesProcessed, try checking totalBytesBilled and billingTier (see here).
You might have jumped to a higher billing tier, but that's just a guess.
The best place to check would be the BigQuery logs.
This is going to tell you what queries were run, who ran them, what date/time they were run, the total bytes billed etc.
Logs can be a bit tedious to look through but BigQuery allows you to stream BigQuery logs into a BigQuery table and you can then query said table to identify expensive queries.
I've done this and it works really well to give you visibility on your BQ charges. The process of how to do this is outlined in more detail here: https://www.reportsimple.com.au/post/google-bigquery
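As an alternative sketch, if trawling raw logs is too tedious, the INFORMATION_SCHEMA jobs views (also used in a later answer on this page) expose the same billing fields per job. For example, to find the most expensive recent queries in a project:

```sql
-- Ten most expensive query jobs in the project over the last 7 days.
-- Adjust the `region-us` qualifier to your dataset's region.
SELECT user_email, job_id, total_bytes_billed, query
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND job_type = 'QUERY'
ORDER BY total_bytes_billed DESC
LIMIT 10
```

This gives per-job visibility (who ran what, and how many bytes were billed) without setting up a log sink first.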

How do I find out how much of my 1TB free monthly allotment I've used in Google BigQuery?

I feel like there must be some way to figure out how much of the free 1 TB is left besides summing up the "bytes processed" amounts for every single query, but I haven't been able to find it anywhere in the console or elsewhere.
Unfortunately this is not super easy right now. The best way to get the answer is, as you said, to sum up the bytes processed for the queries you've run.
You can get the data via the BQ jobs.list API, or you can use BQ's audit logs if that's more convenient. The audit logs can even be queried in BQ, but of course that incurs additional usage. :-)
You can also see your unbilled usage for GCP services via the GCP console. However, this only shows BQ usage from the free 1 TB tier once you've incurred some actual BQ charges (i.e., once you've gone over the 1 TB), which makes it less useful for your particular use case.
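As a minimal sketch of that summing step: once you have the per-job bytes-processed figures (e.g. collected via jobs.list or from the audit logs), working out the remaining free tier is simple arithmetic. This assumes the flat 1 TB/month free query quota described above:

```python
# Sketch: given per-job "bytes processed" figures (e.g. from
# jobs.list or audit logs), compute how much of the free
# 1 TB/month query allotment remains.
TB = 1024 ** 4  # bytes in a tebibyte

def free_tier_remaining_tb(bytes_per_job, free_tb=1.0):
    used_tb = sum(bytes_per_job) / TB
    return max(0.0, free_tb - used_tb)

# Two jobs this month: 512 GB and 256 GB processed.
print(free_tier_remaining_tb([512 * 1024**3, 256 * 1024**3]))  # → 0.25
```

Note that cached queries report zero bytes processed, so they don't eat into the allotment.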
To expand on Jeremy's answer, you can also get that information directly via the INFORMATION_SCHEMA.JOBS_BY_* views, as I explained in my answer to How can I monitor incurred BigQuery billings costs (jobs completed) by table/dataset in real-time?, based on How to monitor query costs in Google BigQuery:
SELECT
  SUM(total_bytes_processed) * 5 / (1024 * 1024 * 1024 * 1024) AS estimated_cost_usd
FROM
  `region-us`.INFORMATION_SCHEMA.JOBS_BY_USER
Please consider the caveats mentioned in the linked answers (caching, region-specific costs per TB, ...)!