I've been trying out BigQuery for a week or so. I linked BigQuery to one of our organization's Firebase projects, using my free trial of Google Cloud Platform for this test. When I no longer wanted to use BQ, I deactivated the link between Firebase and BQ and downgraded our account from Blaze back to Spark. I assumed that, with the Blaze subscription cancelled and the link between Firebase and BQ removed, no more queries would be run in BQ. That was almost a week ago. However, queries have kept running; the last one ran yesterday. I am not sure whether I will be billed for these queries.
How do I stop all queries from running in BQ in the future? I can't seem to find an (easy) way to do that.
Thanks in advance.
Related
I need some help understanding what happened in our cloud project: a BigQuery resource has been running every 12 hours without us ever configuring it. It also seems quite intense, because we have been charged, on average, one dollar per day for the past month.
After checking in Logs Explorer, I saw several logs regarding the BigQuery resource.
The logs showed the email address of one of our software engineers. Since I removed him from our Firebase project, there have been no more requests.
However, that person did not configure anything related to BigQuery, so we are a bit lost here, which is why we are asking for help to investigate and understand what is going on.
Hope you will be able to help. Let me know if you need more information.
Thanks in advance
NB: I have not tried adding the software engineer's email back yet. I wanted to see how things go for the rest of the month.
The most likely causes I've observed for this in the wild:
A scheduled query was set up by a user.
A Data Studio dashboard was set up and configured to periodically refresh data in the background.
Someone set up a workflow that queries BigQuery, such as Cloud Composer or a Cloud Function.
It's also possible it's just something like a script running in a crontab on someone's machine. The audit log should have relevant details, such as the request origin, for cases where it's something running as part of a bespoke process.
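If you want to see concretely which account or process is issuing the queries, one option is the INFORMATION_SCHEMA jobs view. A minimal sketch (the region qualifier is an assumption; use the region your data lives in):

-- List recent query jobs in the project and who ran them.
SELECT
  creation_time,
  user_email,
  job_type,
  total_bytes_billed,
  query
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND job_type = 'QUERY'
ORDER BY creation_time DESC

The user_email column usually makes it clear whether the jobs come from a person, a service account, or a scheduled process.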
I'm looking for some best (simplest;)) practices here.
I have Google Analytics data that is sent to BigQuery on a daily basis, and a query that runs daily using the data from the previous day's table.
However, I can't be sure the table and its data are there at the time the query runs, so I'd like to check whether they are. If they aren't, I want to retry later.
Ideally I'd have some monitoring/alerting around this as well.
Of course this can be done within Google Cloud in many ways; I'm looking for best practices on how others do this.
I'm used to working with Airflow, but using Composer just for this seems a bit over the top. Cloud Run would be an option, and I'm sure there are others. I've also seen this question discussing how to handle a dependency in SQL; I'm just not sure whether I could have it retry using just SQL as well.
EDIT:
I've got the check for the table working in SQL. I guess I just have to see if BigQuery has a way to build in a delay, like 'WAITFOR'.
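For reference, a minimal sketch of such a check in BigQuery scripting (the project, dataset, and ga_sessions_ naming are assumptions based on the standard export):

-- Fail the script if yesterday's export table is not there yet, so whatever
-- scheduler runs this (Cloud Run, Workflows, etc.) can catch the error and retry.
DECLARE expected_table STRING DEFAULT
  CONCAT('ga_sessions_', FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)));

IF NOT EXISTS (
  SELECT 1
  FROM `my_project.my_dataset.INFORMATION_SCHEMA.TABLES`
  WHERE table_name = expected_table
) THEN
  RAISE USING MESSAGE = CONCAT('Missing table: ', expected_table);
END IF;

As far as I know there is no WAITFOR/sleep equivalent in BigQuery itself, so the retry delay has to live in whatever triggers the query.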
Our company has many scheduled reports in BigQuery that generate aggregation tables of Google Analytics data. Because we cannot control when the Google Analytics data is imported into our BigQuery environment, we keep getting days with no data.
This means we then have to manually rerun the queries for the missing days.
I have edited my scheduled query to keep pushing back the time of day it runs; however, it is now running around 8 AM. These queries feed reports for stakeholders, and the stakeholders are requesting them earlier. Is there any way to guarantee the processing times of the Google Analytics export to BigQuery?
You may also think about a Scheduled Query solution that reruns at a later time if the requested table isn't available yet.
You can't currently add a conditional trigger to a BigQuery scheduled query.
You could manually add a fail-safe to your query that checks for yesterday's table, using a combination of the code below and DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY):
-- Latest Google Analytics export date/time present in the wildcard tables.
SELECT
  MAX(FORMAT_TIMESTAMP('%F %T',
      TIMESTAMP(PARSE_DATE('%Y%m%d',
        REGEXP_EXTRACT(_TABLE_SUFFIX, r'^\d{8}')))))
FROM `DATASET.ga_sessions_*` AS ga_sessions
Obviously this will fail if the condition is not met and will not retry, which I understand is not much of an improvement on your current setup.
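To show what the combination could look like, here is a hedged sketch (DATASET is a placeholder, as above) that aborts the query with the ERROR() function when yesterday's export is missing:

-- Abort with an error if the newest ga_sessions_ table is older than yesterday.
SELECT
  IF(
    MAX(PARSE_DATE('%Y%m%d', REGEXP_EXTRACT(_TABLE_SUFFIX, r'^\d{8}')))
      >= DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY),
    'yesterday is present',
    ERROR('ga_sessions_ export for yesterday is missing'))
FROM `DATASET.ga_sessions_*`

A scheduled query that starts with a check like this still won't retry on its own, but at least the failure shows up in the scheduled query's run history.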
I've encountered this many times in the past and eventually had to move my data pipelines to another solution, as scheduled queries are still quite simplistic.
I would recommend you take a look at CRMint for simple pipelines into BigQuery:
https://github.com/google/crmint
If you still find this too simplistic, then you should look at Google Cloud Composer, where you can check that a table exists before running a particular job in a pipeline.
Excuse me if this is not a very precise question, but I just need to check whether I am missing something or whether it really is some kind of problem with Google Cloud (GC) BigQuery.
I've got a Java program that reads from a website and publishes the data to a GC Pub/Sub topic; a pipeline is up, pulling the messages from Pub/Sub and sending them to BigQuery via the template job offered in GC Dataflow. Finally, a Data Studio dashboard gets the data from the BigQuery table and builds its charts and so on...
The thing is, the whole process is working fine: I can see the resulting dashboard being populated correctly, BUT I cannot see the data in the table in BigQuery, even after refreshing the whole page. Sometimes the results only show up the following day (!).
Am I forgetting something, or is it GC BigQuery being incomplete while in beta?
As @Pentium10 said, the GUI preview is just for quick checks and takes some time to update itself. If you want to check whether the data is in the table, run a query.
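For example, a trivial check along these lines (the table path is a placeholder) is more reliable than the preview pane:

-- Count rows instead of relying on the preview, which lags behind streaming inserts.
SELECT COUNT(*) AS row_count
FROM `my_project.my_dataset.my_table`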
I'm trying to do log analysis with BigQuery. Specifically, I have an App Engine app and a JavaScript client that will be sending log data to BigQuery. In BigQuery, I'll store the full log text in one column but also extract important fields into other columns. I then want to be able to run ad hoc queries over those columns.
Two questions:
1) Is BigQuery particularly good or particularly bad at this use case?
2) How do I set up revolving logs? I.e., I want to store only the last N logs or the last X GB of log data. I see that delete is not supported.
Just so you know, there is an excellent demo of moving App Engine log data to BigQuery via App Engine MapReduce, called log2bq (http://code.google.com/p/log2bq/).
Re: "use case" - Stack Overflow is not a good place for judgements about best or worst, but BigQuery is used internally at Google to analyse really really big log data.
I don't see the advantage of storing full log text in a single column. If you decide that you must set up revolving "logs," you could ingest daily log dumps by creating separate BigQuery tables, perhaps one per day, and then delete the tables when they become old. See https://developers.google.com/bigquery/docs/reference/v2/tables/delete for more information on the Table.delete method.
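In current BigQuery SQL, the same table-per-day cleanup can also be expressed as DDL; a hedged sketch (the project, dataset, and table names are placeholders):

-- Drop a day's log table once it has aged out...
DROP TABLE IF EXISTS `my_project.logs.applog_20120101`;

-- ...or let BigQuery expire it automatically after 30 days.
ALTER TABLE `my_project.logs.applog_20120101`
SET OPTIONS (expiration_timestamp = TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 30 DAY));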
After implementing this, we decided to open-source the framework we built for it. You can see the details of the framework here: http://blog.streak.com/2012/07/export-your-google-app-engine-logs-to.html
If you want your Google App Engine (Google Cloud) project's logs to be in BigQuery, Google has added this functionality to the new Cloud Logging system. It is a beta feature known as "Logs Export":
https://cloud.google.com/logging/docs/install/logs_export
They summarize it as:
Export your Google Compute Engine logs and your Google App Engine logs to a Google Cloud Storage bucket, a Google BigQuery dataset, a Google Cloud Pub/Sub topic, or any combination of the three.
We use the "Stream App Engine Logs to BigQuery" feature in our Python GAE projects. This sends our app's logs directly to BigQuery as they occur, providing near-real-time log records in a BigQuery dataset.
There is also a page describing how to use the exported logs.
https://cloud.google.com/logging/docs/export/using_exported_logs
When you want to query logs exported to BigQuery over multiple days (e.g. the last week), you can use a SQL query with a FROM clause like this:
FROM
  (TABLE_DATE_RANGE(my_bq_dataset.myapplog_,
                    DATE_ADD(CURRENT_TIMESTAMP(), -7, 'DAY'),
                    CURRENT_TIMESTAMP()))
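If you are on standard SQL rather than legacy SQL, the equivalent is a wildcard table with a _TABLE_SUFFIX filter; a sketch, assuming the daily tables use the same prefix with a YYYYMMDD suffix:

-- Standard SQL equivalent of the legacy TABLE_DATE_RANGE query above.
SELECT *
FROM `my_bq_dataset.myapplog_*`
WHERE _TABLE_SUFFIX BETWEEN
    FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
    AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())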