How can I create a BigQuery partitioned table in the BigQuery web UI? I know we can create one from gcloud and the API.
I searched the web UI but couldn't find any option for creating a partitioned table.
Thanks,
We added support for this recently and expect the UI to show it once the changes are deployed. I will provide an update once that is done. Thanks!
I have a Firebase application which is uploading events with parameters. I need to be able to view those events in order to debug some issues we're having in production. I can only see the tables which are generated nightly in BigQuery. I can find references online saying that BigQuery allows viewing real time data. What I can't find is any straightforward instructions on how to create those views.
Is it possible? If so, can someone give me instructions that even a complete newb could follow?
We have decided to use the BigQuery APIs for information we want to see immediately in the database.
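For anyone else hitting this: when the Firebase/Analytics streaming export is enabled, events typically land in a same-day `events_intraday_YYYYMMDD` table alongside the nightly `events_YYYYMMDD` tables. A minimal sketch of building a query against it (the project and dataset names are placeholders, and you should confirm the intraday table name matches your own export):

```python
from datetime import date

def intraday_query(project: str, dataset: str, day: date) -> str:
    """Build a query against the streaming-export intraday table for a day.

    The events_intraday_YYYYMMDD naming is the Analytics/Firebase export
    convention; verify it against your dataset before relying on it.
    """
    table = f"`{project}.{dataset}.events_intraday_{day.strftime('%Y%m%d')}`"
    return (
        "SELECT event_name, event_timestamp, user_pseudo_id\n"
        f"FROM {table}\n"
        "ORDER BY event_timestamp DESC\n"
        "LIMIT 100"
    )

# Example (project/dataset ids are hypothetical):
sql = intraday_query("my-project", "analytics_123456", date(2020, 1, 15))
```

Running that query in the console (or via the API) shows today's events without waiting for the nightly table.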
I am trying to see if I can set up a trigger system, so that whenever a new row of data is inserted into tables A, B, or C, it populates new rows into a new table I created (table D, for example).
I'm using BigQuery. Does this platform support this capability?
I'm not sure what kind of SQL this would need (INSERT INTO, etc.).
Maybe late to the party, but this is possible now. https://cloud.google.com/blog/topics/developers-practitioners/how-trigger-cloud-run-actions-bigquery-events
Triggers are not supported on BigQuery, basically because they are not aligned to the intended use patterns of the product. You may also refer to this existing question.
There is a Feature Request in place for an interesting approach to trigger an action when rows are loaded to BigQuery, but currently there is no ETA for it.
You may also want to consider Cloud Composer as an alternative: instead of using triggers, you can orchestrate your data ingestion tasks.
BigQuery is a data warehouse, and there is no trigger support.
Have a look at https://cloud.google.com/sql/ ; maybe it will help you.
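Since BigQuery has no triggers, one common workaround (an assumption about your pipeline, not the only option) is a scheduled query that periodically appends any new rows from A, B, and C into D. A sketch of building such a query plus a scheduled-query config for the BigQuery Data Transfer Service; the table names, the `ts` watermark column, and the schedule are all placeholders:

```python
def build_append_query(sources, dest):
    """UNION the source tables and insert only rows newer than dest's max.

    Assumes every table shares a schema and has a `ts` TIMESTAMP column
    usable as a high-water mark.
    """
    union = "\nUNION ALL\n".join(f"SELECT * FROM `{t}`" for t in sources)
    return (
        f"INSERT INTO `{dest}`\n"
        f"SELECT * FROM ({union})\n"
        f"WHERE ts > (SELECT IFNULL(MAX(ts), TIMESTAMP('1970-01-01')) "
        f"FROM `{dest}`)"
    )

query = build_append_query(
    ["proj.ds.A", "proj.ds.B", "proj.ds.C"], "proj.ds.D"
)

# Config body for creating a scheduled query via the Data Transfer API
# (projects.locations.transferConfigs.create); scheduled queries use the
# "scheduled_query" data source id.
transfer_config = {
    "displayName": "append-to-D",
    "dataSourceId": "scheduled_query",
    "schedule": "every 15 minutes",
    "params": {"query": query},
}
```

This is polling rather than a true trigger, so rows show up in D with up to one schedule interval of delay.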
We want to perform a test on BigQuery with scheduled queries.
The test retrieves a table from a dataset and, basically, copies it into another dataset (which we own) in another project. So far we've managed to do that with a script we wrote in R against the BigQuery API on a Google Compute Engine instance, but we want/need to do it with scheduled queries in BigQuery.
If I just compose a query that retrieves the initial table's data and try to schedule it, I see there's a project selector, but it's disabled, so it seems I'm tied to the project of the user I'm logged in with.
Is this doable or am I overdoing it and using the API is the only option to do this?
The current scheduler logic doesn't allow this, and for that reason the project drop-down is disabled in the web UI.
As an example, I tried setting up this scheduled job:
CREATE TABLE IF NOT EXISTS `projectId.partitionTables.tableName` (Field0 TIMESTAMP) --AS SELECT * FROM mydataset.myothertable
And this is the error returned by the transfer API.
You will need to ask the BigQuery team to add this option to a future version of the scheduler API.
According to https://cloud.google.com/bigquery/docs/creating-partitioned-tables#converting_dated_tables_into_a_partitioned_table you can specify date partitioning options for BQ tables.
Specify the partitioning configuration in the Tables::insert request
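Concretely, the table resource passed to tables.insert can carry a timePartitioning block. A sketch of the request body (the project, dataset, and table ids are placeholders); the same JSON payload works from any REST client:

```python
import json

# Body for a tables.insert call creating a day-partitioned table.
# expirationMs is optional; omit it to keep partitions forever.
table_body = {
    "tableReference": {
        "projectId": "my-project",       # placeholder
        "datasetId": "mydataset",        # placeholder
        "tableId": "events_partitioned",
    },
    "timePartitioning": {
        "type": "DAY",
        "expirationMs": str(90 * 24 * 60 * 60 * 1000),  # keep 90 days
    },
    "schema": {
        "fields": [
            {"name": "event_time", "type": "TIMESTAMP"},
            {"name": "payload", "type": "STRING"},
        ]
    },
}

payload = json.dumps(table_body)
```

POSTing that payload to the tables.insert endpoint creates the table with day-based partitioning enabled.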
Since much of our ETL pipeline uses load jobs with create_disposition='CREATE_IF_NEEDED', I was wondering whether there is a way to specify the table partitioning scheme in the load job configuration.
Input appreciated.
Thanks
Right now partitioned tables have to be created before a load job. We're working on support for creating partitioned tables within a load/query job.
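Until that lands, the workaround is to create the partitioned table first and point the load job at it, optionally targeting a specific partition with the `$YYYYMMDD` decorator. A sketch of building the jobs.insert load configuration body (the bucket, dataset, and table names are placeholders):

```python
def load_job_config(project, dataset, table, gcs_uri, partition=None):
    """Build a jobs.insert load configuration body.

    If `partition` (YYYYMMDD) is given, target that partition via the
    `$` decorator; the partitioned table itself must already exist.
    """
    table_id = f"{table}${partition}" if partition else table
    return {
        "configuration": {
            "load": {
                "sourceUris": [gcs_uri],
                "destinationTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": table_id,
                },
                "writeDisposition": "WRITE_APPEND",
                # CREATE_IF_NEEDED would not apply a partitioning
                # scheme, hence creating the table up front.
                "createDisposition": "CREATE_NEVER",
            }
        }
    }

cfg = load_job_config("my-project", "mydataset", "events",
                      "gs://my-bucket/events.json", partition="20200115")
```

With the table created ahead of time, the load job simply appends into the named partition.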
Very keen on this too! The Spark Connector I am using does it via load jobs
I'm currently creating sharded tables daily using the Python API.
Is there a way to add comments to the table's columns when creating the table? I couldn't see it in the docs, but it might still be implemented.
Thanks in advance.
Per the BigQuery REST API documentation, tables.insert takes a table resource, and in the schema.fields[].description property you can put a description for each column.
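Shown as the raw table resource (the same structure the Python client sends under the hood), with the project/dataset/table ids and column descriptions as placeholder examples:

```python
# Table resource for tables.insert, with per-column comments carried in
# schema.fields[].description (all ids below are hypothetical).
table_resource = {
    "tableReference": {
        "projectId": "my-project",
        "datasetId": "mydataset",
        "tableId": "daily_20200115",
    },
    "schema": {
        "fields": [
            {
                "name": "user_id",
                "type": "STRING",
                "description": "Stable pseudonymous user identifier",
            },
            {
                "name": "revenue",
                "type": "FLOAT",
                "description": "Revenue in USD for the day",
            },
        ]
    },
}

# Collect the descriptions back out, e.g. for a sanity check.
descriptions = {f["name"]: f["description"]
                for f in table_resource["schema"]["fields"]}
```

Once the table is created from this resource, the descriptions show up alongside the columns in the web UI's schema view.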