BigQuery: Transfer Wildcard Table from One Account to Another - google-bigquery

We are moving our BigQuery data from a QA to a production environment.
For that we have created a new Google account for the production environment.
How can we transfer wildcard table data from one Google account to the other?

You can use Google Groups to copy tables very quickly between projects/datasets belonging to different Google accounts.
Set up a Google group from the main Google account.
Invite the new Google account (as owner) to the Google group.
Accept the invitation from the new Google account's Gmail.
Share the original dataset using the shared Google group email. Under the dataset name, select the arrow down and pick Share dataset. Make sure to share with the group and not the user, and make the account an owner (or you cannot copy tables).
From the new Google account, create a new project and dataset in BQ. Then add the old project ID to the new Google account under Switch to project / Display project (under the arrow down under the dataset name). You can now see the old project/dataset and all its tables from the new Google account. From there you can copy any tables from the old project to the new project/Google account. Even very large tables copy within seconds.
Edit: I think you need to use the old UI for this to work, since the options do not seem to be available in the new one yet.
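For reference, the same cross-project copy can be done programmatically. Below is a minimal sketch using the google-cloud-bigquery Python client; the project and dataset names are hypothetical, and the account running it needs read access to the source dataset and write access to the destination (which is what the group sharing above provides). A wildcard table such as events_* is just a set of tables sharing a prefix, so copying every table with that prefix covers it:

from google.cloud import bigquery

# Hypothetical names; replace with your own projects and datasets.
SRC_DATASET = "old-qa-project.analytics"
DST_DATASET = "new-prod-project.analytics"
PREFIX = "events_"  # the tables matched by the wildcard events_*

client = bigquery.Client(project="new-prod-project")

for table in client.list_tables(SRC_DATASET):
    if table.table_id.startswith(PREFIX):
        job = client.copy_table(
            f"{SRC_DATASET}.{table.table_id}",
            f"{DST_DATASET}.{table.table_id}",
        )
        job.result()  # wait for this copy job to finish
        print(f"Copied {table.table_id}")

Note that copy jobs require the source and destination datasets to be in the same location.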

You can move the BigQuery data from your source account/project by exporting the BigQuery dataset to a GCS bucket and then importing the data into a new BigQuery dataset located in your destination account/project (a sketch of the export and load jobs follows these steps).
Export the data from your source BigQuery account/project to a regional or multi-region Cloud Storage bucket in the same location as your dataset.
Grant the required GCS and BigQuery permissions, using the IAM console, to the account that will load the data in the destination account/project.
Load your data into your destination BigQuery account/project using the data format selected during the export task.
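As a rough sketch of the export and load steps with the Python client (project, dataset, table, and bucket names are hypothetical; the extract runs under source-account credentials and the load under destination-account credentials that have read access to the bucket):

from google.cloud import bigquery

# Step 1: export from the source project.
src_client = bigquery.Client(project="qa-project")
extract_job = src_client.extract_table(
    "qa-project.analytics.events_20230101",
    "gs://shared-transfer-bucket/events_20230101-*.avro",
    job_config=bigquery.ExtractJobConfig(
        destination_format=bigquery.DestinationFormat.AVRO
    ),
)
extract_job.result()

# Step 3: load into the destination project.
dst_client = bigquery.Client(project="prod-project")
load_job = dst_client.load_table_from_uri(
    "gs://shared-transfer-bucket/events_20230101-*.avro",
    "prod-project.analytics.events_20230101",
    job_config=bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.AVRO),
)
load_job.result()

For a wildcard table, repeat the pair of jobs for each table that shares the prefix.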
There are no charges for exporting data from BigQuery to Cloud Storage and vice versa; nevertheless, you do incur charges for storing the data. I suggest you take a look at the Free Operations when using BigQuery to learn more about this.

Related

Connect different clients' GA4 & UA accounts to one BigQuery project

How do I connect various clients' Google Analytics accounts (GA4 & UA) to one instance of BigQuery? I want to store the analytics reports in BigQuery and then visualise them on a unified dashboard in Looker.
You can set up the exports from Google Analytics to go to the same BigQuery project, and transfer historical data to the same project as well.
Even if the data is spread across multiple GCP projects, you can still query it all from a single project. I would suggest you create a query that connects the data from the multiple sources together; you can then save it as a view and add it as a source in Looker, use it as a custom query in Looker, or, for best efficiency, save the results of your query as a new reporting table that feeds into Looker.
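A minimal sketch of such a cross-project view with the Python client (the projects, datasets, and GA4 export table names are hypothetical placeholders; the querying user needs read access to both source projects):

from google.cloud import bigquery

client = bigquery.Client(project="reporting-project")  # hypothetical central project

# A view that unions the GA4 export tables of two client projects.
view = bigquery.Table("reporting-project.reporting.all_clients_events")
view.view_query = """
    SELECT 'client_a' AS client, event_name, event_timestamp
    FROM `client-a-project.analytics_111111.events_*`
    UNION ALL
    SELECT 'client_b' AS client, event_name, event_timestamp
    FROM `client-b-project.analytics_222222.events_*`
"""
client.create_table(view, exists_ok=True)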

BigQuery Connected Sheets - Required user permissions?

I have a view that is connected to a Google Sheet via Connected Sheets.
I'm trying to let a user refresh the data by giving them access in GCP.
I've tried giving access at the project, dataset, and view levels, but every time they get the error: "Query failed, no access to the connected BigQuery table".
I'm giving them the bigquery.user and bigquery.dataViewer roles.
What could be causing this?
Please make sure the user you wish to give access to the BigQuery data in Google Sheets has:
An Enterprise Plus or G Suite Enterprise for Education account
Access to BigQuery
A project with billing setup in BigQuery
A BigQuery Job Creator role on the selected billing project
BigQuery Data Viewer role on the datasets containing the selected table
According to the documentation:
If you share a sheet with someone who doesn't meet the criteria above, they'll be able to see analysis created with Connected Sheets and perform regular Sheets operations, but they won't be able to refresh it or create their own connected sheet.
Additionally, have a look at this other SO thread and the Using Connected Sheets documentation.
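If the dataset-level BigQuery Data Viewer grant is the missing piece, a minimal sketch with the Python client looks like this (project, dataset, and email are hypothetical; the Job Creator role on the billing project still has to be granted separately through the IAM console):

from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# Append a new reader entry to the dataset's access list.
dataset = client.get_dataset("my-project.reporting")
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",  # dataset-level equivalent of BigQuery Data Viewer
        entity_type="userByEmail",
        entity_id="analyst@example.com",  # hypothetical user
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])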

How to refresh google drive data source - Google Big Query

I have a question regarding refreshing a BigQuery table whose data source is Google Drive.
Imagine you have a CSV file on Google Drive and someone updates it for you every day.
1. The filename does not change
2. The location URI is the same
How can I refresh my BigQuery table using this Google Drive file?
Could you please guide me or send me related links?
Thanks
From the BigQuery docs:
Loading data into BigQuery from Google Drive is not currently supported, but you can query data in Google Drive using an external table.
The link above provides instructions on how to create an external table that references your stored-in-Drive data source. Since you want to query data from a Google Drive file that you will keep updating in Drive, this is the solution you are looking for (in contrast to downloading your CSV locally and loading it into BQ, in which case you would then have to update it directly in BQ).
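A minimal sketch of creating such an external table with the Python client, assuming a hypothetical Drive file ID and table name (the client's credentials must include the Google Drive OAuth scope in addition to the BigQuery one):

from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# External table definition pointing at the Drive-hosted CSV.
external_config = bigquery.ExternalConfig("CSV")
external_config.source_uris = ["https://drive.google.com/open?id=YOUR_FILE_ID"]
external_config.autodetect = True
external_config.options.skip_leading_rows = 1  # skip the CSV header row

table = bigquery.Table("my-project.my_dataset.drive_csv")
table.external_data_configuration = external_config
client.create_table(table, exists_ok=True)

Because the table only references the file, every query reads the current Drive contents, so there is nothing to refresh.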

No schema auto-detect when querying external tables in BigQuery as new data arrives

This is the current situation:
I've created an external table in BigQuery against JSON files in Cloud Storage.
I'm testing how it works with regard to schema auto-detection.
When I created the table, there were 2 JSON files with different schemas, and BigQuery handled them well.
When I load a new file with a new schema (adding a new attribute to a record field), BigQuery recognizes the new record, but the new field doesn't appear. So schema auto-detection doesn't work as I expected.
How can I get schema auto-detection when new files arrive in my Cloud Storage folder?
Any help?
Culprit: AFAIK, schema auto-detection happens when you create a table; the schema is not updated as you add new files.
Possible solution:
Re-create the tables when new files arrive.
Straightforward implementation:
Add a Pub/Sub notification on GCS for newly arriving files, and have a Google Cloud Function, triggered by it, that re-creates the table (a sketch follows).
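A sketch of such a function (the table ID and source URIs are hypothetical; deploy it with a google.storage.object.finalize trigger on the bucket so it fires on each new file):

from google.cloud import bigquery

TABLE_ID = "my-project.my_dataset.events_external"  # hypothetical
SOURCE_URIS = ["gs://my-bucket/events/*.json"]      # hypothetical

def recreate_external_table(event, context):
    """Drops and re-creates the external table so auto-detection runs again."""
    client = bigquery.Client()
    client.delete_table(TABLE_ID, not_found_ok=True)

    external_config = bigquery.ExternalConfig("NEWLINE_DELIMITED_JSON")
    external_config.source_uris = SOURCE_URIS
    external_config.autodetect = True  # re-detect the schema over all files

    table = bigquery.Table(TABLE_ID)
    table.external_data_configuration = external_config
    client.create_table(table)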

Error processing action bigquery.publish(0): BigQuery error: Dataset is not shared as editable

I am working on a new IoT project. I have a Telit device that communicates with an M2M cloud platform. I created a new project on Google Cloud and enabled the BigQuery API. I created a new dataset, but I did not create a table, because I understand that it will be created when the first data is sent. I created a trigger on the M2M cloud to send data when a condition is true. I have shared the dataset with my application using the email generated by Google Cloud. I don't know if and where I should put this email address on the M2M cloud.
These are the dataset share permissions.
Can you help me?
Rodrigo Rocha
You will have to make sure that the email you share the dataset with is set to Can Edit, so that it can write tables.
Once that is done, to accomplish what you want to do, you'll have to run "append" jobs with a CREATE_IF_NEEDED create disposition and a WRITE_APPEND write disposition (WRITE_EMPTY would only succeed while the table is still empty). For more info, check out this link:
https://cloud.google.com/bigquery/docs/reference/v2/jobs
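For reference, this is roughly what such a job looks like with today's Python client; the v2 jobs API linked above takes the same dispositions, and the project, bucket, and table names here are hypothetical:

from google.cloud import bigquery

client = bigquery.Client(project="my-iot-project")

job_config = bigquery.LoadJobConfig(
    create_disposition=bigquery.CreateDisposition.CREATE_IF_NEEDED,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # let BigQuery infer the schema on the first load
)
load_job = client.load_table_from_uri(
    "gs://my-iot-bucket/telemetry/*.json",
    "my-iot-project.iot_dataset.telemetry",
    job_config=job_config,
)
load_job.result()  # the first run creates the table, later runs append to it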