I have created a table in BigQuery and connected to BigQuery in Power BI Desktop. I authenticate successfully and can see my project and dataset structure. However, the table that should be visible under cpi_quotes is not there. Tables relating to public datasets are visible, so this is specific to my data, which I can query within BigQuery itself just fine. Is there a permission I need to enable within BigQuery to make this table visible to Power BI?
How do I connect various different clients' Google Analytics properties (GA4 & UA) to one instance of BigQuery? I want to store the analytics reports in BigQuery and then visualise them on a unified dashboard in Looker.
You can set up the exports from Google Analytics to go to the same BigQuery project and transfer historical data to the same project as well.
Even if the data is spread across multiple GCP projects, you can still query it all from a single project. I would suggest you create a query that joins the data from the multiple sources together. You can then save it as a view and add it as a source in Looker, use it as a custom query in Looker, or, for best efficiency, save the results of your query as a new reporting table that feeds into Looker.
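As a minimal sketch of the cross-project query described above: BigQuery lets you reference tables in other projects with fully-qualified `project.dataset.table` names, so a single view can union the GA exports of several clients. All project, dataset, and table names below are placeholder assumptions.

```python
# Sketch: compose a query that unions GA export tables from several
# hypothetical GCP projects into one result that a view (and Looker)
# can consume. Every name here is a placeholder assumption.
SOURCES = [
    "client-a-project.analytics_123.events_20240101",
    "client-b-project.analytics_456.events_20240101",
]

def build_union_query(sources):
    """Build a UNION ALL query over fully-qualified BigQuery tables."""
    parts = [
        f"SELECT event_name, event_timestamp FROM `{table}`"
        for table in sources
    ]
    return "\nUNION ALL\n".join(parts)

sql = build_union_query(SOURCES)
# Save this SQL as a view (e.g. via `bq mk --view`) and add the view
# as a source in Looker.
```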
I have a view that is connected to a google sheet via connected sheets.
I'm trying to let a user refresh the data by giving them access in GCP.
I've tried giving access at the project, dataset and view levels. But every time they get the error: "Query failed, no access to the connected BigQuery table"
I'm granting the roles bigquery.user and bigquery.dataViewer.
What could be causing this?
Please make sure that the user you wish to give access to the BigQuery data in Google Sheets has:
An Enterprise Plus or G Suite Enterprise for Education account
Access to BigQuery
A project with billing setup in BigQuery
A BigQuery Job Creator role on the selected billing project
BigQuery Data Viewer role on the datasets containing the selected table
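The two IAM grants in the checklist can also be made from the command line. The sketch below composes the gcloud invocations; the project id and email are placeholder assumptions, and it assumes the "Job Creator" role in the checklist maps to roles/bigquery.jobUser.

```python
# Sketch: compose the IAM grants from the checklist above. The project id
# and user email are placeholder assumptions; roles/bigquery.jobUser is
# assumed to be the role behind "BigQuery Job Creator".
project = "billing-project"
user = "analyst@example.com"

job_creator = (
    f"gcloud projects add-iam-policy-binding {project} "
    f"--member=user:{user} --role=roles/bigquery.jobUser"
)
data_viewer = (
    f"gcloud projects add-iam-policy-binding {project} "
    f"--member=user:{user} --role=roles/bigquery.dataViewer"
)
print(job_creator)
print(data_viewer)
```

Note that bigquery.dataViewer can also be granted at the dataset level instead of project-wide, which is usually the tighter choice.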
According to the documentation:
If you share a sheet with someone who doesn't meet the criteria above, they'll be able to see analysis created with Connected Sheets and perform regular Sheets operations, but they won't be able to refresh it or create their own connected sheet.
Additionally, have a look at this other SO thread and the Using Connected Sheets documentation.
I have created an instance on Google Bigtable and an external table in Google BigQuery that is able to fetch records from Bigtable. I tried to create a BigQuery data source in Power BI Desktop, but the external tables are not listed, while the native tables created in BigQuery are listed. Is this an issue, or are there any possible solutions?
I replicated your scenario and got the same behavior as the one you described. It seems that at the moment Power BI does not support BigTable external tables. A workaround is to create one or more views based on the BigTable table and then connect to those views.
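The workaround can be sketched as a single piece of DDL that wraps the external table in a view Power BI can see. The project, dataset, and table names below are placeholder assumptions.

```python
# Sketch of the workaround: wrap the Bigtable external table in a native
# BigQuery view so Power BI can list it. All names are placeholders.
external_table = "my-project.my_dataset.bigtable_external"
view_name = "my-project.my_dataset.bigtable_view"

ddl = (
    f"CREATE OR REPLACE VIEW `{view_name}` AS\n"
    f"SELECT * FROM `{external_table}`"
)
# Run this DDL in the BigQuery console or with
# `bq query --use_legacy_sql=false`.
print(ddl)
```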
In addition, if you're interested you can report the Power BI behavior here.
I need to understand the below:
1.) How does one BigQuery table connect to another BigQuery table, apply some logic, and create a third? For example, if I have an ETL tool like DataStage and some data has been uploaded for us to consume as a BigQuery table, how do I design the job in DataStage (or any other technology) so that the source is one BQ table and the target is another BQ table?
2.) My input will be a BigQuery VIEW; I need to run some logic on that view and then load the results into another BigQuery view.
3.) What technology is used to connect one BigQuery to another? Is it HTTPS or something else?
Thanks
If you have a large amount of data to process (many GB), you should do the transformation of the data directly in the BigQuery database. It would be very slow to extract all the data, run it through something locally, and send it back. You don't need any outside technology to make one view depend on another view, besides access to the relevant data.
The ideal job design is an SQL query that BigQuery can process. If you are trying to link tables/views across different projects, then the source BQ table must be listed in the fully-specified form projectName.datasetName.tableName in the FROM clauses of the SQL query. Project names are globally unique in Google Cloud.
Permissions to access the data must be set up correctly. BQ provides fine-grained control over who can access, and it is in the BQ documentation. You can also enable public access to all BQ users if that is appropriate.
Once you have that SQL query, you can create a new view by sending your SQL to Google BigQuery through the command line (the bq tool), the web console, or an API.
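The command-line route can be sketched as follows; the view name and SELECT statement are placeholder assumptions.

```python
import shlex

# Sketch: the `bq` CLI invocation that would save a cross-project query
# as a view. `reporting.my_view` and the SELECT are placeholder
# assumptions; --use_legacy_sql=false selects standard SQL.
sql = "SELECT * FROM `other-project.source_dataset.source_table`"
cmd = [
    "bq", "mk",
    "--use_legacy_sql=false",
    f"--view={sql}",
    "reporting.my_view",
]
print(shlex.join(cmd))
```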
1) You can use the BigQuery Connector in DataStage to read from and write to BigQuery.
2) BigQuery uses namespaces in the format project.dataset.table to access tables across projects. This allows you to manipulate your data in GCP as if it were in the same database.
To manipulate your data you can use DML or standard SQL.
To execute your queries you can use the GCP web console or client libraries such as the Python or Java ones.
3) BigQuery is a RESTful web service and uses HTTPS.
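To make that concrete: a query job is submitted by POSTing JSON over HTTPS to the REST API's jobs.query endpoint. The project id below is a placeholder assumption.

```python
from urllib.parse import quote

# Sketch: the HTTPS endpoint behind a BigQuery query. The project id is
# a placeholder assumption.
project = "my-project"
endpoint = (
    "https://bigquery.googleapis.com/bigquery/v2/"
    f"projects/{quote(project)}/queries"
)
payload = {
    "query": "SELECT 1",
    "useLegacySql": False,
}
# An authenticated client (or any BigQuery client library under the
# hood) POSTs `payload` as JSON to `endpoint` with an OAuth 2.0 bearer
# token in the Authorization header.
print(endpoint)
```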
We are moving our BigQuery data from QA to production environment.
For that we have created a new Google account for the production environment.
How can we transfer wildcard table data from one Google account to another?
You can use Google Groups to copy tables very quickly between projects/datasets under different Google accounts.
Set up a Google Group from the main Google account.
Invite the new Google account (as owner) to the Google Group.
Accept the invitation from the new Google account's Gmail.
Share the original dataset using the shared Google Group email. Under the dataset name, select the arrow down and pick Share dataset. Make sure to share as a group and not a user, and make the account an owner (or you cannot copy tables).
From the new Google account, create a new project and dataset in BQ. Then add the old project id to the new Google account under Switch to project / Display project (under the arrow down under the dataset name). You can now see the old project/dataset and all its tables from the new Google account. From there you can copy any tables from the old project to the new project/Google account, even very large tables, within seconds.
Edit: I think you need to use the old UI for this to work, since the options do not seem to be available in the new one yet.
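Once the dataset is visible from the new account, the tables behind a wildcard can also be copied from the command line with `bq cp`. Since `bq cp` takes concrete table names, one command is composed per matched table; all project and table names below are placeholder assumptions.

```python
# Sketch: compose one `bq cp` command per table behind a wildcard prefix
# (e.g. events_*). Project, dataset, and table names are placeholders.
tables = ["events_20240101", "events_20240102"]

def copy_commands(tables,
                  src="old-project:analytics",
                  dst="new-project:analytics"):
    """One `bq cp` command per concrete table name."""
    return [f"bq cp {src}.{t} {dst}.{t}" for t in tables]

for cmd in copy_commands(tables):
    print(cmd)
```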
You can move the BigQuery data from your source account/project by exporting the BigQuery dataset to a GCS bucket and then importing the data into the new BigQuery dataset located in your destination account/project.
Export the data from your source BigQuery account/project to a regional or multi-region Cloud Storage bucket in the same location as your dataset.
Grant the required GCS and BigQuery permissions to the account that will be used to load the data in the destination account/project by using the IAM console.
Load your data into your destination BigQuery account/project based on the data format selected during the export task.
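The export/load round trip above can be sketched with the bq CLI. The bucket, project, dataset, and table names are placeholder assumptions; the wildcard URI lets a large table shard across multiple files.

```python
# Sketch of the GCS round trip with the bq CLI. All names are
# placeholder assumptions; Avro is one of the supported formats.
src_table = "src-project:src_dataset.my_table"
dst_table = "dst-project:dst_dataset.my_table"
bucket_uri = "gs://my-transfer-bucket/my_table/export-*.avro"

# Step 1: export from the source project to the bucket.
extract_cmd = f"bq extract --destination_format=AVRO {src_table} {bucket_uri}"
# Step 2 (after granting IAM access): load into the destination project.
load_cmd = f"bq load --source_format=AVRO {dst_table} {bucket_uri}"

print(extract_cmd)
print(load_cmd)
```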
There are no charges for exporting data from BigQuery to Cloud Storage and vice versa; nevertheless, you do incur charges for storing the data. I suggest you take a look at the Free Operations when using BigQuery to learn more about this.