Execute Transfer in Google Bigquery - PERMISSION_DENIED: No OAuth token with Google Drive scope was found - google-bigquery

I am trying the new 'Transfers' function in google BigQuery.
I am using the option: 'Scheduled Query'
It works with a simple query, but it fails with another query that normally works fine. That query is based on a view, which in turn joins two tables (one of which is backed by a Google Sheet shared with me). None of the more complicated Transfers I created are working.
I get the following error message:
Failed to start job for table 'xxx' with error PERMISSION_DENIED: Access Denied: BigQuery BigQuery: No OAuth token with Google Drive scope was found.
Is it because one of the source tables is based on a Google Sheet?
I tried to copy the source table to another table, but when I do this, BigQuery automatically deletes that table.
Any ideas?

The problem is with the view, which queries Google Drive data. To resolve it, you need to request the Google Drive scope. Quoting directly from the documentation:
Accessing data hosted within Google Drive requires an additional OAuth scope, both when defining the federated source as well as during query execution.
The documentation page linked above also shows how to do this via the command line, the API, and the web UI.
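For example, when running the query from Python yourself, a minimal sketch along these lines (assuming application default credentials and the google-cloud-bigquery client; the view name is a placeholder) requests both scopes explicitly:

import google.auth
from google.cloud import bigquery

# Request the BigQuery scope plus the Drive scope, because the view reads a Sheets-backed table.
credentials, project = google.auth.default(
    scopes=[
        "https://www.googleapis.com/auth/bigquery",
        "https://www.googleapis.com/auth/drive",
    ]
)
client = bigquery.Client(credentials=credentials, project=project)

# With the extra scope, querying the Drive-backed view no longer raises PERMISSION_DENIED.
rows = client.query("SELECT * FROM `my_dataset.my_drive_view`").result()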

Related

BigQuery Connected Sheets - Required user permissions?

I have a view that is connected to a Google Sheet via Connected Sheets.
I'm trying to let a user refresh the data by giving them access in GCP.
I've tried giving access at the project, dataset and view levels. But every time they get the error: "Query failed, no access to the connected BigQuery table"
I'm granting the bigquery.user and bigquery.dataViewer roles.
What could be causing this?
Please make sure that the user to whom you want to give access to the BigQuery data in Google Sheets has:
An Enterprise Plus or G Suite Enterprise for Education account
Access to BigQuery
A project with billing setup in BigQuery
A BigQuery Job Creator role on the selected billing project
BigQuery Data Viewer role on the datasets containing the selected table
According to the documentation:
If you share a sheet with someone who doesn't meet the criteria above, they'll be able to see analysis created with Connected Sheets and perform regular Sheets operations, but they won't be able to refresh it or create their own connected sheet.
Additionally, have a look at this related SO thread and the Using Connected Sheets documentation; a sketch of the dataset-level grant follows below.
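As a rough sketch of that grant with the google-cloud-bigquery Python client (the project ID, dataset, and e-mail address are placeholders, and the dataset-level READER entry corresponds to BigQuery Data Viewer):

from google.cloud import bigquery

client = bigquery.Client()
dataset = client.get_dataset("my-project.my_dataset")  # placeholder dataset ID

entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",  # dataset-level equivalent of BigQuery Data Viewer
        entity_type="userByEmail",
        entity_id="user@example.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])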

Tableau Bigquery access issue with Google Sheet federated table

I have a View (Table A) in BigQuery which was created from a Google Sheet. It updates live, which is perfect.
I have then connected that View to another View (Table B) in BigQuery. Let's call this combined View Table C.
In Tableau Desktop I try to connect to Table C, but it comes up with an authentication issue because Tableau cannot pass on authentication to Google Sheets.
Has anyone found a solution or workaround, for example using service accounts, Cloud Functions, or a scheduled query that saves the results of Table A as a table every time the Google Sheet is saved? (A sketch of that last option follows the links below.)
This has been asked before in the following links, but it hasn't received a step-by-step solution, and I do not have enough Stack Overflow reputation to comment:
BigQuery Credential Problems when Accessing Google Sheets Federated Table
https://community.tableau.com/thread/207871
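For reference, a minimal sketch of the scheduled-materialization workaround described above (snapshotting Table A into a native table), assuming credentials that include the Drive scope; all table names are placeholders:

import google.auth
from google.cloud import bigquery

# The Drive scope is needed because Table A is backed by a Google Sheet.
credentials, project = google.auth.default(
    scopes=[
        "https://www.googleapis.com/auth/bigquery",
        "https://www.googleapis.com/auth/drive",
    ]
)
client = bigquery.Client(credentials=credentials, project=project)

# Materialize the Sheets-backed View into a native table that Tableau can read
# without needing to pass Google Sheets authentication through.
job_config = bigquery.QueryJobConfig(
    destination="my-project.my_dataset.table_a_snapshot",
    write_disposition="WRITE_TRUNCATE",
)
client.query(
    "SELECT * FROM `my-project.my_dataset.table_a`", job_config=job_config
).result()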

GBQexception: How to read data with BigQuery that is stored in a Google Drive spreadsheet

I uploaded a dataset to BigQuery via the Google Drive option, linking the Google spreadsheet to a dataset which I call 'dim_table'.
I then created a query to pull data from that dim_table dataset, which I run daily.
I am trying to create an automated script that will run the same query code I created to get the dim_table data and create a new dataset called chart_A.
When I run this simple code:
import pandas_gbq as gbq
gbq.read_gbq("Select * from data.dim_stats",'ProjectID')
I get an error:
GenericGBQException: Reason: 403 Access Denied: BigQuery BigQuery: No
OAuth token with Google Drive scope was found.
I have been trying to read the pandas-gbq documentation but could not find anything that explains how to authenticate Google Drive with pandas-gbq or use OAuth. Any help is appreciated! :)
Let me know if you need me to come up with a sample table online for testing.
Best
I haven't used pandas-gbq, but the authentication methods for BigQuery are described here [1].
Create a service account with a BigQuery role that has access to your datasets [2].
Create and download the service account's JSON key [3].
Set the private_key parameter to the file path of the JSON key, or to a string containing the JSON contents.
A related guide for querying Google Drive data without pandas-gbq is here [4].
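As a hedged sketch along those lines (newer pandas-gbq releases also accept a credentials argument built from that JSON key; the key path, query, and project ID below are placeholders):

import pandas_gbq
from google.oauth2 import service_account

# The Google Sheet behind dim_stats must be shared with the service account's e-mail address.
credentials = service_account.Credentials.from_service_account_file(
    "service-account-key.json",  # placeholder path to the downloaded JSON key
    scopes=[
        "https://www.googleapis.com/auth/bigquery",
        "https://www.googleapis.com/auth/drive",  # required for the Drive-backed table
    ],
)

df = pandas_gbq.read_gbq(
    "SELECT * FROM data.dim_stats",
    project_id="ProjectID",
    credentials=credentials,
)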

How to refresh google drive data source - Google Big Query

I have a question about refreshing a Google BigQuery table whose data source is Google Drive.
Imagine you have a CSV file on Google Drive that someone updates for you every day.
1. The filename does not change
2. The location URI stays the same
How can I refresh my BigQuery table using this Google Drive file?
Could you please guide me or send me related links?
Thanks
From the BigQuery docs:
Loading data into BigQuery from Google Drive is not currently supported, but you can query data in Google Drive using an external table.
The link above provides instructions on how to create an external table that references your Drive-hosted data source. Since you want to query data from a Google Drive file that you keep updating in Drive, this is the solution you are looking for (as opposed to downloading the CSV locally and loading it into BigQuery, in which case you would have to update the table in BigQuery yourself).
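A minimal sketch of defining such an external table with the google-cloud-bigquery Python client (the Drive file ID and table name are placeholders, and the client's credentials must include the Drive scope):

from google.cloud import bigquery

client = bigquery.Client()  # credentials must carry the Google Drive scope

external_config = bigquery.ExternalConfig("CSV")
external_config.source_uris = ["https://drive.google.com/open?id=FILE_ID"]
external_config.autodetect = True              # infer the schema from the CSV
external_config.options.skip_leading_rows = 1  # skip the header row

table = bigquery.Table("my-project.my_dataset.drive_csv")
table.external_data_configuration = external_config
client.create_table(table)

Because the table is external, every query against it reads the current contents of the Drive file, so the daily updates show up without any reload step.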

How do I connect a BigQuery database based on a Google Sheet to Looker?

I'm attempting to connect BigQuery to Looker. I am pulling sample data from a Google Sheets document to a BigQuery dataset; this part is working fine, as my internal BigQuery queries are running just fine for this dataset. Using this documentation from the Looker forums, I tried to create a service account key to connect my BigQuery dataset to Looker. Unfortunately, the documentation is slightly out of date: Google now asks which service account (compute engine default service account, app engine default service account, or a new service account that can have any of multiple roles) you want to attach the key to.
Thus far, I have tried using P12 keys created for the compute engine default service account, the app engine default service account, as well as a new Project Owner service account. When I create the connection in Looker, the admin page confirms that the connection "can connect, can cancel queries, can run simple select query" (I need it to do more complex things, but am just trying to connect at all right now). Using the SQL Runner to test a simple select 10 query out, I was able to query the public datasets, e.g. hacker_news or usa_names. However, whenever I tried to run the same query on my personal sample dataset, I received this error:
Failed to retrieve data - The job encountered an internal error during execution and was unable to complete successfully.
The permissions for the base Google Sheet that the BigQuery project is pulling from are set to be viewable by my coworkers who have the link. I have also been adding each service account I test as an editor (which I assume has the highest permissions). At this point, I am creating new service accounts with each of the different possible roles to see if it's a permissions issue from the role perspective. Nothing has worked so far, so any insight would be helpful!
UPDATE: I have created a new table within the same BigQuery dataset. The new table was created using a CSV file, which was simply a download of my previous table in Google Sheets. I updated the connection to Looker. When I wrote a select 10 query pulling from the new table, it worked fine and ran very quickly. This seems to imply that the problem is something about the permissions between Google Sheets and Google BigQuery.
I've been wanting to do something like this myself for a bit, saw this question, and decided to dig in.
The first thing I found was this "documentation" over in the Looker Discourse:
https://discourse.looker.com/t/live-spreadsheets-in-databases/2698/7
In there, it describes the steps necessary to get this working.
Two important things that you are probably missing, based on your description of events so far (since it sounds like you've already attached the sheet to your dataset and are able to query it from the BigQuery UI):
Make sure you share the Google Sheet with the service account you are using to connect Looker to BigQuery. This is the Username from the Connections tab of the Admin page in Looker. (A scripted way to do the sharing is sketched at the end of this answer.)
Make sure you have enabled the Drive and Sheets APIs for your google project. You can do that via The API Library. Just search for "Drive" (or "Sheets"), click on the name, and then click on the "Enable" button from the API detail page.
Once I did the above, I had to wait a few minutes before things started working. I'll go out on a limb and guess that this was because Looker needed to cycle its internal connection pool before the new permissions would take effect. So you may need to run a few failing queries, or wait out the connection pool, before this goes into effect.
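If you would rather script the sharing step than use the Sheet's Share button, here is a hedged sketch with the Drive API (assuming credentials for an account that is allowed to share the Sheet; the file ID and service-account e-mail are placeholders):

import google.auth
from googleapiclient.discovery import build

# The credentials must belong to an account that can share the Sheet (e.g. its owner).
credentials, _ = google.auth.default(scopes=["https://www.googleapis.com/auth/drive"])
drive = build("drive", "v3", credentials=credentials)

# Grant the Looker service account (the Username on Looker's Connections tab) read access.
drive.permissions().create(
    fileId="SHEET_FILE_ID",
    body={
        "type": "user",
        "role": "reader",
        "emailAddress": "looker-sa@my-project.iam.gserviceaccount.com",
    },
    fields="id",
).execute()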
Hope that helps.