I just started a BigQuery trial account and created a sample table using a Google Drive URI as the source, with CSV as the file format.
When I click my uploaded table, it shows table information such as the schema and details.
I remember there was a Preview tab, but I can't find it now.
Is there any way to activate the table Preview tab?
When a table is created from a file in Google Drive, the "Preview" tab is not available by default, because the table is treated as an external table. I verified this behavior by creating a table with Google Drive as the source.
What I would suggest is to load your data from Google Drive into BigQuery, so that the "Preview" tab becomes available. Loading the data into BigQuery also makes queries faster compared to querying the table externally.
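One way to load the Drive data into a native table is to materialize the external table with a `CREATE TABLE ... AS SELECT` statement; the dataset and table names below are placeholders for your own:

```sql
-- Materialize the Drive-backed external table into a native BigQuery table.
-- `mydataset.drive_external_table` is a placeholder for your external table.
CREATE TABLE mydataset.native_table AS
SELECT *
FROM mydataset.drive_external_table;
```

The resulting native table supports the "Preview" tab and avoids re-reading the Drive file on every query.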
When accessing the google_patents_research.publications table in BigQuery, I can filter results by the top_terms column. This column uses Google's machine learning algorithm to search the text of a patent and extract the words it finds most meaningful.
I was wondering if there is a similar search using BigQuery that can be done on PubMed's database.
According to NCBI docs there is a dataset called nih-sra-datastore which should be accessible from BigQuery, but it doesn't show up when I search for it in the BigQuery console's search box.
If you are referring to nih-sra-datastore as explained on the National Center for Biotechnology Information page, it looks like that name refers to the project ID rather than a dataset as such.
So in order to have access to it you have to follow these steps (on BigQuery explorer):
Click on +ADD DATA
Hover over Pin a Project and select Enter a Project Name
Add nih-sra-datastore
This pins the public project to your BigQuery explorer.
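Once the project is pinned, you can query its public tables directly. As a sketch (the dataset and table names below, `sra.metadata`, are taken from memory of NCBI's documentation and may differ):

```sql
-- Query the pinned public project; the table name is assumed from NCBI docs.
SELECT *
FROM `nih-sra-datastore.sra.metadata`
LIMIT 10;
```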
I'm trying to set up a dashboard in Google Data Studio with Apple News analytics data as one of the sources.
I can see that this analytics data can be downloaded manually as a CSV. Does anyone know a way of automating this export? Automatically appending the data weekly to a BigQuery table would be ideal, or to Google Sheets, or directly into Data Studio if not.
Thanks.
You can load your CSV into BigQuery [1], or schedule a load job, and then use it in Data Studio through a BigQuery connector. Otherwise, if you do not need to append the data, you can simply import it with other connectors, such as "Custom JSON/CSV/XML" by Supermetrics.
[1] https://cloud.google.com/bigquery/docs/loading-data#supported_data_formats
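If you do load the CSVs into BigQuery, a recurring append could be sketched with the `LOAD DATA` SQL statement, assuming you first copy each weekly export into a Cloud Storage bucket (the bucket, dataset, and table names below are placeholders):

```sql
-- Append a weekly CSV export (already copied to GCS) to a BigQuery table.
-- All names are placeholders for illustration.
LOAD DATA INTO mydataset.apple_news_analytics
FROM FILES (
  format = 'CSV',
  skip_leading_rows = 1,
  uris = ['gs://my-bucket/apple-news/week-*.csv']
);
```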
I'm importing datasets into Google Cloud Dataprep (by Trifacta) to perform transformations on my data sources, but I can't see my Google Drive Sheets in the list after connecting them to the BigQuery console. I want to use them as rules for my transformations.
I've already created another dataset and the problem persists.
Is it possible to import them, or is this not supported yet?
Thanks,
You are right. According to the documentation, Dataprep only supports native BigQuery tables and views as BigQuery sources.
You could try downloading your Drive sheets as CSV and then creating a BigQuery table from them, or you could materialize your external table into a new native table, for example:
CREATE TABLE my_dataset.my_native_table AS
SELECT * FROM my_dataset.my_external_table;
We are moving our BigQuery data from the QA environment to the production environment.
For that, we have created a new Google account for the production environment.
How can we transfer wildcard table data from one Google account to another?
You can use Google Groups to copy tables very quickly between projects/datasets that belong to different Google accounts.
Set up a Google Group from the main Google account.
Invite the new Google account (as owner) to the Google Group.
Accept the invitation from the new Google account's Gmail.
Share the original dataset using the Google Group's email address. Under the dataset name, select the down arrow and pick Share dataset. Make sure to share with the group rather than the user, and make the group an owner (or you will not be able to copy tables).
From the new Google account, create a new project and dataset in BigQuery. Then add the old project ID to the new Google account under Switch to project / Display project (under the down arrow under the dataset name). You can now see the old project/dataset and all its tables from the new Google account, and from there you can copy any table from the old project to the new project/Google account. Even very large tables copy within seconds.
Edit: I think you need to use the old UI for this to work, since these options do not seem to be available in the new one yet.
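Once the old dataset is shared with the new account, individual tables (including each table behind a wildcard prefix) can also be copied with a `CREATE TABLE ... COPY` DDL statement, which is a metadata-level copy and therefore fast even for large tables. The project, dataset, and table names below are placeholders:

```sql
-- Copy one table of a wildcard set from the old project to the new one.
-- Run as the new account; all names are placeholders.
CREATE TABLE `new-project.prod_dataset.events_20200101`
COPY `old-project.qa_dataset.events_20200101`;
```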
You can move BigQuery data from your source account/project by exporting the BigQuery dataset to a GCS bucket and then importing the data into a new BigQuery dataset located in your destination account/project.
Export the data from your source BigQuery account/project to a regional or multi-region Cloud Storage bucket in the same location as your dataset.
Grant the required GCS and BigQuery permissions to the account that will be used to load the data in the destination account/project by using the IAM console.
Load your data into your destination BigQuery account/project based on the data format selected during the export task.
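The export and load steps above can be sketched in SQL: `EXPORT DATA` writes the table to Cloud Storage, and `LOAD DATA` reads it back in the destination project. The bucket, project, dataset, and table names below are placeholders:

```sql
-- Step 1: in the source project, export the table to GCS as CSV.
EXPORT DATA OPTIONS (
  uri = 'gs://my-transfer-bucket/export/my_table-*.csv',
  format = 'CSV',
  overwrite = true,
  header = true
) AS
SELECT * FROM `source-project.my_dataset.my_table`;

-- Step 2: in the destination project, load the exported files.
LOAD DATA INTO my_dataset.my_table
FROM FILES (
  format = 'CSV',
  skip_leading_rows = 1,
  uris = ['gs://my-transfer-bucket/export/my_table-*.csv']
);
```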
There are no charges for exporting data from BigQuery to Cloud Storage and vice versa; nevertheless, you do incur charges for storing the data. I suggest you take a look at the free operations in BigQuery to learn more about this.
I have a question regarding refreshing a Google BigQuery table whose data source is Google Drive.
Imagine you have a CSV file on Google Drive, and someone updates it for you every day.
1. The filename is not changing
2. location URI is same
How can I refresh my BigQuery table using this Google Drive file?
Could you please guide me or send me related links?
Thanks
From the BigQuery docs:
Loading data into BigQuery from Google Drive is not currently supported, but you can query data in Google Drive using an external table.
The link above provides instructions on how to create an external table that references your Drive-stored data source. Because an external table reads the current contents of the Drive file at query time, the daily updates are picked up automatically, with no refresh step needed. Since you want to keep querying a Google Drive file that is updated in place, this is the solution you are looking for (in contrast to downloading your CSV locally and loading it into BQ, in which case you would then have to apply the updates directly in BQ).
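Creating the external table can also be done with DDL. The statement below is a sketch: the dataset, table name, and Drive file URI are placeholders, and it assumes your session is authorized to access Drive, as the docs describe:

```sql
-- External table over a CSV stored in Google Drive; placeholders throughout.
CREATE EXTERNAL TABLE mydataset.drive_csv_table
OPTIONS (
  format = 'CSV',
  uris = ['https://drive.google.com/open?id=FILE_ID']
);
```

Each query against `mydataset.drive_csv_table` then reflects the file's latest contents.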