Compute Engine Label Information in Google Cloud Billing CSV file

In the Google Cloud Billing CSV file, I see that none of the labels associated with my Compute Engine instances appear in the exported file.
It only has a field named 'Project Labels'.
Is there any way to configure Google Cloud Billing so that resource-specific labels also appear in the exported CSV?

According to https://cloudplatform.googleblog.com/2017/12/use-labels-to-gain-visibility-into-GCP-resource-usage-and-spending.html, resource-specific labels are only available if you enable export of billing data to BigQuery.
From there you can construct your own queries to group the data as you like.
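For example, once the billing export to BigQuery is enabled, a query along the following lines groups cost by resource label. This is only a sketch: the project, dataset, and export table names (my-project.billing.gcp_billing_export_v1_XXXXXX) are placeholders you would replace with your own.
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

# The billing export table stores resource labels as a repeated (key, value) record.
query = """
SELECT
  label.key   AS label_key,
  label.value AS label_value,
  SUM(cost)   AS total_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`,
  UNNEST(labels) AS label
GROUP BY label_key, label_value
ORDER BY total_cost DESC
"""

for row in client.query(query).result():
    print(row.label_key, row.label_value, row.total_cost)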

Related

Importing from bigquery to google sheets limits to 10k rows

I'm trying to import a table from Google BigQuery to Google Sheets via
DATA > DATA CONNECTORS > BIG QUERY
but when I import it, it says LIMITED to 10,000 rows.
Is there any way to get past that limit?
At the moment, according to the documentation, the BigQuery Sheets connector has a limit of 10,000 rows. However, there is a workaround in case you want to overcome this limit.
You can use Google Cloud Storage (GCS) as a staging ground: export your data to GCS as a .csv file, then import it into Google Sheets. The steps are described below:
Exporting data from BigQuery to a .csv on Google Cloud Storage (GCS)
You can export your table to GCS manually using the console, the bq command-line tool, or one of the available APIs, here.
I must point out that you need the required permissions to export data to GCS. Also, pay attention to the limitations: you can export up to 1GB to a single data file, and the destination of the export has to be Cloud Storage.
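As a sketch of the API route, the Python client can run the extract job; the project, table, and bucket names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

# Export the table as a CSV file to a GCS bucket (both names are placeholders).
job = client.extract_table(
    "my-project.my_dataset.my_table",
    "gs://my-bucket/exports/my_table.csv",
)
job.result()  # wait for the extract job to finish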
Google Cloud Storage to Google Sheets using Google Cloud Functions
In order to import your .csv file to Google Sheets, you can create a Cloud Function that runs every time a new .csv file is uploaded to GCS and writes its contents to Google Sheets.
The following tutorials do exactly what I mentioned above; you can simply follow one of them, link 1 and link 2.
Doing so, you will be able to query all your data using Google Sheets and overcome the 10,000-row limitation of the BigQuery Sheets connector.
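For illustration, a Cloud Function along these lines could react to a GCS upload and push the rows into a sheet through the Sheets API. This is only a sketch under several assumptions: the function name, trigger bucket, and SPREADSHEET_ID are hypothetical, and the function's service account must have been granted edit access to the target spreadsheet.
import csv
import io

import google.auth
from google.cloud import storage
from googleapiclient.discovery import build  # pip install google-api-python-client

# Hypothetical target spreadsheet; replace with your own.
SPREADSHEET_ID = "your-spreadsheet-id"

def csv_to_sheet(event, context):
    # Background Cloud Function triggered when an object is finalized in the GCS bucket.
    bucket_name = event["bucket"]
    blob_name = event["name"]
    if not blob_name.endswith(".csv"):
        return  # ignore non-CSV uploads

    # Download the uploaded CSV from GCS and parse it into rows.
    blob = storage.Client().bucket(bucket_name).blob(blob_name)
    rows = list(csv.reader(io.StringIO(blob.download_as_text())))

    # Write the rows into the sheet, starting at A1, via the Sheets API.
    creds, _ = google.auth.default(scopes=["https://www.googleapis.com/auth/spreadsheets"])
    sheets = build("sheets", "v4", credentials=creds)
    sheets.spreadsheets().values().update(
        spreadsheetId=SPREADSHEET_ID,
        range="Sheet1!A1",
        valueInputOption="RAW",
        body={"values": rows},
    ).execute()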

Getting DoubleClick data from Google Analytics exported to BigQuery

I have GA360 and I have exported raw Google Analytics data to BigQuery through the integration. I'm just wondering if the DCM dimensions and metrics are part of the export?
Ideally these link
I can't find them as part of the export, so what would be the best way to access these dimensions for all my raw data? The Core Reporting API?
This Google Analytics documentation page shows the BigQuery Export schema for the Google Analytics data that is exported to BigQuery. As you will see, the DoubleClick Campaign Manager (DCM) data is not part of the BigQuery export, unlike the DoubleClick for Publishers (DFP) data, which does appear (these are all the hits.publisher.<DFP_METRIC> metrics).
Therefore, as explained by @BenP in the comments, you may be interested in the BigQuery Data Transfer Service, which has a feature for DoubleClick Campaign Manager transfers. Unfortunately, I am not an expert in Google Analytics or DCM, and therefore cannot add much relevant information about the process of linking both sets of data, but maybe you can try combining them yourself and then post a new question for that matter if you do not succeed in doing so.
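As a starting point for checking what publisher data is present in your export, a query along these lines unnests the hits and reads the hits.publisher record. This is only a sketch: the project and dataset names are placeholders, and the exact field names should be verified against the export schema documentation.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

# ga_sessions_YYYYMMDD is the daily GA360 export table; the dataset name is a placeholder.
query = """
SELECT
  h.publisher.dfpClicks      AS dfp_clicks,
  h.publisher.dfpImpressions AS dfp_impressions
FROM `my-project.my_ga_dataset.ga_sessions_20190101`,
  UNNEST(hits) AS h
WHERE h.publisher.dfpImpressions IS NOT NULL
LIMIT 10
"""

for row in client.query(query).result():
    print(row.dfp_clicks, row.dfp_impressions)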

Can we upload text file in big query?

I have a couple of questions regarding BigQuery:
1. Can we upload text files to Google Cloud Platform and retrieve required data? If yes, how is it possible?
2. My main aim is to upload a large amount of data to the cloud platform, analyse the data, and retrieve the desired information whenever required. The data can be both structured and unstructured.
Can we upload text files in Google Cloud Platform and retrieve required data?
Yes, you can.
If yes, how is it possible?
It is simple enough to get to this; you just need to first read the links below (a short example follows them):
BigQuery Home: https://cloud.google.com/bigquery/
Quickstarts: https://cloud.google.com/bigquery/docs/quickstarts
How-to Guides: https://cloud.google.com/bigquery/docs/how-to
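As a concrete illustration, loading a CSV text file into a BigQuery table with the Python client can look like the sketch below; the file, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,       # let BigQuery infer the schema
    skip_leading_rows=1,   # assumes the file has a header row
)

# Load a local text/CSV file into a table (names are placeholders).
with open("data.csv", "rb") as source_file:
    job = client.load_table_from_file(
        source_file, "my-project.my_dataset.my_table", job_config=job_config
    )
job.result()  # wait for the load job to complete

# The data can then be retrieved with ordinary SQL queries.
rows = client.query("SELECT * FROM `my-project.my_dataset.my_table` LIMIT 10").result()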

Is it possible to extract job from big query to GCS across project ids?

Hey guys, I'm trying to export a BigQuery table to Cloud Storage a la this example. It's not working for me at the moment, and I am worried that the reason is that the Cloud Storage bucket is in a different project than the BigQuery table. Is this actually doable? I can't see how using that template above.
Confirming:
You CAN have your table in Project A exported/extracted to a GCS bucket in Project B. You just need to make sure you have the proper permissions on both sides. At least:
READ for the respective dataset in Project A, and
WRITE for the respective bucket in Project B.
Please note: the data in the respective dataset of Project A and the bucket in Project B MUST be in the same location (US, EU, etc.).
Simply put: source and destination must be in the same location.
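A minimal Python sketch of the cross-project setup, assuming placeholder project, dataset, table, and bucket names, and assuming the caller's credentials hold the permissions listed above:
from google.cloud import bigquery

# The client can run in either project; here the extract job is billed to Project A.
client = bigquery.Client(project="project-a")

job = client.extract_table(
    "project-a.my_dataset.my_table",           # source table in Project A
    "gs://project-b-bucket/exports/table.csv"  # destination bucket owned by Project B
)
job.result()  # raises if permissions or locations do not match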

How to download all data in a Google BigQuery dataset?

Is there an easy way to directly download all the data contained in a certain dataset on Google BigQuery? I'm actually downloading it "as CSV", making one query after another, but that doesn't allow me to get more than 15k rows, and the rows I need to download number over 5M.
Thank you
You can run BigQuery extraction jobs using the Web UI, the command line tool, or the BigQuery API. The data can be extracted to Google Cloud Storage.
For example, using the command line tool:
First install and auth using these instructions:
https://developers.google.com/bigquery/bq-command-line-tool-quickstart
Then make sure you have an available Google Cloud Storage bucket (see Google Cloud Console for this purpose).
Then, run the following command:
bq extract my_dataset.my_table gs://mybucket/myfilename.csv
More on extracting data via API here:
https://developers.google.com/bigquery/exporting-data-from-bigquery
Detailed step-by-step to download large query output
1. Enable billing.
You have to give your credit card number to Google to export the output, and you might have to pay. But the free quota (1 TB of processed data) should suffice for many hobby projects.
2. Create a project.
3. Associate billing with the project.
4. Do your query.
5. Create a new dataset.
6. Click "Show options" and enable "Allow Large Results" if the output is very large.
7. Export the query result to a table in the dataset.
8. Create a bucket on Cloud Storage.
9. Export the table to the created bucket on Cloud Storage.
Make sure to select GZIP compression and use a name like <bucket>/prefix.gz. If the output is very large, the file name must have an asterisk * and the output will be split into multiple files.
10. Download the table from Cloud Storage to your computer.
(A Python sketch of this flow appears at the end of this answer.)
It does not seem possible to download multiple files from the web interface if the large file got split up, but you can install gsutil and run:
gsutil -m cp -r 'gs://<bucket>/prefix_*' .
See also: Download files and folders from Google Storage bucket to a local folder
There is a gsutil package in Ubuntu 16.04, but it is an unrelated package.
You must install and set it up as documented at: https://cloud.google.com/storage/docs/gsutil
unzip locally:
for f in *.gz; do gunzip "$f"; done
Here is a sample project I needed this for, which motivated this answer.
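The same flow can also be driven through the Python client. The sketch below assumes placeholder project, dataset, table, and bucket names; it writes the query result to a table, then extracts that table to GCS as GZIP-compressed, wildcard-split CSV files.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

# 1. Run the query and write the result to a destination table.
dest_table = bigquery.TableReference.from_string("my-project.my_dataset.query_output")
query_job = client.query(
    "SELECT * FROM `my-project.my_dataset.big_table`",
    job_config=bigquery.QueryJobConfig(destination=dest_table),
)
query_job.result()

# 2. Extract the table to GCS; the * wildcard splits large output into multiple files.
extract_job = client.extract_table(
    dest_table,
    "gs://my-bucket/prefix_*.csv.gz",
    job_config=bigquery.ExtractJobConfig(
        destination_format=bigquery.DestinationFormat.CSV,
        compression=bigquery.Compression.GZIP,
    ),
)
extract_job.result()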
For Python you can use the following code; it will download the data as a pandas DataFrame.
from google.cloud import bigquery

def read_from_bqtable(bq_projectname, bq_query):
    # Run the query in the given project and return the result as a DataFrame.
    client = bigquery.Client(project=bq_projectname)
    bq_data = client.query(bq_query).to_dataframe()
    return bq_data

bigQueryTableData_df = read_from_bqtable('gcp-project-id', 'SELECT * FROM `gcp-project-id.dataset-name.table-name`')
Yes, the steps suggested by Michael Manoochehri are the correct and easy way to export data from Google BigQuery.
I have written a bash script so that you do not have to go through these steps every time; just use my bash script.
Below is the GitHub URL:
https://github.com/rajnish4dba/GoogleBigQuery_Scripts
Scope:
1. Export data based on your BigQuery SQL.
2. Export data based on your table name.
3. Transfer your export file to an SFTP server.
Try it and let me know your feedback.
For help, use ExportDataFromBigQuery.sh -h