I have GA360 and I have exported raw Google Analytics data to BigQuery through the integration. I'm just wondering if the DCM dimensions and metrics are part of the export?
Ideally, these would be linked to the exported GA data. I can't find them as part of the export, so what would be the best way to access these dimensions for all my raw data? The Core Reporting API?
This Google Analytics documentation page shows the BigQuery Export schema for the Google Analytics data that is exported to BigQuery. As you will see, the DoubleClick Campaign Manager (DCM) data is not part of the BigQuery export, unlike the DoubleClick for Publishers (DFP), which does appear (these are all the hits.publisher.<DFP_METRIC> metrics).
Therefore, as explained by @BenP in the comments, you may be interested in the BigQuery Data Transfer service, which does have a feature for DoubleClick Campaign Manager transfers. Unfortunately, I am not an expert in Google Analytics or DCM, and therefore cannot add much relevant information about the process of linking both sets of data, but maybe you can try combining them yourself and then post a new question on that matter if you do not succeed in doing so.
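If you do set up a DCM transfer into the same BigQuery project, a rough sketch of how joining it against the GA export might look is below; the DCM dataset, table, and column names here are assumptions, since the actual transfer schema depends on your configuration, and only the `ga_sessions_*` fields come from the documented export schema:

```python
from google.cloud import bigquery

# Sketch only: joins the GA360 export (ga_sessions_*) with a hypothetical
# DCM transfer table on campaign name. Adjust dataset/table/column names
# to match your own export and transfer configuration.
client = bigquery.Client(project="my-project")  # placeholder project id

query = """
SELECT
  ga.date,
  ga.trafficSource.campaign AS ga_campaign,
  dcm.campaign_name,                                       -- assumed DCM column
  SUM(ga.totals.visits) AS sessions
FROM `my-project.ga_dataset.ga_sessions_*` AS ga
LEFT JOIN `my-project.dcm_dataset.dcm_campaigns` AS dcm    -- hypothetical table
  ON ga.trafficSource.campaign = dcm.campaign_name
WHERE _TABLE_SUFFIX BETWEEN '20230101' AND '20230131'
GROUP BY 1, 2, 3
"""

for row in client.query(query).result():
    print(dict(row))
```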
Hey, I am trying to create some batch jobs that read from a couple of Salesforce objects and push them to BQ. Every time the batch process runs, it truncates the table in BQ and pushes all the data in the SF object back into BQ. Is it possible for Google Data Fusion to automatically detect changes in an object in Salesforce (like adding a new column or changing the data type of a column) so that they are registered and pushed to BQ via Google Data Fusion?
For the SF side of the puzzle you could look into https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/resources_describeGlobal.htm and the If-Modified-Since header, which tells you whether the definition of the table(s) changed. That URL covers all tables in the org; alternatively, you can run table-specific metadata describe calls with https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/resources_sobject_describe.htm
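As a rough illustration, something like the following could poll that describeGlobal endpoint with an If-Modified-Since header; the instance URL, API version, and token handling are placeholders:

```python
import requests

# Placeholders: use your own instance URL, API version and OAuth access token.
INSTANCE = "https://yourInstance.my.salesforce.com"
TOKEN = "00D...access_token"

headers = {
    "Authorization": f"Bearer {TOKEN}",
    # RFC 1123 timestamp of the last time you checked.
    "If-Modified-Since": "Mon, 01 Jan 2024 00:00:00 GMT",
}

# describeGlobal: a 304 response means no sObject definitions changed since then.
resp = requests.get(f"{INSTANCE}/services/data/v57.0/sobjects/", headers=headers)

if resp.status_code == 304:
    print("No object definitions changed - skip schema sync")
else:
    for sobj in resp.json()["sobjects"]:
        print(sobj["name"], sobj["custom"])
```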
But I can't tell you how to use it in your job.
You can use the answer provided by @eyescream as the condition or trigger for the update to BigQuery. You may push changes to BigQuery using the pre-built Streaming Source plugin approach from Data Fusion which, as mentioned in this documentation,
tracks updates in Salesforce sObjects. Examples of sObjects are
opportunities, contacts, accounts, leads, any custom object, etc.
You may use this approach to automatically track changes and push them to BigQuery. You can also find the whole Salesforce Streaming Source configuration reference in this documentation, which Google's official documentation also links to.
However, if you want a more dynamic approach for your overall use case, you may also use the integration of BigQuery with Salesforce. In this approach you will need to build your own code, in which you can also use @eyescream's answer as the primary condition/trigger and then automatically push the update to your BigQuery schema.
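For example, a rough sketch of that custom-code route could look like the following, where new Salesforce fields (say, detected via the describe call from @eyescream's answer) are appended to the BigQuery table schema; the dataset, table, and field names are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client()
table = client.get_table("my_dataset.sf_contacts")   # placeholder table

# Suppose these came from a Salesforce describe call for the Contact object.
salesforce_fields = {"Id": "STRING", "Email": "STRING", "NewCustomField__c": "STRING"}

existing = {field.name for field in table.schema}
missing = [
    bigquery.SchemaField(name, field_type)
    for name, field_type in salesforce_fields.items()
    if name not in existing
]

if missing:
    # BigQuery only allows adding columns, so append them to a copy of the schema.
    table.schema = list(table.schema) + missing
    client.update_table(table, ["schema"])
    print(f"Added columns: {[f.name for f in missing]}")
```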
I just cleaned up my firestore collection data using DataPrep and verified the data via BigQuery. I now want to move the data back to Firestore. Is there a way to do this?
I have used a manual method of exporting to JSON and then uploading it using code provided by AngularFirebase. But it is not automated, and there is a need to periodically clean up this data.
I am looking for a process within the Google Cloud console. Any help will be appreciated.
This is not an answer, more like a partial answer. I could not add a comment as I don't have 50 reputation yet. I am in a similar boat but not entirely the same situation, in that I want to use a subset of BigQuery data and add it to Firestore. My thinking is to do the following:
Use the BigQuery API to query the data periodically using BigQuery Jobs' Load (in your case) or Query (in my case)
Convert it to JSON in code
Use batch commit in Firestore's API to update the firestore database
This is my idea and I am not sure whether it will work, but I will let you know more once I am done with this, unless someone else has better insights to help me and the person asking this question.
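A rough, untested sketch of those three steps; the project, table, collection, and key field names are placeholders:

```python
from google.cloud import bigquery, firestore

bq = bigquery.Client()
db = firestore.Client()

# 1. Query (or load) the cleaned data from BigQuery.
rows = bq.query("SELECT * FROM `my_dataset.cleaned_collection`").result()  # placeholder table

# 2./3. Convert each row to a dict and write it back to Firestore in batches.
#       Firestore batches are limited to 500 writes, so chunk accordingly.
batch = db.batch()
count = 0
for row in rows:
    doc = dict(row)
    doc_ref = db.collection("my_collection").document(str(doc["id"]))  # assumed key field
    batch.set(doc_ref, doc)
    count += 1
    if count % 500 == 0:
        batch.commit()
        batch = db.batch()

if count % 500:
    batch.commit()
```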
In the Google Cloud Billing CSV file, I see that none of the labels associated with the Compute Engine instances appear.
It only has a field named 'Project Labels'.
Is there any way to configure Google Cloud billing so that resource-specific labels also appear in the exported CSV?
According to https://cloudplatform.googleblog.com/2017/12/use-labels-to-gain-visibility-into-GCP-resource-usage-and-spending.html, resource-specific labels are only available if you enable export of billing data to BigQuery.
From there you can construct your own queries to group the data as you like.
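For instance, once the billing export to BigQuery is enabled, a query along these lines groups cost by a resource label; the export table name and label key are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client()

# The billing export table exposes resource labels as a repeated key/value field.
query = """
SELECT
  label.value AS environment,
  SUM(cost) AS total_cost
FROM `my-project.billing_dataset.gcp_billing_export_v1_XXXXXX`,  -- placeholder table
  UNNEST(labels) AS label
WHERE label.key = 'environment'                                   -- placeholder label key
GROUP BY environment
ORDER BY total_cost DESC
"""

for row in client.query(query).result():
    print(row.environment, row.total_cost)
```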
I have a couple of questions regarding BigQuery:
1. Can we upload text files to Google Cloud Platform and retrieve the required data? If yes, how is this possible?
2. My main aim is to upload a large amount of data to the cloud platform, analyse the data, and retrieve the desired information whenever required. The data can be both structured and unstructured.
Can we upload text files in google cloud platform and retrieve
required data??
Yes, you can.
If yes how is it possible
It is simple enough to get started with this - you just need to first read the material below:
BigQuery Home: https://cloud.google.com/bigquery/
Quickstarts: https://cloud.google.com/bigquery/docs/quickstarts
How-to Guides: https://cloud.google.com/bigquery/docs/how-to
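For instance, a minimal sketch of loading a CSV text file from Cloud Storage into a BigQuery table and querying it back; the bucket, dataset, and table names are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,        # let BigQuery infer the schema from the file
    skip_leading_rows=1,    # assumes a header row
)

# Placeholder bucket / dataset / table names.
load_job = client.load_table_from_uri(
    "gs://my-bucket/data.csv",
    "my_dataset.my_table",
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

# Retrieve the data back with a simple query.
for row in client.query("SELECT * FROM `my_dataset.my_table` LIMIT 10").result():
    print(dict(row))
```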
The task is related to the share market.
Currently, the Omnesys NEST trading terminal provides streaming data for NSE, MCX, and so on. It also provides an option to link the live streaming share data to an Excel sheet, and the market changes are updated in the Excel sheet every second.
This is the function used in Excel to read the data from the NEST terminal:
=RTD("nest.scriprtd",,"mcx_fo|GOLDM15SEPFUT","LTP")
Can anybody help me to extract the live streaming data?
Can you be more specific? Do you want to develop some live trading strategy using the live data, or do you want to store this data in some database, or ...?
As an example, you can use a pandas DataFrame to retrieve this data into your Python program and process it.
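A rough sketch of that idea, assuming the NEST-linked workbook is saved where your script can read it; the file name, sheet name, column names, and polling interval are all assumptions, and the RTD formulas only refresh while Excel itself is open:

```python
import time
import pandas as pd

# Placeholder workbook/sheet written by the NEST-linked Excel file
# (reading .xlsx requires the openpyxl package).
WORKBOOK = "nest_market_watch.xlsx"

while True:
    # Read the current snapshot of the market-watch sheet.
    df = pd.read_excel(WORKBOOK, sheet_name="MarketWatch")
    print(df[["Symbol", "LTP"]])   # assumed column names
    time.sleep(1)                  # poll roughly once per second
```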
UPD:
The NSE NOW terminal is based on Omnesys NEST. I have automated this using AutoIt; check out this link: . The live streaming data from the market watch of Omnesys NEST can be extracted using AutoIt.
The GitHub repo for the program is here.