Team,
We have a daily data pull from Google Analytics; we ingest Source, Geo, Geo Network, Device & AdWords data into SQL Server for reporting purposes.
Today we had a production failure because of overly long keyword dimension data from Google. Is there a way to find the maximum data size of each column, along with the full list of dimensions & metrics? Please share if there is a place where this is all defined.
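As far as I know, Google's Dimensions & Metrics Explorer lists every GA dimension and metric but does not document maximum lengths for free-text dimensions such as keyword. One defensive pattern on the SQL Server side is to stage into a wide column and truncate explicitly when loading the reporting table. A hedged sketch (T-SQL; all table and column names are placeholders, not your actual schema):

```sql
-- Stage GA text dimensions into NVARCHAR(MAX) so no daily pull fails on
-- length, then truncate explicitly when loading the reporting table.
CREATE TABLE dbo.ga_keyword_stage (
    keyword NVARCHAR(MAX),
    clicks  INT
);

INSERT INTO dbo.ga_keyword_report (keyword, clicks)
SELECT LEFT(keyword, 500), clicks  -- keep only the first 500 characters
FROM dbo.ga_keyword_stage;
```

This trades silent truncation for pipeline stability; log rows where `LEN(keyword) > 500` if you need to audit what was cut.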
Thanks
Senthil
Related
We are using the Google Ads transfer in BigQuery to ingest our Google Ads data. One thing I have noticed when querying the results is that all of the metrics are exactly 156x the values we would expect in the Google Ads UI (cost, clicks, etc.).
We have tested multiple transfers and each time we have the same issue. The transfer process seems pretty straightforward, but am I missing something? Has anyone else noticed a similar issue, or have any ideas of what to adjust in the data transfer?
For which tables do you notice this behavior?
The dimension tables, such as Customer, Campaign and AdGroup, are exported every day and are therefore partitioned by day.
Could this be the cause of your duplication?
You only need the latest partition/day.
So this is for example how I get the latest account / customer data:
SELECT
  -- Main reason I cast all the IDs to STRING is so the BI reporting tool
  -- sees them as dimension fields rather than metrics.
  CAST(customer_id AS STRING) AS account_id, -- globally unique, see also: https://developers.google.com/google-ads/api/docs/concepts/api-structure
  customer_descriptive_name,
  customer_auto_tagging_enabled,
  customer_currency_code,
  customer_manager,
  customer_test_account,
  customer_time_zone,
  _DATA_DATE AS date, -- source table is partitioned on date
  _LATEST_DATE,
  CASE WHEN _DATA_DATE = _LATEST_DATE THEN TRUE ELSE FALSE END AS is_most_recent_record
FROM
  `YOURPROJECTID.google_ads.ads_Customer_YOURID`
WHERE
  _DATA_DATE = _LATEST_DATE
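If the ~156x inflation in the question above comes from joining a stats table to one of these day-partitioned dimension tables, each stats row matches one dimension row per retained day; filtering the dimension side to its latest partition before the join removes the multiplication. A hedged sketch (the `ads_CampaignBasicStats_…` table and `metrics_*` column names are assumptions based on the standard Google Ads transfer schema; verify against your dataset):

```sql
-- Dedupe the dimension table to its latest daily snapshot before joining,
-- so each stats row matches exactly one campaign row.
SELECT
  c.campaign_name,
  SUM(s.metrics_clicks) AS clicks,
  SUM(s.metrics_cost_micros) / 1e6 AS cost
FROM
  `YOURPROJECTID.google_ads.ads_CampaignBasicStats_YOURID` AS s
JOIN (
  SELECT campaign_id, campaign_name
  FROM `YOURPROJECTID.google_ads.ads_Campaign_YOURID`
  WHERE _DATA_DATE = _LATEST_DATE
) AS c
USING (campaign_id)
GROUP BY
  c.campaign_name
```

A quick sanity check: if the ratio to the UI numbers equals the number of daily partitions in the dimension table, the unfiltered join is almost certainly the culprit.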
I have a dashboard connected to a BigQuery table. BI Engine works as expected because I am using a calendar filter and my table is partitioned by date.
When I select a longer date range, BI Engine stops working with the message "The table or data volume was larger than BI Engine supports at this time", which is fair.
Please note that I am already filtering by a partition, but sometimes I need to see the whole data set.
To solve that, I created a BI Engine reservation, but I noticed that regardless of the size (1, 2 or 4 GB) the memory used is always 600 MB, and I get the same message. I attached a screenshot here; is this by design?
Bug Report here: https://issuetracker.google.com/issues/150633500
It turns out the error is not related to the reservation, but to the fact that BI Engine supports only 500 partitions, and my table has more:
https://cloud.google.com/bi-engine/docs/overview#limitations
The solution: instead of partitioning per day, I will partition by something like week or month.
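The repartitioning can be done by rebuilding the table, since BigQuery supports partitioning on a truncated date. A hedged sketch (project, dataset, table and column names are all placeholders; check that your BigQuery edition supports `DATE_TRUNC` partitioning):

```sql
-- Rebuild the table partitioned by month instead of day: ~12 partitions
-- per year instead of 365, staying well under BI Engine's 500-partition cap.
CREATE TABLE `YOURPROJECTID.dataset.events_monthly`
PARTITION BY DATE_TRUNC(event_date, MONTH) AS
SELECT *
FROM `YOURPROJECTID.dataset.events_daily`
```

Partition pruning still works for the calendar filter; queries just scan at month granularity instead of day granularity.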
We are using Google Data Studio to create mobile analytics reports from Google Firebase data, linked with BigQuery. The report has a data source that uses a query to pull data from BigQuery - SELECT * FROM table.events_* - which returns 69.4 GB of data (verified in the validator). The problem is that when we create a report using this query, for each report we are charged for 'BigQuery Analysis' in tebibytes, which is way too much. But when we calculate the pricing for the query itself, it is not even $1 for the data that we use.
Not sure why the data is processed in tebibytes. Here are some details about the table in BigQuery -
Table size: 246.69 MB
Number of columns: 57
I tried the query with some filters as well, but it still processes tebibytes of data for a single report. Is reducing or filtering the number of columns the only way to restrict the data processing? What transactions come under BigQuery Analysis (other than query processing)?
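One thing worth checking, as a hedged guess: `SELECT * FROM table.events_*` scans every column of every daily shard, and Data Studio may re-run the query per chart and per viewer, so the billed bytes multiply quickly. Selecting only the columns a report needs and bounding the shard range with `_TABLE_SUFFIX` cuts the bytes billed on both axes. A sketch (project/dataset names are placeholders; the columns shown are from the standard Firebase export schema):

```sql
-- Scan only the needed columns, and only the shards in the date range.
SELECT
  event_date,
  event_name,
  user_pseudo_id
FROM
  `YOURPROJECTID.analytics_YOURID.events_*`
WHERE
  _TABLE_SUFFIX BETWEEN '20200101' AND '20200131'
```

Since BigQuery bills by columns read, not rows returned, a row filter alone (without dropping columns) does little for a 57-column table.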
Your help is greatly appreciated. Thanks in advance
We are trying to solve an issue with costs and revenues from DoubleClick Campaign Manager / Campaign Manager.
The goal is to create a daily dashboard of media costs (paid search, display, video and social) using Google Cloud.
We currently have access to Facebook data, Google Analytics and Campaign Manager. The issue is with the last one.
For Campaign Manager, a bucket outside our organization has been added to our organization via Data Transfer V2.0. We have access to the impressions, clicks, activity and match tables as CSVs in Storage, and likewise in BigQuery.
We have date, clicks, impressions, cities, ad name, etc., but the cost metrics only contain 0.
By costs I mean how much we paid per impression. In Revenue, DBM Cost, Total Media Cost and so on (Advertiser, Partner or Account Currency), we only get 0.
We asked Google for help: they told us to verify that Campaign Manager and DV360 are linked in Data Transfer.
They told us it should work, but we still get 0 for Revenues and Costs.
We should see, for instance, 32.00 instead of 0. Do you have any idea how to solve this issue?
Best,
Theo
If after this solution you still do not get any data, I recommend sending your offline reports to BigQuery directly.
You can do it with the following steps:
Create a dataset in BigQuery and copy its ID.
Then go to Settings in CM360 and activate the BigQuery export.
Copy the IAM email account that the CM settings return to you.
Then go to BigQuery and grant this account editor permissions on your dataset only.
After all this process, the option will be available to activate in the delivery options of CM360's offline reports.
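The dataset-level editor grant mentioned above can also be done in SQL with BigQuery's DCL, as a hedged sketch (the dataset name and service-account address are placeholders; use the account the CM360 settings actually return to you):

```sql
-- Grant the CM360 export service account editor access to one dataset only,
-- rather than project-wide permissions.
GRANT `roles/bigquery.dataEditor`
ON SCHEMA `YOURPROJECTID.cm360_export`
TO "serviceAccount:cm-export@example.iam.gserviceaccount.com";
```

Scoping the grant to the single export dataset keeps the external account out of the rest of the project.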
Have you tried enabling "Report Display & Video 360 cost to Campaign Manager" from Basic Details at the Partner level in DV360?
I started to test Google AdWords transfers for BigQuery (https://cloud.google.com/bigquery/docs/adwords-transfer).
I have a few questions for which I cannot find answers anywhere.
Is it possible to edit which columns are downloaded from AdWords to BigQuery? E.g. the Keyword report has only an ad group ID column, but not the ad group's text name.
Or is it possible to choose which tables (= reports) are downloaded? The transfer creates around 60 tables and I need just 5...
DZ
According to the documentation linked above, the AdWords data transfer stores your AdWords data in a dataset. The inputs are AdWords customer IDs (at least one), and the output is a collection of datasets.
I think you would need a custom downstream process (e.g. built on Pub/Sub) to store only specific columns or tables in BigQuery.
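A lighter alternative to a custom pipeline, as far as I know, is to leave the transfer as-is and layer views over just the tables you need, which also lets you join the ad group name the Keyword report lacks. A hedged sketch (all table and column names here are assumptions based on the legacy AdWords transfer schema; verify against your dataset):

```sql
-- Expose only the columns you need, and add the ad group name that the
-- Keyword report itself does not carry. Table/column names are assumed.
CREATE OR REPLACE VIEW `YOURPROJECTID.reporting.keyword_with_adgroup` AS
SELECT
  k.Criteria AS keyword,
  CAST(k.AdGroupId AS STRING) AS ad_group_id,
  a.AdGroupName AS ad_group_name
FROM
  `YOURPROJECTID.adwords.Keyword_YOURCUSTOMERID` AS k
JOIN
  `YOURPROJECTID.adwords.AdGroup_YOURCUSTOMERID` AS a
USING (AdGroupId)
WHERE
  k._DATA_DATE = k._LATEST_DATE
  AND a._DATA_DATE = a._LATEST_DATE
```

The ~55 tables you do not need can simply be ignored; views cost nothing to define, and reporting tools only see the five views you expose.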