GA4 vs BigQuery - user counts don't match

I have extracted from BigQuery the active_users and totalusers on 31/12/2022, grouped by CampaignName and Country, using the following query:
SELECT
  COUNT(DISTINCT CASE
          WHEN (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'engagement_time_msec') > 0
            OR (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'session_engaged') = '1'
          THEN user_pseudo_id
        END) AS active_users,
  COUNT(DISTINCT user_pseudo_id) AS totalusers,
  traffic_source.name AS CampaignName,
  geo.country AS Country
FROM `independent-tea-354108.analytics_254831690.events_20221231`
GROUP BY
  traffic_source.name,
  geo.country
The result filtered by CampaignName='(organic)' was:
(https://i.stack.imgur.com/LMQAH.png)
But when I compare with the data from GA4, it doesn't match, and the difference is huge (around 15,000 more active_users in GA4 than in BigQuery). Note that this is for a single day; over a month the difference would be even larger:
(https://i.stack.imgur.com/8arYs.png)
I've tried filtering by other CampaignNames as well: not a single value matches, and the differences are always huge.

These are two common reasons for GA4-to-BigQuery differences; you have probably already looked at them.
Check your source table for blank user_pseudo_ids (a quick check is sketched below). If you have consent mode on your website, those users may be counted in GA4 but not in BigQuery, and this can cause big differences.
Time zone is another area that can make a difference: BigQuery is always in UTC, while your GA4 property may not be.
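For the blank-id check, something along these lines (a sketch against the same events table as in the question) would quantify the problem:
-- Sketch: count events exported without a usable user_pseudo_id.
SELECT
  COUNTIF(user_pseudo_id IS NULL OR user_pseudo_id = '') AS events_missing_pseudo_id,
  COUNT(*) AS total_events
FROM `independent-tea-354108.analytics_254831690.events_20221231`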
I hope these help.

Related

Results within BigQuery do not match those in GA4

I'm running the query below in BigQuery to see how many users I had from August 1st to August 14th, but the number doesn't match what GA4 shows me.
WITH event AS (
  SELECT
    user_id,
    event_name,
    PARSE_DATE('%Y%m%d', event_date) AS event_date,
    TIMESTAMP_MICROS(event_timestamp) AS event_timestamp,
    ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY TIMESTAMP_MICROS(event_timestamp) DESC) AS rn
  FROM `events_*`
  WHERE event_name = 'push_received')
SELECT COUNT(DISTINCT user_id)
FROM event
WHERE event_date >= '2022-08-01'
GA4 result
BQ result = 37024
There are quite a few reasons why your GA4 data in the web UI will not match the BigQuery export or the Data API.
In this case, I believe you are running into the time zone issue: event_date is the date on which the event was logged, in the registered time zone of your property, whereas event_timestamp is the UTC time at which the event was logged by the client.
To resolve this, simply update your query with:
EXTRACT(DATETIME FROM TIMESTAMP_MICROS(`event_timestamp`) at TIME ZONE 'TIMEZONE OF YOUR PROPERTY' )
Your data should then match the WebUI and the GA4 Data API. This post that I co-authored goes into more detail on this and other reasons why your data doesn't match: https://analyticscanvas.com/3-reasons-your-ga4-data-doesnt-match/
You cannot simply compare totals. Divide it into daily comparisons and look at details.
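Applied to the query in the question, the fix could look like this (a sketch; 'America/Sao_Paulo' is only an example time zone, substitute your property's actual one):
-- Sketch: convert event_timestamp to the property's time zone before
-- filtering by date. 'America/Sao_Paulo' is a placeholder.
SELECT COUNT(DISTINCT user_id) AS users
FROM `events_*`
WHERE event_name = 'push_received'
  AND EXTRACT(DATE FROM TIMESTAMP_MICROS(event_timestamp)
              AT TIME ZONE 'America/Sao_Paulo') >= DATE '2022-08-01'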

Disagreement between BigQuery and Google Analytics 4 for Pageviews - why?

I have a large table of Google Analytics 4 (GA4) events in BigQuery for a bunch of websites I look after. The table has the following schema:
field name                    type
event_date                    date
event_timestamp               integer
event_name                    string
event_key                     string
event_string_value            string
event_int_value               integer
event_float_value             float
event_double_value            float
user_pseudo_id                string
user_first_touch_timestamp    integer
device_category               string
device_model_name             string
device_host_name              string
device_web_hostname           string
geo_country                   string
geo_city                      string
traffic_source_name           string
I query the table to get the total number of pageviews for a specific site using the following query:
with date_range as (
  select
    '20220601' as start_date,
    '20220630' as end_date)
select
  count(distinct case when event_name = 'page_view'
        then concat(user_pseudo_id, cast(event_timestamp as string)) end) as pageviews
from
  `project_name.dataset_name.table_name`,
  date_range
where
  event_date between parse_date('%Y%m%d', date_range.start_date)
    and parse_date('%Y%m%d', date_range.end_date)
  and device_web_hostname in ("www.website_name.com")
What is a mystery to me is that when I do this for some sites, the figure for page_views is out by several hundred pageviews, with the BigQuery figure being the higher one. What is interesting is that:
If I try other events, such as sessions, there are no issues
As stated, it is only for some sites and not all
I know enough to know:
These numbers are never going to agree, but they shouldn't be out by several hundred either
GA4 has the unprocessed data, so the way I am querying the data is different to how it is being processed in the GA4 interface
I have tried:
Looking at the GA4 documentation to see how pageviews are used/processed; I can't see anything that enlightens me
Debugging each site to make sure tags are firing correctly; they are
I've hit a bit of a wall with this and I'd be grateful if anyone has any insight to point me in another possible direction. Thanks in advance!
The issue lies in this following part of the code:
select
count(distinct case when event_name = 'page_view' then concat(user_pseudo_id, cast(event_timestamp as string)) end) as pageviews
You are counting distinct values of the concatenation of user_pseudo_id and event_timestamp, which is not unique: several page_view events can share both. You also need a session id on top of that to get a unique hit.
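A sketch of what that could look like, assuming the session id is available as a column in your flattened table (in the raw GA4 export it is the 'ga_session_id' event parameter):
-- Sketch: include a session id in the distinct key so that page_views
-- sharing a user_pseudo_id and event_timestamp are still told apart.
-- ga_session_id is an assumed column here, not part of the schema above.
select
  count(distinct case when event_name = 'page_view'
        then concat(user_pseudo_id, '-', cast(ga_session_id as string),
                    '-', cast(event_timestamp as string)) end) as pageviews
from `project_name.dataset_name.table_name`
where device_web_hostname in ("www.website_name.com")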

How to calculate "Daily user engagement" on Big Query and get the same result that Firebase shows in its dashboard?

I'm trying to calculate the amount of time the average user spent in my app on a particular day.
I just read this great post from Todd Kerpelman on the Firebase Blog that explains how to do it. With a little modification of his query I reached the solution below:
SELECT AVG(user_daily_engagement_time/1000/60)
FROM (
  SELECT user_pseudo_id, event_date, SUM(engagement_time) AS user_daily_engagement_time
  FROM (
    SELECT user_pseudo_id, event_date,
      (SELECT value.int_value FROM UNNEST(event_params)
       WHERE key = "engagement_time_msec") AS engagement_time
    FROM `mydataset.events_20201104`)
  WHERE engagement_time > 0
  GROUP BY 1, 2)
The problem is that when I compare it with Firebase Analytics (for the same period), it doesn't match: the BigQuery answer is more than 2 minutes higher than the Firebase dashboard's.

Show transactions from a user who saw X page/s in their session

I am working with Google Analytics data in BigQuery.
I'd like to show a list of transaction IDs from users who visited a particular page on a website during their session. I've unnested hits.page.pagePath in order to identify that page, but since I don't know which row the actual transaction ID will occur on, I am having trouble returning meaningful results.
My code looks like this, but it returns 0 results, because all the transaction IDs are NULL: they do not occur on the rows where the page path meets the AND hits.page.pagePath LIKE "%clear-out%" condition:
SELECT hits.transaction.transactionId AS orderid
FROM `xxx.xxx.ga_sessions_20*` AS t
CROSS JOIN UNNEST(hits) AS hits
WHERE PARSE_DATE('%y%m%d', _table_suffix) BETWEEN
    DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY) AND
    DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
  AND totals.transactions > 0
  AND hits.page.pagePath LIKE "%clear-out%"
  AND hits.transaction.transactionId IS NOT NULL
How can I say, for example: return the transaction IDs for all sessions in which the user viewed a page where hits.page.pagePath LIKE "%clear-out%"?
When cross joining, you're repeating the whole session for each hit. Use the nested info per hit to look for your page - not the cross-joined hits.
You're unfortunately giving the hits array and its unnested alias the same name; it's better to keep them separate. Here's what it could look like:
SELECT
h.transaction.transactionId AS orderId
--,ARRAY( (SELECT AS STRUCT hitnumber, page.pagePath, transaction.transactionId FROM t.hits ) ) AS hitInfos -- test: show all hits in this session
FROM
`google.com:analytics-bigquery.LondonCycleHelmet.ga_sessions_20130910` AS t
CROSS JOIN t.hits AS h
WHERE
totals.transactions > 0 AND h.transaction.transactionId IS NOT NULL
AND
-- use the repeated hits nest (not the cross joined 'h') to check all pagePaths in the session
(SELECT LOGICAL_OR(page.pagePath LIKE "/helmets/%") FROM t.hits )
LOGICAL_OR() is an aggregation function for OR - so if any hit matches the condition, it returns TRUE.
(This query uses the openly available GA data from Google. It's a bit old but good to play around with.)
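As a minimal standalone illustration of LOGICAL_OR:
-- Returns TRUE because at least one element matches the condition.
SELECT LOGICAL_OR(path LIKE '/helmets/%') AS saw_helmets_page
FROM UNNEST(['/home', '/helmets/road', '/basket']) AS path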

SQL: Average value per day

I have a table called 'tweets'. The table includes (amongst others) the columns 'tweet_id', 'created_at' (dd/mm/yyyy hh:mm:ss), 'classified' and 'processed_text'. Within the 'processed_text' column there are certain strings such as '{TICKER|IBM}', to which I will refer as ticker-strings.
My target is to get the average value of 'classified' per ticker-string per day. The column 'classified' contains the numerical values -1, 0 and 1.
At this moment, I have a working SQL query for the average value of ‘classified’ for one ticker-string per day. See the script below.
SELECT Date( `created_at` ) , AVG( `classified` ) AS Classified
FROM `tweets`
WHERE `processed_text` LIKE '%{TICKER|IBM}%'
GROUP BY Date( `created_at` )
There are however two problems with this script:
It does not include days on which there were zero ‘processed_text’s like {TICKER|IBM}. I would however like it to spit out the value zero in this case.
I have 100+ different ticker-strings and would thus like to have a script which can process multiple strings at the same time. I can also do them manually, one by one, but this would cost me a terrible lot of time.
When I had a similar question for counting the ‘tweet_id’s per ticker-string, somebody else suggested using the following:
SELECT d.date, coalesce(IBM, 0) as IBM, coalesce(GOOG, 0) as GOOG,
coalesce(BAC, 0) AS BAC
FROM dates d LEFT JOIN
(SELECT DATE(created_at) AS date,
COUNT(DISTINCT CASE WHEN processed_text LIKE '%{TICKER|IBM}%' then tweet_id
END) as IBM,
COUNT(DISTINCT CASE WHEN processed_text LIKE '%{TICKER|GOOG}%' then tweet_id
END) as GOOG,
COUNT(DISTINCT CASE WHEN processed_text LIKE '%{TICKER|BAC}%' then tweet_id
END) as BAC
FROM tweets
GROUP BY date
) t
ON d.date = t.date;
This script worked perfectly for counting the tweet_ids per ticker-string. As I stated, however, I am now looking for the average classified scores per ticker-string. My question is therefore: could someone show me how to adjust this script in such a way that I can calculate the average classified scores per ticker-string per day?
-- Extract the ticker from processed_text, then aggregate per day and ticker.
SELECT d.date, t.ticker,
       COALESCE(COUNT(DISTINCT t.tweet_id), 0) AS tweets,
       COALESCE(AVG(t.classified), 0) AS avg_classified
FROM dates d
LEFT JOIN
    (SELECT DATE(created_at) AS date, tweet_id, classified,
            SUBSTR(processed_text,
                   LOCATE('{TICKER|', processed_text) + 8,
                   LOCATE('}', processed_text, LOCATE('{TICKER|', processed_text))
                     - LOCATE('{TICKER|', processed_text) - 8) AS ticker
     FROM tweets) t
  ON d.date = t.date
GROUP BY d.date, t.ticker
This will put each ticker on its own row, not a column. If you want them moved to columns, you have to pivot the result. How you do this depends on the DBMS. Some have built-in features for creating pivot tables. Others (e.g. MySQL) do not and you have to write tricky code to do it; if you know all the possible values ahead of time, it's not too hard, but if they can change you have to write dynamic SQL in a stored procedure.
See MySQL pivot table for how to do it in MySQL.
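For example, if the set of tickers is known ahead of time, a conditional-aggregation pivot of the daily averages could look like this in MySQL (a sketch building on the query above; only IBM, GOOG and BAC are shown):
-- Sketch: pivot the daily average classified score into one column per ticker.
SELECT d.date,
       COALESCE(AVG(CASE WHEN t.ticker = 'IBM'  THEN t.classified END), 0) AS IBM,
       COALESCE(AVG(CASE WHEN t.ticker = 'GOOG' THEN t.classified END), 0) AS GOOG,
       COALESCE(AVG(CASE WHEN t.ticker = 'BAC'  THEN t.classified END), 0) AS BAC
FROM dates d
LEFT JOIN (SELECT DATE(created_at) AS date, classified,
                  SUBSTR(processed_text,
                         LOCATE('{TICKER|', processed_text) + 8,
                         LOCATE('}', processed_text, LOCATE('{TICKER|', processed_text))
                           - LOCATE('{TICKER|', processed_text) - 8) AS ticker
           FROM tweets) t
  ON d.date = t.date
GROUP BY d.date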