GA4 vs. WTO - Data Discrepancies

Understanding why data doesn't match between these two platforms.


In-test comparisons

The scenarios in this document all assume that you've built a segment in GA for people who've fallen into a specific experiment or variation and are comparing differences. This document does not deal with comparing overall site traffic between GA and WTO; it only explores the experiment data and the integration.

Introduction

If you run an experiment or AA test and the traffic numbers in GA4 differ from those in Optimize, this document will provide you with the guidance you need.

We split our decision making into 3 routes:

  1. GA is 15%+ higher than WTO - Jump to this section

  2. WTO is 15%+ higher than GA - Jump to this section

  3. They are within 15% of each other - Jump to this section

We will explore each of these separately.

Before we get started:

This troubleshooting guide comes with a big disclaimer - not our normal practice, but warranted in this situation.

Note: WTO = Webtrends Optimize, GA = Google Analytics


Note: We cannot guarantee the integrity of your GA data, nor are we experts in your web analytics setup. We do put significant effort into understanding why differences occur, but at no point guarantee a solution.

Why 15%?

We feel this is a fair and reasonable number. Between any two analytics platforms, data is often around 20% apart, for numerous reasons.

There are a growing number of valid reasons we're finding GA4 data to differ from WTO. Some of these include:

  • Different definitions of users and sessions

  • Data stitching in GA4

  • Sampling in GA4

  • Differences in device definitions

  • Differences in bot definitions and inclusion in reported-on data

  • Filtering rules in GA4 datasets, e.g. IP exclusion

  • Differences in consent management rules

  • Ad-blockers and spammers blocking one platform but not the other.

For these and other reasons, we feel anything less than 15% off is not worth exploration.

It is also fair to say that GA4 comes with plentiful data issues. We are happy to help explore anything substantive, and while we stand by our data 100%, we recognise numerous flaws in GA4's data, as does much of the industry. We often hear both "GA4 data is awful" and "why doesn't my data match GA4?" - we cannot hold both positions at once. Explorations are very time-consuming, and so we want to be fair both to you and to our team.


Scenario 1: GA is 15%+ higher than WTO

In this rather unlikely situation, you would be observing that despite having a segment on your GA report for people who've seen a particular experience / variation, the numbers in GA are higher than in WTO.

Here are some things you should check in this scenario:

1. What scopes are you analysing by?

It's important to make sure you're comparing the same scope of data. In GA, it's very normal to report on data by Session, whereas in WTO we most typically report by User. Sessions are of course more plentiful than Users (a user can have dozens of sessions), and so we would expect GA to be higher than WTO in this case.

Check your scopes. In WTO, you can toggle them in the settings bar that runs across your report:

Adjusting scope in a Webtrends Optimize report

It's worth noting that while WTO and GA have similar definitions of a session, they are measured differently. You should not expect these counts to be identical even with an identical scope, although they should be similar.

2. When are you sending data to GA?

By default, our integrations only send data to GA after we store the test-view event in WTO. They use a hook similar to:

WT.addEventHandler("pageview", function (data) {
    // send data to GA
});

If you're not using this integration, you may send data into GA before we capture the view in WTO, with the WTO side perhaps delayed by things like Pageview Suspension, Polling, delays/waits, etc.

It is important to be sure that both events are being sent out in the same moment, if we want them to have similar numbers.
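As a minimal sketch of lining the two up, assuming a gtag-based GA4 setup - the event name ("optimize_view") and the fields read from `data` are illustrative assumptions, not our documented payload:

```javascript
// Sketch: fire the GA4 event only once WTO has stored its test-view,
// so both platforms count the same moment. The event name and the
// fields read from "data" are assumptions -- match your own setup.
function wireGa4Integration(wt, gtagFn) {
    wt.addEventHandler("pageview", function (data) {
        gtagFn("event", "optimize_view", {
            project_alias: data && data.projectAlias,
            test_id: data && data.testId
        });
    });
}

// Usage (in the browser): wireGa4Integration(WT, gtag);
```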

3. Understanding event counts

In GA, you can find event counts, which are similar to Totals as a Scope in WTO.

Comparing these two numbers is important. GA does some creative things with sampling/estimation and user stitching - if you can see that we've sent a similar number of events into GA but the user counts are still quite far off, it points clearly to how users are defined as the issue.

4. Collect and compare GA Client IDs

If you capture GA's Client ID (found in the _ga cookie) as Custom Data in Optimize, recorded against every pageview that we have, you'd be able to extract this data and compare it.

You're looking to see if, for any given user in Optimize, multiple GA Client IDs are recorded. If so, it strongly suggests that users are doing something like:

  • Clearing GA cookies

  • Blocking GA cookies on a regular basis and they routinely re-set

  • Going through some pages where the GA cookies drop off

Specifically, it points to issues with the GA setup.

To achieve this, you would need to:

  1. Add the value of the _ga cookie to the WTO datalayer before a test-view event is captured. We suggest only storing the client ID and not the full cookie value.

  2. Make sure a Custom Data field is created for your test for this data to flow into. Use the field-type Text.

Once data has collected over a few days, you'll be able to extract both the WTO User ID and the GA Client ID, and compare them to see if for a given WTO User, there are multiple GA Client IDs.
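A minimal sketch of step 1, assuming the standard _ga cookie format (the client ID is the final two dot-separated segments of the cookie value); where the value ends up in your WTO datalayer depends on your own Custom Data setup:

```javascript
// Sketch: pull just the client ID out of the _ga cookie.
// _ga values look like "GA1.1.123456789.1700000000"; the client ID is
// the final two dot-separated segments.
function extractGaClientId(cookieString) {
    var match = /(?:^|;\s*)_ga=([^;]+)/.exec(cookieString || "");
    if (!match) return null;
    var parts = match[1].split(".");
    if (parts.length < 2) return null;
    return parts.slice(-2).join(".");
}

// In the browser, capture it before the test-view event fires and
// push it into your WTO Custom Data field (Text type).
if (typeof document !== "undefined") {
    var gaClientId = extractGaClientId(document.cookie);
    // e.g. write gaClientId into your WTO datalayer here
}
```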

5. Send wto_uid in the integration

We have found it quite useful, in narrowing down which customers have a problem, to send the Webtrends Optimize User ID to Google Analytics in a custom dimension. By doing so, you can look in GA4 or BigQuery and pull a list of the exact Webtrends Optimize users who've performed the action you want to compare, e.g. "purchase".

In the past, when doing this, we've found that for a given event GA4 is missing some users and WTO is missing others. Looking back at their journeys, you can then study any commonality in who they are, e.g. on a specific browser/device type, or what they've done, e.g. purchased a specific type of product or gone down a similar journey.

Whilst time-consuming, it's important to keep in mind that for most users, things track perfectly fine. It's quite often a single journey or scenario that's broken, particularly when mirroring events from GTM to WTO. Because of this, finding the common scenarios is crucial to fixing the problem.

To send the UID in the integration, you can adapt it in this manner:

var wto_uid = WT.optimize.lookup(projectAlias).params.systemUID;

// GTM example:
window.dataLayer.push({
    'event': 'optimize_view',
    'project_alias': `{{eventCategory}}`,
    'test_id': `{{eventAction}}`,
    'experiment_id': `{{eventLabel}}`,
    'wto_uid': wto_uid
});

Once captured, you can pull this data out in both the Explore report and BigQuery, which is incredibly useful both for filling in gaps and for studying journeys.
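Once you've exported the user-ID lists from each side, the cross-check itself is simple. A sketch, assuming plain arrays of IDs pulled from WTO and from GA4/BigQuery:

```javascript
// Sketch: given user-ID lists exported from each platform, find who
// is missing from each side, ready for journey analysis.
function compareUserIds(wtoIds, gaIds) {
    var gaSet = new Set(gaIds);
    var wtoSet = new Set(wtoIds);
    return {
        missingFromGa: wtoIds.filter(function (id) { return !gaSet.has(id); }),
        missingFromWto: gaIds.filter(function (id) { return !wtoSet.has(id); })
    };
}
```

You can then look up the journeys of the missing users in WTO and hunt for commonality in device, browser, or path.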

6. Are you analysing by the right values in GA?

We've found customers sometimes analyse by Project Alias (sometimes referred to as test alias - the ta_ value). If you are, it's worth noting that cloned tests and iterations will all roll-up into the same bucket.

This is less of a concern if you're analysing within a boxed timeframe, and only while a single test was live, but most people analyse data from "all time" and so catch users they don't mean to.

If this is the case, we recommend analysing in GA by TEST ID instead of Project Alias, which will be unique to that specific iteration of that test, and should give you far clearer and comparative numbers.


Scenario 2: WTO is 15%+ higher than GA

So you're seeing more data in WTO than GA. There are many reasons this could happen - we will discuss them here and various ways to verify the problem and solve them.

Note: Many of the suggestions discussed here require further work, tracking and a few days for data to come in. It is important to have fair expectations walking into this problem - it's not an easy one to solve.

Note: In principle, this is unlikely to be a problem with WTO. We do not manufacture data, and so having a more complete picture of what's going on is not a criticism of the Optimize platform. But with that said, we will still help you understand why there is a difference.

1. Exclusions in your GA data

A common reason to see less data in GA than WTO is because you have exclusions set up in GA that block some data from being shown.

To see if you have any filters set up in GA, go to Admin > View > Filters in Universal Analytics, or Admin > Data Settings > Data Filters in GA4:

If you see any filters in here, whether excludes or includes, you will need to make sure the data you view in Optimize has similar filters.

It's very normal to exclude office IP addresses and internal monitoring tools in GA, but forget to set these in Optimize.

2. Running a redirect test?

Sending data to GA typically happens just after WTO accepts a test-view event. Redirect (split) tests work by redirecting the user just after we accept a test-view. If GA does not count the user before the page bounces away, it might not track. This would result in lower counts in GA than WTO.

One solution to this is to delay the redirect further, hoping it counts the user in time. This comes at the expense of user experience though, and is not recommended.

The other solution is to send data to GA on the redirected-page, which won't unload quickly and so there's plenty of time to send the data.
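One way to sketch the second approach, assuming an absolute destination URL - the `wto_redirect` parameter name is a hypothetical placeholder, not a built-in:

```javascript
// Sketch: tag the redirect destination so the landing page knows to
// fire the GA event there, where the page won't unload immediately.
// The "wto_redirect" parameter name is an illustrative assumption.
function tagRedirectUrl(destination, experimentId) {
    var url = new URL(destination);
    url.searchParams.set("wto_redirect", experimentId);
    return url.toString();
}

// On the redirected-to page: read the marker back (pass
// window.location.search in the browser) and send the GA event if set.
function readRedirectMarker(search) {
    return new URLSearchParams(search).get("wto_redirect");
}
```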

3. Large amounts of customers blocking GA

The challenge with using GA as your "source of truth" is that it's very well documented how to block it. All adblockers are able to do this, and anyone who is scraping, spamming or automating access to your website probably wants to fly under the radar and so will likely do this too.

WTO, on the other hand, underwent a domain change in 2018 from webtrends.com to webtrends-optimize.com, and as part of that we dropped off most people's radar for adblocking.

It is possible to detect when data is sent to GA - we would suggest you capture successful sends as a metric in WTO to gauge how often blocking occurs.

You can, for example, monkey-patch/proxy navigator.sendBeacon, read the URLs that come through, and if they contain the URL for data going into GA, you can capture a metric in WTO.
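A sketch of that proxy, assuming GA hits are dispatched to google-analytics.com hosts, with the WTO metric capture left as a callback you supply:

```javascript
// Sketch: wrap navigator.sendBeacon so every hit dispatched to GA can
// be mirrored as a metric in WTO. Run this before GA loads.
function patchSendBeacon(nav, onGaSend) {
    if (!nav || typeof nav.sendBeacon !== "function") return false;
    var original = nav.sendBeacon.bind(nav);
    nav.sendBeacon = function (url, data) {
        var accepted = original(url, data);
        // google-analytics.com hosts collect both UA and GA4 hits
        if (accepted && /google-analytics\.com/.test(String(url))) {
            try { onGaSend(url); } catch (e) { /* never break tracking */ }
        }
        return accepted;
    };
    return true;
}

// Usage (in the browser):
// patchSendBeacon(navigator, function (url) { /* capture a WTO metric */ });
```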

You may feel this doesn't happen, but we've surprised many customers with the scale of this problem on their website in the past - in Travel in particular, but other industries to a reasonable extent too.

4. Check device types and browsers

We next recommend that you analyse discrepancies by device and by browser. Problems are hard to spot in aggregated data - we've found in the past that a certain browser was not sending data to GA correctly, a problem diluted by every other browser working just fine, but easy to spot when studied in isolation.

This could lead to aspects of your integration being identified as not compatible with certain browsers, e.g. using let or const in your integration code, whilst having a higher than usual share of older browsers that don't support ES6.

From this analysis, you might find something like:

  • Chrome: within 10%

  • Firefox: within 10%

  • Mobile Safari: 60% off

Identifying this helps focus your investigation towards a specific browser instead of the website as a whole.
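The comparison itself is simple arithmetic. A sketch, taking user counts per browser from each platform (the numbers below are illustrative):

```javascript
// Sketch: percentage by which the WTO count exceeds the GA count, per
// browser, to spot which segment drives the overall gap.
function discrepancyByBrowser(wtoCounts, gaCounts) {
    var out = {};
    Object.keys(wtoCounts).forEach(function (browser) {
        var wto = wtoCounts[browser];
        var ga = gaCounts[browser] || 0;
        // one-decimal-place percentage difference, relative to WTO
        out[browser] = Math.round(((wto - ga) / wto) * 1000) / 10;
    });
    return out;
}

// Example: { Chrome: 5, Firefox: 8, "Mobile Safari": 60 } would point
// the investigation squarely at Mobile Safari.
```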


Scenario 3: They are within 15% of each other

Example of what you might be seeing to qualify for this scenario:

  • Data in WTO:

    • Control: 1000 users

    • Variation: 995 users

  • Data in GA:

    • Control: 920 users

    • Variation: 900 users

We do not believe this scenario indicates a specific problem. GA4 and WTO calculate users and sessions differently, track metrics differently, and deal with bot traffic differently, and GA4 also has known data sampling issues that can put its data 20% out, whereas WTO does not sample.
For this reason, if you see data within 15% across the two platforms, you should understand that this is normal, expected, and nothing of concern.

For the same reason, we will not support any investigations of data discrepancies that are within this range. Any investigation is extremely unlikely to uncover anything that needs changing.
