
How to measure page load performance as part of your experiences

Capture page load performance data and report on it as part of your experiences.

Written by Optimize Team

Webtrends Optimize lets you capture all manner of data, and page load performance is one of those options. This document describes where the data comes from, what it means, how to capture it, and what data you'll get back out in the Optimize reports.

Be aware that this approach does not work well with Single Page Applications (SPAs). An SPA loads the page once and then changes the content dynamically, so standard performance metrics like domInteractive won't fire again for internal page views. There is currently no meaningful way to measure traditional page load metrics in SPAs using this method.

Available performance metrics

Although it is officially deprecated, most browsers still support the Performance Timing API (window.performance.timing), a means of capturing page load performance metrics. For these browsers, we can capture that data and compare any differences observed between your variations.

[Browser support table: data courtesy of caniuse.com]

This API comes with useful metrics such as:

  • domInteractive: When the page is ready for user interaction.

  • domComplete: When most elements have finished loading (excluding lazy-loaded content).

  • loadEventEnd: When all scripts and resources have fully loaded.
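If you want to sanity-check these values before wiring anything up, you can read them straight from the browser console once the page has finished loading. A minimal sketch, assuming the deprecated window.performance.timing object is available:

// Run after the load event - the values are 0 until their events fire
var t = window.performance.timing;
var start = t.requestStart;
console.log({
    domInteractive: t.domInteractive - start, // ms until the DOM was interactive
    domComplete: t.domComplete - start,       // ms until the DOM finished loading
    loadEventEnd: t.loadEventEnd - start      // ms until the load event completed
});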

We can capture these into Custom Data for your experiment. More info on tracking custom data can be found here.

Capturing performance metrics into Custom Data

To record these metrics, add the following script to your post-render code or wherever appropriate in your variation.

This script:

  • Waits for the page to finish loading.

  • Captures load times relative to the request start.

  • Sends them as custom conversion data.

!function () {
    if (!window.performance || !window.performance.timing) return; // not available for this user

    var testAlias = "ta_something"; // replace with your test alias

    var check = setInterval(function () {
        if (!window.performance.timing.loadEventEnd) return; // page is still loading
        clearInterval(check);

        // All timings are reported relative to the start of the request
        var start = window.performance.timing.requestStart;

        // Page has loaded, capture the event
        WT.click({
            testAlias: testAlias,
            conversionPoint: "performance_info",
            data: {
                domInteractive: window.performance.timing.domInteractive - start,
                domComplete: window.performance.timing.domComplete - start,
                loadEventEnd: window.performance.timing.loadEventEnd - start
            }
        });
    }, 500); // poll every 500ms until the load event has completed
}();
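Because window.performance.timing is deprecated, newer browsers expose the same data through the Navigation Timing Level 2 API. The sketch below shows an equivalent capture under that assumption; it reuses the WT.click call and conversion point from the script above, and the entry's timestamps are already relative to the start of the navigation:

!function () {
    if (!window.performance || !window.performance.getEntriesByType) return; // API not available

    var check = setInterval(function () {
        var nav = window.performance.getEntriesByType("navigation")[0];
        if (!nav || !nav.loadEventEnd) return; // page is still loading
        clearInterval(check);

        // Subtract requestStart so the figures match the script above
        var start = nav.requestStart;
        WT.click({
            testAlias: "ta_something", // replace with your test alias
            conversionPoint: "performance_info",
            data: {
                domInteractive: Math.round(nav.domInteractive - start),
                domComplete: Math.round(nav.domComplete - start),
                loadEventEnd: Math.round(nav.loadEventEnd - start)
            }
        });
    }, 500);
}();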

The data we hope to send will be similar to the following (all values in milliseconds, measured from the start of the request; the figures are illustrative):
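data: {
    domInteractive: 1842,
    domComplete: 3210,
    loadEventEnd: 3275
}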

Add Custom Data Fields

Alongside the code, you will need to tell the platform to expect these metrics. You can do this as part of completing the 'Create' form; more info on how to do this is found here.

This will unlock the Non-binomial / Continuous Metric reporting that will make this data valuable.

If you want to capture this routinely, consider adding it to your global conversions; more info can be found here.

Optimize reports output for performance metrics

In your reports, you will find a panel for Non-binomial metrics. This will include your performance metrics, if set up correctly.

These reports will show:

  • Avg per tested visitor

  • Avg per test view (per time we fired the metric)

  • Means and Medians (note considerations below for studying averages)

  • Uplifts

  • Chance to beat control

  • Significance

Be aware that outcomes are inverted for this data. A reduction in these numbers is a good outcome, and an increase is bad. Please keep this in mind when making sense of the data and deriving conclusions from it.

Considerations

Many factors can affect the numbers we collect.

  1. Test scope affects results: Tests that redirect users to slower pages may show poorer performance even if the experience is good.

  2. Short-term vs long-term: Even if a variant causes slower load in test, it's often a temporary side effect. Once a winner is implemented, it's baked directly into your site — and likely performs better.

  3. Trends matter more than one-offs: A single test might show a dip, but repeated performance drops across experiments suggest it's time to optimize your delivery methods (e.g., favoring CSS changes over heavy JavaScript).
