Both your visitors and Google prefer a fast site. Speed improves your conversion rate, but AB testing tools may actually slow your site down.
This study shows how your site's performance is affected by your (future) AB testing tool. Click here if you want to go straight to the results.
- 1 Background
- 2 Disclaimer and transparency
- 3 Methodology summary
- 4 Results: Top 9 fastest AB testing tools
- 5 What to do with these results?
1.1 Data shows faster sites convert better
Visitors have a patience threshold. They come to find something, and unnecessary waiting makes them want to leave. Just like in any shop.
For you that means: a slower site means less revenue.
In the graph above you can see the relation (green line) between load time and conversion rate for one of OrangeValley’s customers: one second slower meant 25% fewer conversions.
The relation between web performance and conversion has been documented in many studies (e.g. #1, #2, #3, #4, #5). Google is also pushing for website owners to improve their site speed in their effort to improve the web experience.
1.2 The AB test case that lost us 12 points on Google Page Speed
Simyo (telco company) reached a 99/100 Google Page Speed score for their homepage. Implementing an AB testing tool according to the vendor’s best practice made the score drop 12 points.
Here is what happened to the start render time when we implemented an AB testing tool using the vendor’s recommended implementation method.
Implementing an AB testing tool can slow down your site too much. (tweet this)
1.3 The way AB testing tools can make your site slower
Simply put, most AB testing tools create an additional step in loading and rendering a web page.
That is why you can sometimes see elements change on a page while it is still loading.
Brian Massey, conversionsciences.com:
The flash draws attention to the element you’re testing, meaning that visitors who see your test will focus on that element. If the treatment wins, it will be rolled out to the site, eliminating the flash. The increase seen in the test can be completely erased. So, this is a pretty important issue.
But that is not the only problem. The presence of such scripts can block the rendering of parts of your page, slowing it down even more.
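As a rough illustration of this extra step, the trade-off can be sketched as simple arithmetic: a synchronous (render-blocking) testing script must be downloaded and executed before the browser may paint, while an asynchronous script does not delay the paint but risks showing the original content first (the flicker). All numbers below are hypothetical, not measurements from this study.

```python
# Toy model of how a synchronous AB testing script delays start render.
# All millisecond values are hypothetical illustrations.

def start_render_ms(base_render=900, script_download=250, script_execute=80,
                    synchronous=True):
    """Estimated start-render time in milliseconds.

    With a synchronous (render-blocking) script, download and execution
    happen before the browser may paint. With an async script the paint
    is not blocked, but the variation may be applied after the original
    content is already visible (flicker).
    """
    if synchronous:
        return base_render + script_download + script_execute
    return base_render  # paint not delayed; flicker risk instead

blocking = start_render_ms(synchronous=True)       # 1230 ms
non_blocking = start_render_ms(synchronous=False)  # 900 ms
print(blocking - non_blocking)  # render delay caused by blocking: 330 ms
```

The model ignores caching, connection reuse and parallel downloads, but it captures why vendors’ synchronous snippets hurt start render while asynchronous ones trade that delay for flicker.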
Plenty has been written on this topic. Here are two good sources that go in-depth.
- 11 ways to stop FOOC’ing your AB tests by Alhan Keser (Widerfunnel)
- Server-side vs. client-side AB testing tools: What’s the difference? by Alex Birkett (ConversionXL)
2 Disclaimer and transparency
Changing an image and some text are part of many AB tests, so it makes sense to use these simple changes as the basis of this study. However, a tool’s performance will of course partly depend on the page you are testing and the type of experiment executed. The goal of this research was to compare the tools when these factors are exactly the same. Other situations may lead to different results, so do get data on your specific situation. If your tool has no negative effect at all, then you should probably focus on optimizing the performance of your site itself first.
Moreover, vendors continuously work on their performance. Both Convert.com and Marketizator informed us that they made significant changes to their setup to improve on this topic: Convert.com did so before this study was conducted, Marketizator just after. We hope that by reading this study, agencies and end users become aware of the importance and impact this topic can have on their site visitors.
At OrangeValley we choose the tools we use on a case-by-case basis when we can, or we use the tools the client is already using if that is best for them. As a result we mostly use Optimizely, SiteSpect (located in the same building we are in) and VWO for AB testing, and sometimes other tools. In late 2015 we were having flicker issues with a client-side tool. Optimizing code and testing different implementations did not work, and a back-end tool was not an option. Since we did not have a plan C, we decided to find out which client-side AB testing tool is best when it comes to experienced loading time. The study took a lot of our time and is worth sharing, so we are publishing it here.
3 Methodology summary
In this study we compared 9 different AB testing tools on the loading time experienced by your website visitors. Our goal was to determine which alternative tools we should be looking at to minimize any negative effect.
Monetate and Adobe Test & Target are also known tools but were not included. We invite them to participate in any follow-up on this study.
We used Speedcurve and our private instance of Webpagetest.org to conduct loading tests during a full week. These are widely used platforms amongst web performance experts.
All experiments were set up the same:
- same hosting server (Amsterdam)
- same location and speed settings (15Mbps down, 3Mbps up, 28 ms latency)
- same variation
- implementation according to vendor’s knowledge base recommendations
- background image uploaded to the tool when this option was available
- 80+ test runs per tool during a week to decrease the effect of potential outliers (in hindsight all tools showed stable measurements)
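The 80+ runs per tool were collected precisely to dampen outliers. As a minimal sketch of how such repeated runs can be summarized (the use of the median here is illustrative, and the metric names and values are hypothetical):

```python
import statistics

def summarize_runs(runs):
    """Summarize repeated loading-test runs for one tool.

    `runs` is a list of dicts with the metrics tracked per run,
    e.g. {"start_render": 1.2, "speed_index": 2.3} (seconds).
    Taking the median reduces the influence of outliers across 80+ runs.
    """
    metrics = runs[0].keys()
    return {m: statistics.median(r[m] for r in runs) for m in metrics}

# Hypothetical runs for one tool; the third run is an outlier.
runs = [
    {"start_render": 1.1, "speed_index": 2.0},
    {"start_render": 1.3, "speed_index": 2.4},
    {"start_render": 5.0, "speed_index": 7.5},
]
print(summarize_runs(runs))  # {'start_render': 1.3, 'speed_index': 2.4}
```

A mean over the same runs would be dragged up by the 5.0 s outlier; the median stays representative of a typical visit.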
How we measured experienced loading time in an AB testing context
What we really wanted to measure was when the specific elements relevant to visitors are loaded, including any changes applied by the AB testing tools, as this directly impacts the loading time experienced by users.
Our challenge was that we needed to measure beyond the loading of the original elements: first, when the elements were loaded, and second, when the client-side script (used by 8 out of 9 vendors) changed those elements in the AB test. In some cases the final text was directly visible; in other cases the original text was shown and then changed on-screen to the final text (the flicker effect).
In the end we automated the process of analyzing the filmstrips in order to measure these on-screen changes for the user. This way we could measure when the variation text “How FAST is …[AB tool]?” and the background were applied to the page.
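The core idea behind that filmstrip automation can be sketched as follows. This is a simplified illustration, not our actual tooling: real filmstrips are screenshots, here each frame is reduced to a flat tuple of "pixels", and the function, region and values are all hypothetical.

```python
def first_frame_with_change(frames, region, expected):
    """Return the index of the first filmstrip frame in which `region`
    matches the `expected` variation pixels.

    `frames` is a list of flat pixel tuples (one per frame); `region` is
    a slice covering the element under test (headline or background).
    Returns None if the variation never appears on screen.
    """
    for i, frame in enumerate(frames):
        if frame[region] == expected:
            return i
    return None

# Hypothetical 4-frame filmstrip, 6 "pixels" each.
orig = (0, 0, 0)  # original headline pixels
var = (1, 1, 1)   # variation headline pixels
frames = [
    (9, 9, 9, 0, 0, 0),   # still mostly blank
    (9, 9, 9) + orig,     # original headline rendered first (flicker)
    (9, 9, 9) + var,      # variation applied on screen
    (9, 9, 9) + var,
]
print(first_frame_with_change(frames, slice(3, 6), var))  # 2
```

Multiplying the returned frame index by the filmstrip interval gives the moment the visitor actually saw the variation, which is exactly what start render or Speed Index alone cannot tell you.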
Originally we intended to use metrics such as Speed Index, start render and visually complete to compare the tools. However, when we looked at the render filmstrips and videos it became clear that these metrics were not really representative of what your site visitors would experience. In fact, one of the tools allows the page to start rendering very early, but what the visitor ultimately sees in the browser turned out to appear much later.
Speedcurve does have a great solution for measuring experienced loading time using custom metrics. Essentially, you add scripts to the elements on your page that are most relevant to the user entering the page: elements above the fold, such as your value proposition and important call-to-actions.
In this case we needed to measure the changing text and background that were part of this experiment. Although we could measure the original headline with custom metrics, doing the same with the variant headline turned out to be more challenging.
Comparison in graph
5 What to do with these results?
With clear performance differences between AB testing tools, it is wise to choose carefully. Lower performance can lead to lower conversion rates and reduced test reliability, and a drop in page speed won’t do your SEO position any good.
Don’t panic: get data on your own site using simple tools such as Google Page Speed and Webpagetest.org, and ensure that the tool you are using works properly on your site. If not, you can try to improve your setup. Brian Massey from conversionsciences.com described a great way to reduce the flicker effect by using CSS to apply changes instead of jQuery: the changes can then already be applied before the element is displayed. Of course, you still need a tool that is fast enough for your particular page.
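If you want to track your score over time rather than check it by hand, the PageSpeed Insights API can be queried from a small script. This is a sketch: the endpoint and JSON field names follow the public PageSpeed Insights v5 API as we understand it, so verify them against Google’s documentation before relying on this.

```python
import json
import urllib.parse
import urllib.request

# Public PageSpeed Insights v5 endpoint (verify against Google's docs).
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_pagespeed(url):
    """Fetch a PageSpeed Insights result for `url` (live network call)."""
    query = urllib.parse.urlencode({"url": url, "category": "performance"})
    with urllib.request.urlopen(f"{PSI_ENDPOINT}?{query}") as resp:
        return json.load(resp)

def performance_score(result):
    """Extract the 0-100 performance score from a PSI v5 JSON result."""
    score = result["lighthouseResult"]["categories"]["performance"]["score"]
    return round(score * 100)  # API reports the score as 0..1

# Example with a stubbed response; a real run would call fetch_pagespeed():
sample = {"lighthouseResult": {"categories": {"performance": {"score": 0.87}}}}
print(performance_score(sample))  # 87
```

Running such a check before and after enabling your testing tool (and after each vendor update) tells you quickly whether your setup has regressed.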
Doing this research took us quite some time, so we hope it is useful for you. If so feel free to share this post on Twitter. Thank you!