A/B testing is a well-known concept in customer experience optimization (CXO) and ad optimization. By defining a completely new test method for the Hunkemöller SEO channel, we have proven that SEO A/B testing not only works, but also yields substantial results.
Why SEO A/B testing?
Search engines are an important traffic source for almost all companies and organizations. On the one hand, traffic, and thus search behavior, is constantly changing. On the other hand, every change to the website also has an impact. A data-driven SEO strategy is important to anticipate these changes: it allows changes to be implemented with confidence that they will yield positive results. For an organization like Hunkemöller, making the right decisions is essential given the competitive market and the entry of new competitors. Data-driven SEO, as described later in this article, is an important key to future business success.
The data-driven SEO methodology answers the question:
“What will it bring us?”
Not only in organic traffic and visibility, but also in business metrics such as turnover.
Until recently, it was difficult or impossible to estimate the uplift in organic traffic that a specific change would deliver. With the methodology we developed for SEO A/B testing, we can say with statistical certainty whether a change has a positive impact and what it will yield. This finally enables well-founded decisions about implementing site-wide changes. Our method makes it possible to test changes at a detailed level, in order to find the perfect mix for reaching the target group via the organic channel.
In addition, SEO A/B testing can prevent your organization from wasting time and money on changes that ultimately don’t contribute, and it can show what the result of a deferred change would have been.
How does SEO A/B testing work?
As mentioned, with an SEO A/B test you can validate almost all SEO changes and optimizations before they are implemented site-wide.
When we say almost every SEO optimization, we mean, for example:
- Optimizing content elements, such as headings
- Optimizing metadata, such as page titles or meta descriptions
- Adding content
- Adding structured data
- The impact of client-side-rendered vs. server-side-rendered content
- The impact of Web Vitals optimization (site speed)
A/B testing for SEO operates differently from CXO A/B testing. In both cases we present two different versions of our content, but in an SEO A/B test the control group and the variant are formed by splitting a group of pages that use the same template. This differs from CXO, where a single page is tested with multiple variants (one control and at least one variant).
SEO A/B testing is page oriented instead of audience oriented, in the sense that we test on a section of the site by splitting the pages (for every visitor, including Googlebot) instead of splitting the audience.
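The page-oriented split described above can be sketched in code. This is a minimal, hypothetical illustration (the URLs and the 50/50 split ratio are assumptions, not Hunkemöller's actual setup): hashing each URL makes the assignment deterministic, so every visitor, including Googlebot, always sees the same version of a given page.

```python
import hashlib

def bucket(url: str, variant_share: float = 0.5) -> str:
    """Assign a page to 'control' or 'variant' deterministically by URL hash."""
    digest = hashlib.sha256(url.encode("utf-8")).hexdigest()
    # Map the first 8 hex digits to a number in [0, 1] and compare to the split.
    score = int(digest[:8], 16) / 0xFFFFFFFF
    return "variant" if score < variant_share else "control"

# Hypothetical template group of category pages.
pages = [f"https://www.example.com/category/page-{i}" for i in range(10)]
groups = {url: bucket(url) for url in pages}
```

Because the bucket depends only on the URL, the split survives deployments and caching layers without any per-visitor state.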
The pages are optimized to provide Google Search users with a better result with the aim that Google will appreciate these pages better, resulting in an increase in high quality traffic.
Within the OrangeValley approach we use a prediction model (forecasting). This allows us to isolate and compare the variant and the (modeled) control.
Our approach takes into account external factors such as:
- Google algorithm updates
- Competitor SEO changes
- Other external factors
Fluctuations such as Google algorithm updates affect the control group and the variant equally. During the analysis, we therefore don’t look at the traffic trend itself, but at the performance difference between the modeled control group and the variant. We cannot compare the variant and control directly, because the groups may differ, for example in the number of pages or the absolute number of sessions per group.
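The idea of comparing the variant against a modeled control can be sketched with a deliberately simple model. This is an illustration only, with made-up daily session counts; the actual forecasting model is more sophisticated. The sketch fits a linear relationship between control and variant traffic on the pre-test period, then uses the control group during the test to predict what the variant would have done without the change.

```python
from statistics import mean

def fit_line(x, y):
    """Ordinary least squares fit: y ≈ a + b * x."""
    mx, my = mean(x), mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
        (xi - mx) ** 2 for xi in x
    )
    return my - b * mx, b

# Illustrative pre-test daily sessions (control group, variant group).
pre_control = [1000, 1100, 950, 1050, 1200]
pre_variant = [480, 530, 470, 500, 590]
a, b = fit_line(pre_control, pre_variant)

# During the test, predict the variant from the control, then measure the
# relative difference (uplift) instead of comparing raw traffic trends.
test_control = [1150, 900, 1300]
test_variant = [640, 510, 730]
expected = [a + b * c for c in test_control]
uplift = sum(test_variant) / sum(expected) - 1
```

Because the control group absorbs algorithm updates, seasonality, and other external shocks, any remaining gap between observed and expected variant traffic can be attributed to the change being tested.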
In the following case we prove that SEO A/B testing is a reliable and accurate way to get results with SEO changes.
The test concerns a change of the page title and was eventually carried out on 50% of a selected group of pages on hunkemoller.nl. Where page titles were previously optimized mainly at the keyword level, the test focuses on the intent of the Google user, in order to return a better and more relevant result.
The test started in early April and has been active for a total of 54 days. We looked at the following metrics to assess the test:
- Organic sessions
- Organic clicks (non-branded)
- Organic impressions (non-branded)
- Secondary metrics, such as the number of transactions
Per metric we evaluate three topics to get an idea of the impact of the change:
- Test score (percentage change including confidence interval)
- Significance score (the statistical certainty that the effect is attributable to the change)
- Daily and cumulative additional traffic or visibility (including confidence interval)
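The third evaluation topic, the cumulative additional traffic with a confidence interval, can be sketched as follows. This is a simplified, hypothetical calculation with invented numbers, not the production analysis: the daily gap between observed and modeled traffic is accumulated, and an approximate 95% band is built from the day-to-day noise seen before the test (assuming roughly normal, independent daily residuals, so the noise grows with the square root of the number of days).

```python
from statistics import NormalDist, stdev

def cumulative_uplift(observed, modeled, pre_test_residuals, confidence=0.95):
    """Cumulative extra clicks vs. the modeled control, with a confidence band."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2)  # ≈ 1.96 for 95%
    sigma = stdev(pre_test_residuals)               # daily noise level
    cumulative, band = [], []
    total = 0.0
    for day, (obs, mod) in enumerate(zip(observed, modeled), start=1):
        total += obs - mod                          # daily extra clicks
        cumulative.append(total)
        band.append(z * sigma * day ** 0.5)         # noise widens over time
    return cumulative, band

# Invented example data: observed variant clicks vs. modeled counterfactual.
observed = [540, 560, 610, 605]
modeled = [500, 510, 520, 515]
residuals = [-12, 8, 15, -6, 3, -9]  # pre-test (observed - modeled) per day
cum, ci = cumulative_uplift(observed, modeled, residuals)
```

Plotting `cum` together with `cum - ci` and `cum + ci` gives the three curves described below: once even the lower curve stays above zero, the uplift is statistically significant.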
Then we will make an impact forecast to determine what the results will be when the optimization is applied to 100% of the selected pages.
For this case we zoom in on the result of the organic non-branded clicks and impressions from the Google search results.
Non-branded organic clicks
One of the test metrics is the number of organic clicks that arrive from search engines through searches without the Hunkemöller brand name. The non-branded traffic within SEO is the most important part for most organizations. We therefore look at the result separately from searches related to the brand name.
The image above shows the development of the variant (black line) compared to the forecast (blue lines). The forecast is based on historical data, which we use to predict the expected traffic to the variant.
The lines diverge immediately in the first week. This confirms that traffic to the variant group is outperforming the prediction, and it immediately shows that the test is a success. Ultimately, based on non-branded clicks, we see an uplift in organic traffic of no less than 30%!
We use a set of control pages to give the model context for trends and external influences. If something else changes during the test (e.g. seasonal influences), the model will detect it and take it into account. The graph above shows such a seasonal influence: the good May weather in the Netherlands is driving demand for swimwear.
The graph above shows the cumulative effect of the increase in traffic compared to our model. The shaded blue area represents a 95% confidence interval. When all three curves perform below (negative) or above (positive) the y = 0 axis, the test is statistically significant.
On April 10, we already had 95% certainty that the results were not caused by chance. By the end of the test, the significance score had risen to 99.9%.
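A significance score like this can be illustrated as the probability that the cumulative uplift exceeds zero given the pre-test noise. This is a hypothetical, simplified sketch with invented numbers (assuming roughly normal, independent daily residuals), not the exact statistical model used in the case.

```python
from statistics import NormalDist, stdev

def significance(cumulative_diff, pre_test_residuals, days):
    """Probability that a cumulative uplift this large is not due to chance."""
    # Under the noise-only hypothesis, the cumulative difference is roughly
    # normal with standard deviation growing as sqrt(days).
    sigma = stdev(pre_test_residuals) * days ** 0.5
    return NormalDist(0, sigma).cdf(cumulative_diff)

# Invented example: 270 cumulative extra clicks after 4 test days, given the
# daily (observed - modeled) residuals from the pre-test period.
residuals = [-12, 8, 15, -6, 3, -9]
score = significance(cumulative_diff=270, pre_test_residuals=residuals, days=4)
```

A score of 0.95 corresponds to the 95% certainty mentioned above; as the uplift keeps accumulating faster than the noise band widens, the score climbs toward values like 99.9%.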
Non-branded organic impressions
In addition to clicks, it is also relevant to look at non-branded impressions or organic visibility as a metric.
Following this metric shows that it takes a little longer for the lines to diverge. On the one hand, this has to do with the fact that Hunkemöller has many number-1 positions on searches with many impressions. On the other hand, the positive effect on clicks and focus on a user’s search intent seems to pay off in better positioning and wider visibility. Ultimately, based on non-branded impressions, we see an uplift of 13%.
After two weeks, we also had 95% certainty that the results were positive and significant. We kept the test active for a considerable time afterwards to get a full picture of the impact of the change. At the end of the test, the significance score was 99.8%.
Result of the test
The change of the page title achieved the desired effect: an increase in organic visibility and traffic. Based on all measured metrics, this test was positive and significant. Based on these findings, the change has been rolled out to all selected pages on hunkemoller.nl.
This case confirms that the page title is still a powerful SEO element. Responding to a user’s intent with a relevant page title works when it comes to presenting the best answer to a user.
The outcome of this test led us to subsequently set up similar tests for other countries where Hunkemöller is active. We know from experience that the results of a similar test can differ per site, industry and country. We are also working on a schedule to test other relevant SEO changes in a data-driven manner.
Predicting the impact of an SEO change used to be an almost impossible task. With the introduction of the SEO A/B test method, a proven, significant and reliable way has been found to implement or postpone changes based on a measured outcome. The Hunkemöller case proves that the outcome was not only predictable, but also delivered excellent business results. The innovative courage of the Hunkemöller team, combined with our SEO A/B method, has paved a completely new path to take their SEO to the next level.
Are you curious how you can use SEO A/B Testing for your organization? OrangeValley has the knowledge and experience to support you in this process. We are ready to think along with you!