An introduction to A/B testing: what it is, and why it's useful

May 29, 2020

What is A/B testing?

A/B testing is a methodology that allows you to compare two different states of a webpage or app to determine which performs better. It's an experiment where two or more variants are shown to users at random, and analysis then determines which variant performed better against a conversion goal.

[Image: A/B testing example]

By running an A/B test, you can test hypotheses about your app and measure success against KPIs such as conversion rate and average order value.

How A/B testing works

In A/B testing, you take the initial state of your website or app and create variants on top of that default version. Changes can be simple, like a new headline or a different button colour, or complex, like a redesign of the page or a new checkout flow. Generally, traffic is distributed evenly across the default and the variants, so each user has an equal chance of seeing either the default version of your page or a variant of it.

But maybe you're concerned about putting the variants live. To mitigate risk, you can turn down the weighting of people seeing a variant. You'll still get a comparable data set to analyse, but it will take longer to reach a confident result from the A/B test (more on that later).
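To give a rough idea of how a testing tool splits traffic under the hood, here's a minimal sketch of deterministic, weighted variant assignment. The hashing scheme and weights are illustrative assumptions, not how any particular tool works.

    // Deterministically assign a user to a variant based on a hash of their ID,
    // so the same user sees the same variant on repeat visits.
    type Variant = { name: string; weight: number };

    // A simple string hash (FNV-1a); real tools typically use something similar.
    function hashToUnitInterval(userId: string): number {
      let hash = 2166136261;
      for (let i = 0; i < userId.length; i++) {
        hash ^= userId.charCodeAt(i);
        hash = Math.imul(hash, 16777619);
      }
      // Map the 32-bit hash to [0, 1).
      return (hash >>> 0) / 4294967296;
    }

    function assignVariant(userId: string, variants: Variant[]): string {
      const total = variants.reduce((sum, v) => sum + v.weight, 0);
      let point = hashToUnitInterval(userId) * total;
      for (const v of variants) {
        point -= v.weight;
        if (point < 0) return v.name;
      }
      return variants[variants.length - 1].name;
    }

    // An even 50/50 split...
    assignVariant("user-123", [
      { name: "default", weight: 50 },
      { name: "variant-a", weight: 50 },
    ]);

    // ...or turn the variant's weighting down to reduce risk (90/10).
    assignVariant("user-123", [
      { name: "default", weight: 90 },
      { name: "variant-a", weight: 10 },
    ]);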

As users start to see the A/B test, data will begin to appear in your analytics platform. By reading this data, you'll be able to see whether a test variant has had a positive, negative, or neutral effect on your KPIs.

So why should I A/B test?

One of the main benefits of A/B testing is data-driven decision making. By following a good A/B testing roadmap, you can carefully iterate on the user experience of your site. It also allows you to create hypotheses and discover which elements of your website affect user behaviour. On the flip side, an A/B test can prove your hypothesis wrong by producing a negative result, which is far less harmful than deploying the variant to your entire user base.

Because A/B testing is an iterative process, a detailed roadmap lets you see where KPI uplifts were made along the way, and also what didn't work so well, meaning you can piece together evidence for your design decisions very easily.

Another key benefit of A/B testing is the powerful segmentation you can do on a user base. By plugging your preferred A/B testing tool into your CRM, you can create powerful segments that lead to more targeted campaigns. A good example: you can change the homepage hero image based on the weather in the user's area. If it's raining, show a variant with coats and umbrellas; if it's sunny, show a variant with summer clothes and sunglasses. You could build this natively in code, but doing it through the testing tool saves a lot of development time and creates a more carefully tailored user experience.
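As a sketch of what that weather-based targeting could look like if you did build it in code (getWeatherFor and the variant names are hypothetical placeholders):

    // Pick a homepage hero variant based on the weather in the user's area.
    type Weather = "rain" | "sun" | "other";

    // Hypothetical helper: in practice this would call a weather API or come
    // from your A/B testing tool's targeting conditions.
    async function getWeatherFor(_postcode: string): Promise<Weather> {
      return "rain"; // placeholder
    }

    async function chooseHeroVariant(postcode: string): Promise<string> {
      const weather = await getWeatherFor(postcode);
      switch (weather) {
        case "rain":
          return "hero-coats-and-umbrellas";
        case "sun":
          return "hero-summer-clothes";
        default:
          return "hero-default";
      }
    }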

What if I want to compare multiple elements on a page?

Imagine this scenario: you have a CTA and you want to test the background colour and the text content (i.e., two variables). You want to test three colours (red, green and blue) and three text variants (Submit, Checkout Securely and Continue). To accurately test every possible scenario, including the defaults, you would have to create SIXTEEN variants:

  • Red, Submit
  • Red, Checkout Securely
  • Red, Continue
  • Red, DEFAULT TEXT
  • Blue, Submit
  • Blue, Checkout Securely
  • Blue, Continue
  • Blue, DEFAULT TEXT
  • Green, Submit
  • Green, Checkout Securely
  • Green, Continue
  • Green, DEFAULT TEXT
  • DEFAULT COLOUR, Submit
  • DEFAULT COLOUR, Checkout Securely
  • DEFAULT COLOUR, Continue
  • DEFAULT COLOUR, DEFAULT TEXT

Painful, right? This is where MVT (multivariate) testing comes in. MVT testing covers the same scenario as above, but takes the pain out of building every combination yourself. In your A/B testing tool, set the test up as an MVT test and you're good to go. Now all you have to do is create the variants below.

  • Red
  • Blue
  • Green
  • Submit
  • Checkout Securely
  • Continue

Your A/B testing tool will automatically combine all possible scenarios and tell you the winning combination. One thing to watch out for: each user will still see one of sixteen variants. On low-traffic sites this means it will take a while to gather enough data for analysis, so this approach only works if you have enough traffic to spread across the multitude of variants. Most of the time, an A/B test with multiple variants will cover your use case.
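To see why this saves effort, here's a rough sketch of the combination step the tool does for you behind the scenes (the option lists mirror the example above):

    // Combine every value of each factor (including the default) to get the
    // full set of MVT combinations the tool will test.
    const colours = ["default colour", "red", "green", "blue"];
    const texts = ["default text", "Submit", "Checkout Securely", "Continue"];

    function combine(colourOptions: string[], textOptions: string[]): string[][] {
      const combos: string[][] = [];
      for (const colour of colourOptions) {
        for (const text of textOptions) {
          combos.push([colour, text]);
        }
      }
      return combos;
    }

    const combos = combine(colours, texts);
    console.log(combos.length); // 16 -- traffic has to be spread across all of them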

What is the process of testing?

By following a defined process, you can get a solid foundation on which to create A/B and MVT tests.

Step 1: Do your research

Look at your app and identify where the user experience needs improving, or where you feel your KPIs need to improve. Obvious things to look out for are a drop in conversion rate or a high bounce rate, for example.

Step 2: Come up with success metrics

Success metrics determine how your variants are performing against the default version in your test. For example, some easy goals to consider are button clicks, email sign-ups or purchases.
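In practice, a success metric usually boils down to a named event that the testing tool counts per variant. A minimal sketch (the shape here is an assumption, not any specific tool's API):

    // A success metric is just an event the tool counts per variant.
    type SuccessMetric = {
      name: string;        // e.g. "cta_click", "email_signup", "purchase"
      description: string;
    };

    const metrics: SuccessMetric[] = [
      { name: "cta_click", description: "User clicked the main call to action" },
      { name: "email_signup", description: "User signed up for the newsletter" },
      { name: "purchase", description: "User completed checkout" },
    ];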

Step 3: Generate hypotheses

Once you have success metrics defined, you need to create hypotheses for the tests you want to run, explaining why you think each variant will perform better than the incumbent version. These hypotheses then need to be prioritised by effort and impact.
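One simple way to prioritise is an impact-versus-effort score. A sketch, assuming a 1-5 scale for both (the example hypotheses and scores are placeholders):

    // Prioritise hypotheses by expected impact relative to effort.
    type Hypothesis = {
      statement: string;
      impact: number; // 1 (low) to 5 (high)
      effort: number; // 1 (low) to 5 (high)
    };

    const backlog: Hypothesis[] = [
      { statement: "A shorter checkout form will increase conversion rate", impact: 5, effort: 4 },
      { statement: "A green CTA will get more clicks than the grey one", impact: 2, effort: 1 },
    ];

    // Highest impact-per-effort first.
    const prioritised = [...backlog].sort(
      (a, b) => b.impact / b.effort - a.impact / a.effort
    );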

Step 4: Create the variants

This is where you create the variants that will be tested against the default version of your app. Things to change could be colours, text, image sizes or navigation. Your A/B testing tool should have a visual editor to help with this, but if you have developers handy, you can write code to create a more customised experience.
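If you do go the code route, a variant is often just a small snippet that modifies the page. A sketch (the selector and copy are placeholders for your own page):

    // A variant implemented as a small DOM tweak: change the CTA's colour and text.
    function applyCheckoutCtaVariant(): void {
      const cta = document.querySelector<HTMLButtonElement>("#checkout-cta");
      if (!cta) return; // element not on this page
      cta.style.backgroundColor = "#2e7d32";
      cta.textContent = "Checkout Securely";
    }

    applyCheckoutCtaVariant();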

Step 5: Run the test

Now to launch the test. Once launched, your users will see one of the variants (or the default) defined in the test. As users interact with the test, the relevant data is gathered and crunched behind the scenes in your A/B testing tool.
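Behind the scenes, "gathering data" usually means recording two kinds of events per user: an exposure (which variant they saw) and any conversions. A rough sketch, with a hypothetical track function standing in for your tool's event call:

    // Hypothetical stand-in for your analytics/testing tool's event call.
    function track(event: string, properties: Record<string, string>): void {
      console.log(event, properties);
    }

    function recordExposure(userId: string, experiment: string, variant: string): void {
      track("experiment_exposure", { userId, experiment, variant });
    }

    function recordConversion(userId: string, experiment: string, metric: string): void {
      track("conversion", { userId, experiment, metric });
    }

    // Example: a user sees variant-a, then clicks the CTA.
    recordExposure("user-123", "checkout-cta-test", "variant-a");
    recordConversion("user-123", "checkout-cta-test", "cta_click");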

Step 6: Look at the numbers

When ready, conclude the test. Your A/B testing tool will show how the variants performed against the default version of your app, based on the KPIs you defined in Step 2. It will also let you know if it needs more data; if so, it will recommend turning the test back on and give you a guide for how much longer to run it.
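For a conversion-rate metric, the maths behind a "confident result" is often something like a two-proportion z-test. Here's a rough sketch; your testing tool's actual statistics may differ (some use Bayesian or sequential methods):

    // Compare the conversion rates of default vs variant with a two-proportion z-test.
    function zTest(
      defaultConversions: number, defaultVisitors: number,
      variantConversions: number, variantVisitors: number
    ): number {
      const p1 = defaultConversions / defaultVisitors;
      const p2 = variantConversions / variantVisitors;
      // Pooled conversion rate under the null hypothesis (no real difference).
      const pooled =
        (defaultConversions + variantConversions) / (defaultVisitors + variantVisitors);
      const standardError = Math.sqrt(
        pooled * (1 - pooled) * (1 / defaultVisitors + 1 / variantVisitors)
      );
      return (p2 - p1) / standardError;
    }

    // |z| above roughly 1.96 corresponds to about 95% confidence that the
    // difference is real. Example: 5.0% vs 6.25% conversion.
    const z = zTest(200, 4000, 250, 4000);
    console.log(z.toFixed(2));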

SEO considerations

Google actively encourages you to run tests on your app; however, there are a few SEO considerations to bear in mind when running A/B tests.

  • Don’t be misleading: Don’t use A/B tests as a form of cloaking part of your site to gain an SEO advantage. Google isn’t silly; it will pick up on this and penalise you.
  • Use canonical URLs: If you’re testing different pages with different URLs, make sure you tell Google which is the main page via a rel="canonical" tag. This avoids confusion when Google finds two very similar pages on your site.
  • Use temporary redirects: Using 302 redirects as opposed to 301 redirects will tell Google that any redirecting done as part of your A/B test is only temporary, and not part of the permanent site structure.
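For example, if your test redirects users from the original URL to a variant URL, a temporary redirect might look like this Express-style sketch (the paths are placeholders, and in practice only users bucketed into the variant would be redirected):

    import express from "express";

    const app = express();

    // Send a 302 (temporary) redirect to the variant page, not a 301 (permanent)
    // one, so search engines keep treating the original URL as the indexed page.
    app.get("/checkout", (req, res) => {
      res.redirect(302, "/checkout-variant-b");
    });

    app.listen(3000);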