In this post, we’re going to take an honest look at A/B testing: when it’s important and when it’s not worth the bother, which tools to use, and how to design experiments that produce results you can actually use.
There are a million great posts on A/B testing out there. We’ve included our favourites in the ‘further reading’ section at the end.
What is A/B testing?
Simply put, A/B testing is when you create two variations of a web page, email, ad, etc., and show variation A to one group of participants and variation B to another. Think of a scientific experiment where the scientist gives one group of participants a medicine and the other group (the control) a placebo. The scientist is essentially running an A/B test to see whether the medicine produces significantly more relief of symptoms than the placebo.
Of course, you can have more than two variations, in which case, you’d typically call the experiment an A/B/n test or a split test, but for the sake of this post we’re going to keep it simple. In this post, when we say A/B test, we’re going to be referring to testing two variations of a webpage in order to compare the results.
Things to consider before A/B testing
The thing a lot of posts on A/B testing ‘forget’ to mention is that the first thing you should do is consider whether you actually need to do an A/B test. It’s easy to get caught up in the benefits of data-driven product development, but sometimes, this means we forget to trust our instinct and experience. We’ll cover this in a bit more detail in the “is it worth it?” section of this post.
Which takes me to the second point: can you afford it? There’s no point setting up an A/B test if you don’t have much traffic. You’d have to leave it running for months to get a statistically significant result (i.e. a result you can trust isn’t down to pure chance), which defeats the point of testing and rapid development in the first place. If you don’t have masses of traffic, you can always use paid ads (Google, Facebook, Twitter, etc.) to drive traffic, but it’s going to cost you.
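To get a feel for how much traffic “enough” actually is, you can sketch the numbers with the standard two-proportion sample size formula. This is a rough, back-of-the-envelope calculation (the conversion rate and lift below are made up for illustration), not a substitute for a proper power calculator:

```javascript
// Rough sample size per variant for detecting a lift in conversion rate,
// using the standard two-proportion normal approximation. The z-scores are
// hardcoded for a 5% significance level (two-sided) and 80% power.
function sampleSizePerVariant(baseline, relativeLift) {
  const zAlpha = 1.96; // two-sided, alpha = 0.05
  const zBeta = 0.84;  // power = 0.80
  const p1 = baseline;
  const p2 = baseline * (1 + relativeLift);
  const pBar = (p1 + p2) / 2;
  const a = zAlpha * Math.sqrt(2 * pBar * (1 - pBar));
  const b = zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2));
  return Math.ceil(((a + b) ** 2) / ((p1 - p2) ** 2));
}

// e.g. a 3% baseline conversion rate and a hoped-for 20% relative lift:
console.log(sampleSizePerVariant(0.03, 0.20));
```

With those (hypothetical) numbers you’d need roughly 14,000 visitors *per variant*, which is why low-traffic sites struggle to run meaningful tests.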
Thirdly, have you done your research? It’s tempting to come up with a random idea (e.g. let’s change the header to something else!) without considering the deeper logic behind that idea (e.g. if we change the header to this, it will create a sense of urgency to buy our product and we’ll get more clicks on our sign up button). Take time to develop your tests based on research, experience and your goals.
Finally, remember what you’re testing. If you’re trying to get more people to create an account, an uplift in pricing pageviews doesn’t mean your test was successful. If you don’t figure out what to test and what success looks like beforehand, your results will be muddled and you’ll end up chasing a different goal than the one you set out to reach. Trust me, you don’t want to go down that rabbit hole.
What tools to use
We’ve always used Optimizely for A/B testing, and we’ve always been big fans, but recently Optimizely has moved into the enterprise market and priced itself out of reach for most startups.
In response to this (or more likely the driving force behind it), Google has recently launched Optimize, a free Optimizely-style A/B testing tool.
While it’s free, we have noticed a couple of annoying things about Google Optimize:
- It’s buggy – it’s new, so bugs are to be expected, but bear in mind there’s no support for the free version, so if you get stuck, you’re going to have to rely on kind people in Google support forums to help you. That being said, Optimizely’s support can be slow, especially for UK users who want to speak to a support rep in working hours (Optimizely’s support team are based in the US).
- You need events and goals set up in Google Analytics – tools like Optimizely will track simple click goals for you as standard, but if you’re using Google Optimize, you’ll need to set up these goals yourself in Google Analytics. If you use Tag Manager, this should be pretty straightforward. If not, you may need a developer to help you get these goals set up.
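As a sketch of what that developer work might look like, here’s a hypothetical click goal sent via gtag.js. The event name `sign_up_click` and the button selector `#sign-up` are placeholders, not anything Google Optimize requires; once the event is flowing into Google Analytics, you can turn it into a goal there and use it as an experiment objective:

```javascript
// Hypothetical click goal for Google Analytics. With gtag.js loaded on the
// page, clicking the sign-up button sends a 'sign_up_click' event, which can
// then be configured as a goal in GA and an objective in Optimize.
function trackSignUpClick() {
  if (typeof gtag === 'function') { // no-op when gtag.js isn't loaded
    gtag('event', 'sign_up_click', { event_category: 'ab_test' });
  }
}

if (typeof document !== 'undefined') { // guard so this also runs outside a browser
  const button = document.querySelector('#sign-up'); // placeholder selector
  if (button) button.addEventListener('click', trackSignUpClick);
}
```

If you’re using Tag Manager instead, you’d set up the same click trigger and GA event tag in its UI rather than in code.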
There are other options that are worth looking into and with new tools popping up all the time, it’s certainly worth signing up for some free trials to find the right tools to suit you. You can also run A/B tests manually (i.e. create variations from scratch, send paid traffic to each variation) but it’s probably best to get dedicated tools to do the heavy lifting for you.
Running an A/B test
You’ll find loads of good posts on running A/B tests in the Further Reading section, so in this post, I’m just going to focus on some of the key gotchas to watch out for.
When designing an A/B test, you need to think big.
When researching A/B tests, you’ll see a lot of examples where the only difference between variations is the colour of a button, or a background image. This is fine for companies like Facebook and Amazon who have a practically unlimited budget and millions of users, but for most companies, it’s not worth the cost, effort or time. A 0.01% uplift in ‘buy now’ clicks could have a big impact for Amazon, but it probably won’t for you.
Instead, we find the most effective A/B tests are those that compare radically different variations. If your current homepage is all about product and features, create a variation that’s about lifestyle and benefits. Or you could try creating a landing page speaking directly to a specific subset of users and see if those people are more likely to buy from that page.
Remember that there’s no definitive right or wrong for your website. Sure, A/B tests are more scientific than trusting your gut, but that doesn’t mean the results are always accurate (watch out for: false positives, statistical significance, possible external variations, bugs and plain old coincidence).
What works now won’t necessarily work next year, next month or even next week, so you can’t just take the results of an A/B test as gospel.
So… is it even worth it?
Whether or not A/B testing is worth the effort depends entirely on your circumstances but, done right, you can learn so much about your product and your audience that you’d probably miss out on otherwise.
The only way an A/B test will work is if you:
- Have enough traffic (whether it’s organic or you pay for it is up to you)
- Do your research, be considered and think big
- Know exactly what success looks like (e.g. success = an x% uplift in ‘buy now’ button clicks)
- Understand how to interpret your results (think about statistical significance, external factors and don’t get distracted from your original hypotheses)
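On the last point, a minimal sketch of what “statistical significance” means in practice is a two-proportion z-test on the conversion counts from each variation (the counts below are made up for illustration; real tools do this for you, along with corrections this sketch omits):

```javascript
// Two-proportion z-test: is variation B's conversion rate genuinely different
// from A's, or plausibly just chance? |z| > 1.96 corresponds to p < 0.05
// (two-sided) under the normal approximation.
function zScore(convA, totalA, convB, totalB) {
  const pA = convA / totalA;
  const pB = convB / totalB;
  const pPool = (convA + convB) / (totalA + totalB); // pooled rate under H0
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / totalA + 1 / totalB));
  return (pB - pA) / se;
}

// Hypothetical results: A converts 300/10,000 (3.0%), B converts 360/10,000 (3.6%)
const z = zScore(300, 10000, 360, 10000);
console.log(Math.abs(z) > 1.96 ? 'significant' : 'not significant');
```

Even a “significant” result like this can still be a false positive (roughly 1 in 20 at this threshold), which is why peeking at results early and stopping as soon as you see significance will mislead you.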
TLDR: If you’re going to do A/B testing, do it right.
Further reading
- A/B testing mastery: from beginner to pro in one post – ConversionXL
- A/B testing for beginners: 70 resources to get you started – Quick Sprout
- The ultimate A/B testing guide: everything you need to know, all in one place – Conversion Sciences
- 12 A/B Split testing mistakes I see businesses make all the time – ConversionXL
- Most of your A/B test results are illusory – Kissmetrics
Get in touch with Simpleweb to grow your startup today.