The Three Golden Rules Of A/B Testing

Did You Know?

The basic principles of A/B testing were established in the 1920s, when statistician Ronald Fisher developed randomised controlled experiments for agricultural field trials.

In the world of digital marketing, A/B testing is an essential tool for continually improving on the results you have already achieved. Whether you’re testing different Google Ads variants, trialling a new email layout or simply deciding which Call To Action works best for the button on your landing page, it’s important to keep producing new iterations to evolve your campaigns.

The idea of an A/B test is simple in theory – you pit two different versions of something against each other and see which one performs best. However, there are a few golden rules that you should follow when running tests like this to ensure you get the most out of your experiments. After all, A/B testing can be quite time-consuming, so it’s important that you can be confident in your results.
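Being confident in your results usually comes down to statistical significance: did variant B really beat variant A, or could the gap be random noise? A common way to check is a two-proportion z-test. The sketch below uses only Python's standard library, and the visitor and conversion numbers are purely illustrative, not figures from any real campaign.

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is the difference in conversion rate
    between variants A and B statistically significant?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical numbers: 5,000 visitors per variant
p_a, p_b, z, p = ab_significance(conv_a=150, n_a=5000, conv_b=195, n_b=5000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p:.4f}")
```

A p-value below 0.05 is the conventional threshold for calling a winner; with these made-up numbers the lift from 3.0% to 3.9% clears it, but a smaller sample with the same rates might not.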

Examples Of Recent Tests At Reach Digital

Google Ads Campaign Type Test

When Google introduced Performance Max campaigns in 2021, they were hailed as an all-singing, all-dancing option to cover all possible bases within Google Ads. However, with less control given to the user and fewer data points offered in the results, it’s not a given that this is the best type of campaign for all clients.

With this in mind, we recently ran tests between standard Search campaigns and Performance Max campaigns to see which brings the best results. For e-commerce websites, it’s also worth testing Shopping campaigns against Performance Max. Our test was close enough that we extended it, but it ultimately showed that PMax was the better choice this time.

Landing Page Test

Landing page tests are one of the things we experiment with most often. This is usually because there are so many elements to consider and creating the right combination can make a big difference to Conversion Rate.

For our most recent test, we wanted to see what effect moving the contact form higher up the page would have on the number of leads. We launched an experiment within Google Ads, sending 50% of the traffic to the original page and 50% to the new version. In this instance, after a month of testing, the variant with the form nearer the top won out.
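How long a test like this needs to run depends on how many visitors each variant must receive before a realistic lift becomes detectable. A rough rule of thumb can be sketched with the standard sample-size formula for comparing two proportions; the baseline rate and target lift below are assumptions for illustration, not figures from the test described above.

```python
import math

def sample_size_per_variant(base_rate, lift):
    """Rough visitors needed per variant to detect a relative lift in
    conversion rate (two-sided test, normal approximation)."""
    p1 = base_rate
    p2 = base_rate * (1 + lift)
    z_alpha = 1.96   # two-sided test at the 5% significance level
    z_beta = 0.84    # 80% statistical power
    var = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * var / (p2 - p1) ** 2
    return math.ceil(n)

# Hypothetical: 3% baseline conversion rate, detecting a 20% relative lift
n = sample_size_per_variant(base_rate=0.03, lift=0.20)
print(f"~{n} visitors needed per variant")
```

The smaller the lift you want to detect, the more traffic you need, which is why close tests, like the PMax one above, often have to be extended rather than called early.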

Meta Audience Test

When running Meta Ad campaigns, there are many different audience options to scrutinise. Meta is increasingly pushing advertisers towards its Advantage+ set-up options, but it’s important not to take it for granted that this will be the best choice.

Advantage+ lets Meta decide the audience targeting itself, using machine learning to pinpoint the users most likely to respond to your ad. However, other audience types may deliver better results. For new campaigns and accounts, we always like to test the Advantage+ audience against an interest-based alternative and an audience built from a combination of customer data and lookalikes. In this particular test, we ran the latter against a Facebook-controlled audience for three weeks and established that customer data was the better targeting option this time.

The world of A/B testing is a vast one, and every situation requires its own set of considerations and success indicators. However, if you remember the golden rules above and take a flexible approach each time, the insights you gain will have a powerful effect on your digital marketing.


About the author

Chris Mayhew has extensive experience across a wide range of digital marketing practices, focussing predominantly on paid media. By implementing iterative A/B tests, he optimises Google and Meta Ad campaigns to maximise results.
