A/B TESTING: WHY SHOULD YOU DO IT AND WHAT SHOULD YOU TEST?

The best way to strengthen any email marketing campaign is to A/B test it.

A/B testing, also known as split testing, allows you to test different elements of an email marketing campaign against each other and see which works best. But why should you do it, and what should you test?

The Why

Email marketing is a great way to strengthen relationships with existing customers and win repeat business. It is a time- and cost-efficient way to build up your business and retain customers.

At its core is a database of contacts, all of whom have actively consented to join your mailing list. Since some of these contacts have already done business with you, this database is the perfect base for testing.

When an A/B test is executed correctly, it can give you tangible evidence of which marketing strategies work on your audience and which don’t.

Mailchimp analysed 1,720 A/B-tested campaigns, sent to 6,226,331 inboxes (every campaign selected was delivered to over 500 recipients), and compared them against the average email marketing stats across all Mailchimp customers. The results, unsurprisingly, came out in favour of the campaigns that were A/B tested.

They found that the average open rate across all lists was 21.7%, whereas the average open rate for A/B-tested groups was 24.1%. Their further analysis found that the average click rate across all lists was 4.7%, whereas the average click rate for A/B-tested groups was 5.5%. That works out to roughly an 11% relative lift in opens and a 17% relative lift in clicks, a margin that makes a real difference once you translate it into conversions. It is also a reliable way of keeping your customers engaged with your brand.

With this high rate of return, it is strange that this strategy is often overlooked. According to eConsultancy’s Email Marketing Industry Census, only 16% of marketers test email frequently.

What you are looking for and why

It is important to know what you are looking for so that you can choose what to test and judge the results accordingly.

You may be looking to increase visits to your website or your sales figures, or you may simply want to improve customer engagement and how often your emails are opened and read. A/B testing will provide these answers and more by showing you what increases both your open rate and your click-through rate.

  • Open Rate – This tells you how many of your subscribers opened your email. It is calculated by dividing the number of emails opened by the number delivered (emails sent minus emails bounced) and is expressed as a percentage. For example, a 20% open rate would mean that if 105 emails were sent and 5 bounced, 20 were opened. Pay particular attention to the open rate when you are testing things like subject lines and your message preview.

  • Click-Through Rate – This tells you how many of your subscribers clicked a link in the body of your email. It is calculated by dividing the number of subscribers who clicked a link by the number delivered (emails sent minus emails bounced) and is also expressed as a percentage. The click-through rate is important when it comes to testing CTAs, email content, layout and personalisation. (A quick calculation sketch for both metrics follows this list.)
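
If you prefer to check these figures yourself rather than rely on your email platform’s dashboard, the maths is simple. Below is a minimal Python sketch; the function names and the click figure are purely illustrative, while the open-rate example reuses the numbers above.

  def open_rate(opened, sent, bounced):
      # Open rate = opened / delivered, where delivered = sent - bounced.
      delivered = sent - bounced
      return 100 * opened / delivered

  def click_through_rate(clicked, sent, bounced):
      # Click-through rate = clicks / delivered, expressed as a percentage.
      delivered = sent - bounced
      return 100 * clicked / delivered

  # Reusing the example above: 105 sent, 5 bounced, 20 opened -> 20% open rate.
  print(open_rate(opened=20, sent=105, bounced=5))            # 20.0
  # Illustrative click figure only: 11 clicks out of 100 delivered -> 11% CTR.
  print(click_through_rate(clicked=11, sent=105, bounced=5))  # 11.0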

What to test

  • Subject line – This is the first thing your customer will see and will determine whether your email gets opened or is tossed into the virtual trash can. Should it be short, phrased like a question, contain stats? Different things will work for different types of reader.

  • When to send – The time of day and the day of the week on which you send an email can also affect whether your contacts open it.

  • Call to action – What will make your readers click?

  • Personalisation – Will your audience engage more with a personalised subject line, header or address?

  • The layout of the message – What will make your audience keep reading? (Example: single column vs. two column, or different placement for different elements.)

  • Types of content – Will your audience respond well to image-heavy emails, embedded videos or excerpts of blog posts?

  • Testimonials – Should they be included? Do they add value?

  • Specific offers – For example: “Save 10%” vs. “Get free shipping”

Things to remember

  • Focus on frequent emails – You will get more out of testing emails that are sent out frequently, such as newsletters and client follow-ups, than emails that go out only occasionally, like “Happy Holidays” messages. Emails that are sent frequently give you more scope for testing and for applying the results.

  • Keep it random – To get the best results you need to split your mailing list randomly. Most email marketing software will do this for you, but if you have to do it manually, the easiest way is to download the list as a CSV file and randomly sort it in Excel (or split it with a short script, as sketched after this list).

  • Be bold – One of the issues with A/B testing is that sample size can sometimes be a problem; larger samples give more reliable results. If you have a smaller sample, test variations that differ substantially from each other. For example: plain-text versus HTML emails, or one link versus six.

  • Don’t jump the gun – Don’t analyse your results for 4-5 days. Be sure to let your test fully run its course.
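
For the “Keep it random” point above, here is a minimal Python sketch of a random 50/50 split of an exported mailing list. The file name “subscribers.csv” and its columns are assumptions; adjust them to match your own export.

  import csv
  import random

  # Read the exported mailing list (assumed file name and columns).
  with open("subscribers.csv", newline="") as f:
      subscribers = list(csv.DictReader(f))

  random.shuffle(subscribers)           # randomise the order
  midpoint = len(subscribers) // 2
  group_a = subscribers[:midpoint]      # receives version A
  group_b = subscribers[midpoint:]      # receives version B

  # Write each group back out as its own CSV, ready to upload.
  for filename, group in (("group_a.csv", group_a), ("group_b.csv", group_b)):
      with open(filename, "w", newline="") as f:
          writer = csv.DictWriter(f, fieldnames=subscribers[0].keys())
          writer.writeheader()
          writer.writerows(group)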

A successful A/B test will help to shape your marketing campaigns, improve conversion rates and strengthen your business. For more information about A/B testing and for all of your email marketing needs and advice, contact Marc at Winbox for help.

See how we transformed the fortunes of Wealth West Medical’s email marketing here.