I’ve been promising to write a blog about split testing for ages, so now I’m finally biting the bullet. I’ll admit, I’m not a split testing expert by any means but I’m certainly a convert.
What is split testing?
Forgive me if you’re already an old hand at split testing. Just in case you’re not, I like the definition given on Optimizely that split testing (also sometimes called ‘A/B testing’) “is a simple way to test changes to your page against the current design and determine which ones produce positive results. It is a method to validate that any new design or change to an element on your webpage is improving your conversion rate before you make that change to your site code”.
Let’s look at some examples of things you might split test.
Perhaps you want to create a stronger call to action on a landing page on your website. In design A, you use a green ‘Proceed to secure checkout’ button and in design B, you use the same button but in red. The only variable is the colour. The position, button design, page headline, etc. all otherwise remain the same. You then email your mailing list, giving 50% of people a link to design A and 50% a link to design B. Which one gets the most click throughs or purchases? A split test of this nature would help you make an informed decision about the colour of your call to action button.
As another example, you might be in two minds about the email subject for your latest newsletter. What will keep people reading and ensure the best engagement? You could write two different subject lines and send one to 10% of your mailing list and the other to another 10%. Your open rates should help you build a picture of the kind of subject lines to which your customers will respond. You can then send the winning subject line to the remaining 80% and should, in theory, see higher open rates.
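If you ever want to do this split yourself rather than let your email tool handle it, the carve-up is simple to script. Here’s a minimal sketch in Python; the function name, the example addresses and the 10% fraction are my own choices for illustration, not part of any particular email platform:

```python
import random

def split_for_test(subscribers, test_fraction=0.1, seed=42):
    """Shuffle the list and carve out two equal test groups,
    leaving the remainder to receive the winning subject line."""
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # fixed seed so the split is reproducible
    n = int(len(pool) * test_fraction)
    group_a = pool[:n]           # gets subject line A
    group_b = pool[n:2 * n]      # gets subject line B
    remainder = pool[2 * n:]     # later gets whichever line wins
    return group_a, group_b, remainder

# Example with a 1,000-address list: a 100 / 100 / 800 split
addresses = [f"user{i}@example.com" for i in range(1000)]
a, b, rest = split_for_test(addresses)
print(len(a), len(b), len(rest))  # 100 100 800
```

Shuffling before slicing matters: mailing lists are often sorted by sign-up date, so taking the first 10% unshuffled would test only your oldest subscribers.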
My first split test
Back in January, I decided to do my first ever newsletter split test. I thought long and hard about what to test and decided to go for sending time. I scheduled my newsletter to go out to 50% of my mailing list at its usual time of 8.45am on Tuesday (group A) and to go to the remaining 50% at 10am (group B) to see if people are more likely to read a newsletter if it comes in after they’ve had a bit of time to get settled at their desks.
The report showed that although one more recipient from group A opened the newsletter than from group B, group B showed far higher levels of engagement, with almost double the total number of opens and three times the unique clicks.
It was food for thought. Had I been sending out my newsletter too early in the day? Perhaps people are more likely to spend time interacting with the content of a newsletter if they’re not wading through their inbox for urgent emails needing their attention first thing in the morning.
(As an aside, that newsletter got one of my highest open rates to date by asking the question, “Did your headshot survive the Google Authorship shake-up?”)
The following week, I did a second split test using the same times to check whether my campaign saw the same results. This time it was hard to pick between the two groups.
Group A (8.45am) was successfully delivered to one more person than group B (10am) and opened by one more person. Both groups had the same number of click throughs, but group A accounted for two more total opens of the campaign. It was hard to separate the groups and declare a winner.
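When two groups are this close, it can help to put a number on “too close to call”. A standard way is a two-proportion z-test on the open rates. Below is a minimal sketch using only Python’s standard library; the counts in the example are invented for illustration, not my actual newsletter figures:

```python
from math import sqrt, erf

def two_proportion_z(opens_a, n_a, opens_b, n_b):
    """Two-sided z-test for a difference between two open rates."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)                 # pooled open rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))     # standard error
    z = (p_a - p_b) / se
    # Normal CDF via the error function, doubled for a two-sided test
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented counts: 28 opens out of 60 sends vs 27 opens out of 60 sends
z, p = two_proportion_z(28, 60, 27, 60)
print(z, p)  # a large p-value here means no statistically clear winner
```

With small groups like a typical newsletter split, differences of one or two opens will almost never be significant, which is a good reason to rerun a test over several weeks, as I did.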
I therefore ran the same split test the following week (newsletter 48: Frazzled and overwhelmed? These 7 productivity tips have saved my sanity). This time, group A (8.45am) was the hands-down winner with a 40.7% open rate against 27.8% for group B. There was also a 20.4% click through rate for group A against 11.1% for group B. On average, people spent five minutes on my website as a result of clicking through to the site and I received a higher than usual volume of enquiries about work.
Testing subject lines
I decided to stick with my 8.45am sending time for a while, given the strong results of the last split test. I didn’t see the overall picture as conclusive, though; it was something worth revisiting later.
My next step was to try two different email subject lines.
For newsletter 49, 10% of my mailing list was sent an email with the subject line: ‘Is jealousy making it hard to do business?’ (group A), while a further 10% was sent: ‘Ever fallen foul of the green-eyed monster?’ (group B). A significant 45.5% of recipients from group A opened the newsletter, against 27.3% from group B. However, far more people from group B clicked through to my website to read the full article.
Now, the best-laid plans often go awry, and I realised I’d put a broken link into the newsletter. I then had to send the campaign out again with the correct link. This time 81.8% of recipients in group A opened the newsletter and 27% clicked through to my website.
Group A was the clear winner in both split tests, so the remaining 80% of my mailing list received a newsletter entitled: ‘Is jealousy making it hard to do business?’
I had hoped group A would win this split test. The headline was stronger, tapping into a feeling most of us experience at one time or another. It was also clear that the article related to how jealousy can affect our businesses. Group B’s subject line was just too vague.
A question of timing
A couple of weeks ago, I was reading an article about inbox management. It stated that people are more likely to read newsletters in their lunch breaks, whereas they will begin their day by scanning their inboxes and deleting anything that doesn’t require their attention.
I decided another split test was in order. This time, I scheduled my newsletter to go out at 8.45am (group A) and 12.15pm (group B). The results were interesting. Group A had the higher open rate of 40.4% against 31.6% for group B but group B were far more likely to click through to my website and read the main article in full.
Last week, I ran the same split test. This time, group B accounted for more opens and clicks through to my website, which would suggest that people are more likely to interact with a newsletter during their lunch break.
My split testing conclusions
My journey into split testing is far from done and, even with the few tests I’ve run so far, I’ve got some thinking to do. Sending times are top of the list, and I plan to run a few more tests comparing mornings to midday.
You may also find yourself receiving this newsletter on a different day of the week as I run some split tests on days.
Equally, I will continue to split test subject lines and may segment my mailing list so that I can split test design features within the newsletter, such as whether people are more likely to click through to an article via a link that simply says ‘Read more’ or via one that clearly states the benefit of clicking through, e.g. ‘Click here for my five top tips for beating procrastination’.
It’s clear, looking at the stats for all my newsletter reports, that subject line questions that tap into fears or promise solutions to a problem are likely to achieve high open rates. I will keep testing this though.
These split tests have also encouraged me to think about what has more value in terms of my newsletter. Is it the open rate or the actual click throughs to my website? On the one hand, open rates are important as they show that there’s still a good level of interest and interaction, but click throughs to my website encourage potential customers to go deeper into the site and find out more about my services.
I’d love to hear your thoughts on split testing. Have you run any split tests? Did you get clear results? Did it make you change what you were doing? What do you think has more value when it comes to your newsletter – opens or click throughs?
Why not leave your thoughts in the Comments section below?