You likely already know how essential an email marketing program is to your business, but how well do you really know your audience? A great way to find out their preferences is to send one version of an email to half your audience and a slightly different version to the other half, then compare the performance of each variant. This is known as A/B testing. It's an essential business tactic because it gives you valuable insight into the type of marketing your audience connects with most, and lets you use that intel to improve your practices for better results going forward. Your tests can vary in complexity: a simple test might pit two subject lines against each other on the same email to see which produces a higher open rate, while a more complex test might compare different email layouts to see which one generates a higher click-through rate (CTR). Not only can A/B testing increase your open rates and clicks, it can also lead to more conversions and overall sales.
Step 1
The first thing you should do when conducting an A/B test is to identify the problem you want to solve (or the hypothesis you want to test), and then select one or more of the elements you feel might be contributing to it. For example, if you’ve noticed that your CTRs are below the industry average, there could be multiple contributing factors, such as unclear calls to action (CTAs), too few clickable buttons, or poor button placement.
Step 2
Once you’ve identified the elements you want to test, make a plan to test them one by one. It’s critical to test only one element at a time: if you try to conduct multiple tests at once, it will be impossible to tell which element is making the impact. When you test a single element, the metrics you’re measuring point clearly to a winner.
Step 3
Now that you have your plan in place, it’s time to determine your ideal sample size. We usually recommend an even audience split. You also need to give your test enough time to reach your audience multiple times in order to test your hypothesis effectively: a one-time result could be a complete fluke and is unlikely to give you any valuable information. We recommend a minimum of three to four email sends before treating any A/B test result as reliable, usable intel you can apply to future sends.
However, in special circumstances or if you’re pressed for time (for example, you’re promoting a big sale but can’t decide between two taglines or two main graphics), some email platforms let you choose a fractional test audience for a campaign: the test goes out to, say, 20% of your list and runs for a set amount of time, and once the winning variant has been determined, the system sends that variant to the remainder of your list.
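To make the mechanics concrete, here’s a minimal Python sketch of that kind of fractional split (the function name, seed, and email addresses are made up for illustration; it’s not any particular platform’s implementation): a small slice of the list is divided evenly between the two variants, and the rest is held back for whichever variant wins.

```python
import random

def fractional_split(recipients, test_fraction=0.2, seed=7):
    """Split a mailing list into variant A testers, variant B testers,
    and a holdout that will later receive the winning variant.

    test_fraction=0.2 mirrors the 20% example above; with 1.0 you get
    a plain 50/50 split of the whole list."""
    shuffled = recipients[:]                          # copy so the original order is untouched
    random.Random(seed).shuffle(shuffled)             # seeded shuffle for a repeatable split
    test_size = int(len(shuffled) * test_fraction)
    variant_a = shuffled[: test_size // 2]            # half of the test slice sees version A
    variant_b = shuffled[test_size // 2 : test_size]  # the other half sees version B
    holdout = shuffled[test_size:]                    # everyone else waits for the winner
    return variant_a, variant_b, holdout

# Hypothetical usage: test on 20% of a 1,000-person list, send the winner to the other 80% later.
a_group, b_group, rest = fractional_split([f"subscriber{i}@example.com" for i in range(1000)])
```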
When determining your ideal sample size for A/B testing, consider the following factors: total list size, desired accuracy, and desired level of confidence (how certain you want to be that the same result would hold if the test were repeated; higher confidence levels require a larger sample size). The larger the sample size, the more statistically significant the result. We recommend using a sample size calculator (Survey Planet has a great one here) to determine your ideal sample size based on your list size.
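If you’re curious about the math behind those calculators, here’s a rough sketch using the standard sample-size formula with a finite-population correction. The function name and example numbers are just for illustration; an online calculator will give you the same ballpark figure.

```python
import math

# z-scores for the most common confidence levels
Z_SCORES = {90: 1.645, 95: 1.960, 99: 2.576}

def recommended_sample_size(list_size, confidence=95, margin_of_error=0.05):
    """Estimate how many recipients each test needs, using the standard
    sample-size formula (p = 0.5 for the most conservative estimate) plus a
    finite-population correction for smaller lists."""
    z = Z_SCORES[confidence]
    p = 0.5                                               # worst-case variability
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)  # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / list_size)                   # correct for your actual list size
    return math.ceil(n)

print(recommended_sample_size(10_000))  # ≈ 370 recipients at 95% confidence, ±5% margin
```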
#1. Email Subject Line
If you’ve already read our blog post about writing effective email subject lines, then you already know that the subject line is the first hurdle you have to clear in email marketing (and if you haven’t, you can read it here [insert link]). If your email never gets opened, it really doesn’t matter what’s inside. So aside from following the tips outlined in our email subject line blog post, you can learn what works best for your specific audience by testing one subject line against another. For example, you could test a more direct subject line against one injected with some humour or trendy slang. Or you could test one subject line with emojis and one without. The possibilities are endless!
#2. Send Time
Another factor that can affect whether or not your email even gets opened is send time, so figuring out when your audience is most receptive to your emails is essential, and should be one of the first A/B tests you conduct. One major component to pay attention to when scheduling your sends is whether you’re sending based on one specific time zone (meaning the time your email gets delivered will differ depending on where the recipient lives) or on local time (meaning that if you want a morning send and set it for 9 am local time, each recipient receives the email at 9 am, no matter where they live). You want to catch your audience when they’re most likely to be checking their inbox, but if you’re sending at the same time as every other mailing list they subscribe to, your email might end up getting lost in the crush. It may take several rounds of testing to get it right, but it will make all the difference to your open rates (and ultimately your conversions). Timing is everything!
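As a rough illustration of the local-time option, this small sketch (with a hypothetical function name and arbitrary time zone choices) shows how a single “9 am local time” setting translates into different dispatch moments for recipients in different time zones.

```python
from datetime import date, datetime, timezone
from zoneinfo import ZoneInfo

def local_send_as_utc(send_date: date, hour: int, recipient_timezone: str) -> datetime:
    """Return the UTC moment at which to dispatch an email so it arrives at the
    same local wall-clock hour for a recipient in the given time zone."""
    local_time = datetime(send_date.year, send_date.month, send_date.day,
                          hour, tzinfo=ZoneInfo(recipient_timezone))
    return local_time.astimezone(timezone.utc)

# A "9 am local time" send maps to different UTC dispatch times per recipient.
for tz in ("America/Vancouver", "America/Toronto", "Europe/London"):
    print(tz, local_send_as_utc(date(2024, 6, 3), 9, tz))
```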
#3. Copy Style
Once you’ve gotten a good handle on your subject lines and on getting your audience to actually open your emails, try playing around with the content. You could run different promotions simultaneously to see which one ultimately yields a higher conversion rate, or simply change up your language or tone from one version to the other. Maybe this whole time you’ve been keeping your email copy on the shorter side when your customers really prefer longer emails, or perhaps they respond better to more images rather than more text. Whatever you decide to test, don’t forget about the golden rule!
#4. Colour
Colour has a significant impact on our emotions. There have been many psychological studies about the use of colour and the responses certain colours elicit in the brain (this one by HubSpot, for example). But an easier way to figure out which colours your audience responds to best is to test them. While you should always work within your brand’s colour palette, try testing different colour combinations. For example, you could test high-contrast colours against a more monochromatic colour scheme, or try darker text on a lighter background against the inverse. It’s also important to be mindful of accessibility standards (such as using a legible font size and high enough contrast between text and background colours), and of how your email will look when viewed in dark mode.
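On the accessibility point, the contrast between text and background is something you can actually measure. Here’s a short sketch of the WCAG 2 contrast-ratio calculation (the function names are my own); accessibility guidelines generally recommend at least 4.5:1 for normal body text.

```python
def relative_luminance(rgb):
    """Relative luminance of an sRGB colour, per the WCAG 2 definition."""
    def linearise(channel):
        c = channel / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(foreground, background):
    """Contrast ratio between two colours, from 1:1 (identical) up to 21:1."""
    lighter, darker = sorted((relative_luminance(foreground),
                              relative_luminance(background)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio((51, 51, 51), (255, 255, 255)), 1))  # dark grey on white ≈ 12.6
```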
#5. Layout/Structure
The way your emails are visually laid out can also greatly impact their effectiveness. For example, if your email contains an offer but it isn’t mentioned until the very end, your reader may never make it that far and will miss it entirely. So it might be worth your while to create different layouts that play around with where certain elements are placed in relation to one another. Your audience might even have a significant preference over something as simple as left-aligned versus centre-aligned text. You might be surprised at how changing the placement of visual elements, and how they interact, can affect an email’s effectiveness (and measurable analytics such as click-through rate).
Ultimately, A/B testing lets you get to know your audience better so you can tailor your emails to specific segments and encourage engagement using tactics proven by your own audience. For your testing to be effective, make sure you’re tracking the data your tests collect and reviewing it to understand WHY one version worked better than the other.
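And if you want to go a step beyond eyeballing the numbers, a quick significance check can tell you whether the gap between two variants is bigger than random noise would explain. Here’s a minimal sketch of a two-proportion z-test on click counts (the figures are invented):

```python
import math

def two_proportion_z(clicks_a, sends_a, clicks_b, sends_b):
    """z-statistic for the difference in click rates between variants A and B."""
    rate_a = clicks_a / sends_a
    rate_b = clicks_b / sends_b
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)  # combined click rate
    std_error = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    return (rate_a - rate_b) / std_error

# Invented example: variant A got 120 clicks from 2,000 sends, variant B got 90 from 2,000.
z = two_proportion_z(clicks_a=120, sends_a=2000, clicks_b=90, sends_b=2000)
print(abs(z) > 1.96)  # True means the difference is significant at the 95% level
```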