Email A/B Testing is a Marketing and Sales Superpower: Here’s How to Use It

Email is the bread and butter of marketing. It’s how we engage with existing customers and leads, and often how we make cold leads aware of our offerings and turn them into opportunities. But how do we know our emails are effective? We A/B test.

In marketing, A/B testing is a science experiment. It compares two variants of marketing material (landing pages, ads, emails, or something else) and measures which one performs better.

To get these results, users are randomly shown one of the variations (A or B), and the conversion rates for the desired outcome are then measured against each other. That outcome could be open rate, click rate, form fills, SEO success, or any other success metric you aim for.

A/B testing is incredibly important for sales and marketing teams. Without it, they could be running mediocre campaigns for years and have no idea. Money could be getting left on the table! That’s why it’s important to Always Be Testing (ABT)...Well, maybe you don’t need another acronym, but you get the idea.

Why is A/B Testing Important for Emails? A Marketer’s Best Friend

Marketers (at least good ones) should love to run experiments and record meaningful results. It’s really the purpose of marketing…to find what works and do more of it. For email campaigns, A/B testing is how we achieve that goal.

For your email marketing strategy, A/B testing is a measurable, scientific method to improve the following in your campaigns:

  • Open rate: The percentage of emails opened out of emails sent
  • Click-through rate: The percentage of sent emails in which the recipient clicked a link
  • Conversion rate: The percentage of emails that result in a successful conversion (e.g., getting a response or scheduling a meeting)

While there are many variables to test (more on that below), these are the three main success metrics you will use in your email A/B testing. You can find what works and what doesn’t for these three important factors through your carefully crafted tests.
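To make these metrics concrete, here's a minimal sketch in Python of how each one is calculated from raw campaign counts. The numbers and function name are hypothetical, and note that some platforms calculate click-through rate against delivered or opened emails rather than sent:

```python
def email_metrics(sent: int, opened: int, clicked: int, converted: int) -> dict:
    """Compute open, click-through, and conversion rates as percentages of emails sent."""
    return {
        "open_rate": round(100 * opened / sent, 1),
        "click_through_rate": round(100 * clicked / sent, 1),
        "conversion_rate": round(100 * converted / sent, 1),
    }

# Example: comparing two variants of the same campaign
variant_a = email_metrics(sent=1000, opened=380, clicked=95, converted=22)
variant_b = email_metrics(sent=1000, opened=310, clicked=88, converted=31)
print(variant_a)  # {'open_rate': 38.0, 'click_through_rate': 9.5, 'conversion_rate': 2.2}
print(variant_b)  # {'open_rate': 31.0, 'click_through_rate': 8.8, 'conversion_rate': 3.1}
```

However your platform defines these metrics, the key is to compare the same metric, computed the same way, across both variants.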

The result will be higher open rates, higher click-through rates, and better conversions. With proper A/B testing, the ROI on your email campaigns will go through the roof (and your sales team will like you a lot more).

A/B Email Testing is an Outbound Salesperson’s Friend, Too

Outside of marketing emails, outbound sales teams can also experiment with A/B testing in cold emails using the same three metrics. The variables may differ slightly from marketing emails, but the benefits are similar.

For example, you may want to experiment with different cold email templates, such as asking for an introduction, referring to recent events, or addressing competitor pain points that your company solves. There is so much to play with here, but testing these templates against each other can lead to huge wins in a tough game.

In the world of sales, successful email outreach hinges on effective email deliverability. Explore the steps to achieve this through proper technical email setup.

How to Choose the Best A/B Testing Elements

For effective A/B email testing, you need to know what to test. When choosing a variable, consider your overall goal. For example, if you want to increase open rates, you might test send time or subject lines. If your goal is to increase conversions, CTAs, images, or layouts are a good place to start.

For most companies, the following variables are the easiest to test and deliver the most impactful, measurable results.

Subject Lines

Try various subject line strategies, such as making your subject a question or a statement, using emojis in the email subject line, creating a sense of urgency, using statistics, being mysterious vs. being direct, etc.

You may be surprised by what works and what doesn't. If you’re struggling with variations, try our email subject line generator; it's powered by ChatGPT and trained on our most effective email subject lines.

Pro Tip: Use sendcheckit.com to test your subject lines before sending them. You’ll get a grade on how well you can expect them to perform, and you’ll be warned if you’re using any words likely to land you in the spam folder.

Personalized Names

You can use an individual’s name as the sender in the email signature or make the email come "from your company" instead. You can also experiment with using the person’s name in the email copy. I usually go with "Hi (name)."

Depending on the context and audience, personalized names will either help or hurt. If it’s obvious the email isn’t coming from an individual, using the subscriber’s name may not make sense. Otherwise, try getting more personal.

Images, Design, and Layout

An interesting test is whether a plain text email performs better than an intricately designed visual email. The simple text email seems like it's from a real person, but the well-designed one is more visually stimulating and may get more engagement.

Within email design, you can also experiment with different images, designs, and layouts, creating an "A, B, and C" version and seeing which one performs best. Again, some emails come off better as a well-designed masterpiece, while others are better as a simple "me-to-you" text that you would send a coworker or friend. See what works for you.

Call-to-action (CTA) Variations

More often than not, you’re going to measure success by how often your CTA button gets clicked. But how do you make sure the CTA invites clicks rather than scares people away? Here, you can play with elements like arrows, buttons vs. hyperlinked text, different fonts and colors for your buttons (ALL CAPS, PERHAPS?), and more.

Preview Text Variations

Often overlooked by marketers, preview text (also called "preheader") appears next to the subject line in the inbox. It’s the second thing most readers will see, and it can make or break your chances of getting clicks. When ignored, preview text defaults to the beginning of your email, but you can include whatever you want.

Try different variations to entice the reader to click. Think of it as an extension of your subject line. Maybe include some stats, pose a question, or hit on the hardest pain point, then see what works.

See how the preview text appears after the subject line in my "promotions" Gmail tab, and note that each of these companies took the time to include preview text.

Email Copy Variations

Of course, you’ll probably want to test different email versions. There are infinite ways to say the same thing with different words, so test different emotions, pain points, and benefits within your emails to see what works best for your audience. Do they respond to hard emotional copy or get turned off? Do hard stats perform the best?

Try a couple of variations of the same email, but don’t make them too different, or you won’t know which change is making the impact. If you’re writing very different emails, consider running them as separate emails in the same campaign instead.

Send Time

Another simple A/B test is to vary the time and day of the week you send your emails. You may find that email open rates are better at 9 am versus 2 pm or that people are most likely to unsubscribe on a Friday for some reason. You can run a simple test on the same email to see what works best.

→ Need some fresh email copy ideas? Check out our amazing, AI-powered cold email generator and get your creative juices flowing.

One Variable at a Time!

Don’t go too crazy when testing; you may get lost in the weeds. When you test many variables, you don’t know which ones make the difference. Instead, focus on one variable at a time. Try the same email with the designs and images vs. text-based, change just the subject line but nothing else, try one with preview text and one without, try the CTA as a button vs. a link, etc.

Don’t do all these things simultaneously because you won’t know what variable makes the difference.

Build on Data and Insights from Previous A/B Tests

Because you’re testing one variable at a time, you’ll want to save all the data from each test and use it to create hypotheses for new tests. Essentially, you build on what you learned from each test to create a rich tapestry of results that will inform your future campaigns.

This helps you get around the "only testing one variable at a time" problem and allows you to see how all variables have worked together in the past. Use your rich library (or spreadsheet) of results to help your writers create more effective email copy and pitch better ideas for future A/B tests.

As we delve into setting up A/B tests for emails, it's important to have the right tools at your disposal. Our AI Email Writer tool can provide a solid starting point, offering customizable email templates optimized for engagement and conversion.

How to Set up an A/B Test for Emails

Now that you’ve figured out some variables to test (one at a time, of course), you’ll want to set up an A/B testing email marketing campaign in your customer relationship management (CRM) system. To do that, follow the simple steps below:

Define Clear Goals and Objectives for Your A/B Test

You should have a solid understanding of what you hope to achieve with this A/B test and a hypothesis for how it will turn out. Your goal could be one of many things, for example, increasing the open rate, increasing CTA clicks, scheduling more demos, or starting more conversations (i.e., response rate).

Write out a clear hypothesis that includes what you hope to achieve, why you think you will achieve it, and a benchmark for measuring success. If you’re measuring open rate, 35 percent is a common benchmark for success. For response rates, 10 percent is considered solid, and anything less than 5 percent usually means you have some work to do.

Create Two or More Variants for the Selected Variable

Once you’ve chosen a variable to test (e.g., preview text, subject lines, or CTA variations), you must make versions A and B. You aren’t limited to just two versions, however. As long as you’re testing the same variable, you can test as many versions as you’d like.

Maybe you’re testing compelling statistics in the subject line, and you have several stats to choose from. You’ll want one email without stats and two or more with them. Here are some examples you could choose from (note: these stats are invented for illustration, not real):

Version A: Learn how email A/B tests help marketers get better results

Version B: Learn how A/B email testing helps marketers increase open rates by 47 percent

Version C: Learn how A/B email testing helps marketers increase response rates by 34 percent

Split testing these three closely related but different subject lines will give you a clear understanding of whether using a stat makes a difference and, if it does, which of the two stats is more compelling. You can use these results for future emails to other segments or campaigns.
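Most email platforms handle the split for you, but if you ever need to do it yourself, here's a minimal sketch in Python of one way to randomly assign contacts to the three variants above (the contact addresses and helper function are hypothetical):

```python
import random

# The three hypothetical subject-line variants from the example above
subject_lines = {
    "A": "Learn how email A/B tests help marketers get better results",
    "B": "Learn how A/B email testing helps marketers increase open rates by 47 percent",
    "C": "Learn how A/B email testing helps marketers increase response rates by 34 percent",
}

def assign_variants(contacts: list, variants: list, seed: int = 42) -> dict:
    """Shuffle contacts, then deal them round-robin into equal-sized variant groups."""
    rng = random.Random(seed)  # a fixed seed makes the assignment reproducible
    shuffled = contacts[:]
    rng.shuffle(shuffled)
    groups = {v: [] for v in variants}
    for i, contact in enumerate(shuffled):
        groups[variants[i % len(variants)]].append(contact)
    return groups

groups = assign_variants(
    ["ana@example.com", "ben@example.com", "eva@example.com", "raj@example.com"],
    list(subject_lines),
)
# groups["A"], groups["B"], and groups["C"] each receive every third contact
```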

Segment Your Email List the Smart Way

When breaking up your email list into segments, you want to ensure that the different segments within your list are as equal as possible. Otherwise, your results could be skewed towards one demographic and not accurate as a whole.

Use all the contact data in your CRM to distribute contacts as evenly as possible. You should have data on your contacts’ company names, industries, seniority levels, job titles, locations, and more. Create lists that evenly distribute your contacts based on these factors.

For example, ensure each segment has roughly the same number of VPs and CEOs. If you’re working in the US, distribute contacts evenly across geographic areas; don’t make an "East Coast" and a "West Coast" list, as regional differences would skew your results.

You might not get this perfect, which is fine, but to keep your results trustworthy, make sure your lists aren’t obviously biased or skewed in one direction or another.
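If you want something more systematic than eyeballing your lists, here's a minimal sketch in Python of a stratified split on a single attribute. The field names are hypothetical; in practice you'd stratify on whichever CRM fields matter most for your audience:

```python
import random
from collections import defaultdict

def stratified_split(contacts: list, key: str, n_groups: int = 2, seed: int = 7) -> list:
    """Split contacts into n_groups while spreading each stratum (e.g., seniority) evenly."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for contact in contacts:
        strata[contact[key]].append(contact)  # bucket contacts by the chosen attribute
    groups = [[] for _ in range(n_groups)]
    for members in strata.values():
        rng.shuffle(members)  # randomize within each stratum, then deal round-robin
        for i, contact in enumerate(members):
            groups[i % n_groups].append(contact)
    return groups

contacts = [
    {"email": "a@example.com", "seniority": "VP"},
    {"email": "b@example.com", "seniority": "VP"},
    {"email": "c@example.com", "seniority": "Manager"},
    {"email": "d@example.com", "seniority": "Manager"},
]
group_a, group_b = stratified_split(contacts, key="seniority")
# Each group ends up with one VP and one Manager
```

Bucketing by stratum first and then dealing contacts out round-robin guarantees each group gets a near-equal share of every stratum, which plain random sampling only approximates.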

Track Your Results

Once your goals, variables, segments, and split tests are set up, it’s time to deploy your A/B test and measure the results. If you’re using the Close CRM email communication tools, you can set up email automation and easily track response rates, open rates, and other key metrics for your email sequences.

Do the results match your hypothesis or end up surprising you in some way? It’s funny how things work out in the real world vs. in our minds, and A/B email testing will likely blow up some of your assumptions.

Lastly, these results will inform future email A/B tests and email copy. For example, if you know that using stats works better in email subject lines, keep doing more of that. Using a spreadsheet or project management tool to track your results is ideal, as you can easily update and look back on them when you need insights and are setting up new A/B tests.
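If a spreadsheet is your system of record, a small helper like this sketch (in Python, with a hypothetical file name, column layout, and numbers) can append each finished test to a running CSV log:

```python
import csv
from datetime import date

# One possible layout for a running results log
COLUMNS = ["date", "variable_tested", "variant", "sent", "opened", "clicked", "converted", "notes"]

def log_ab_result(path: str, row: dict) -> None:
    """Append one row per variant per test so results accumulate over time."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        if f.tell() == 0:  # brand-new file: write the header row first
            writer.writeheader()
        writer.writerow(row)

log_ab_result("ab_tests.csv", {
    "date": date.today().isoformat(), "variable_tested": "subject_line",
    "variant": "A (no stat)", "sent": 500, "opened": 160,
    "clicked": 41, "converted": 9, "notes": "",
})
```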

Best Practices for Email A/B Testing That Ensure Impactful Results

Email testing done poorly is not only a waste of time; it can also lead you away from the cold, hard truth. In other words, an inaccurate test produces results that aren’t real or meaningful, and following them would lead to worse outcomes, not better ones.

To ensure that you always have impactful results from your email A/B tests, follow these simple best practices:

Conduct Tests on a Significant Sample Size

You won't get statistically significant results if you’re only testing emails on a handful of people. Ensure you’re testing on hundreds or thousands of people rather than just a few dozen. The results you get will be much more impactful and useful on a larger scale.
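To see why sample size matters, here's a minimal sketch in Python of a standard two-proportion z-test, the kind of check most A/B testing tools run for you behind the scenes (the counts here are hypothetical):

```python
from math import erf, sqrt

def two_proportion_pvalue(success_a: int, n_a: int, success_b: int, n_b: int) -> float:
    """Two-sided z-test p-value for the difference between two rates (e.g., open rates)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal-approximation p-value

# A 38% vs. 31% open rate looks like a clear win, but with only 100 recipients
# per group it isn't statistically significant (p is roughly 0.30)...
print(two_proportion_pvalue(38, 100, 31, 100))
# ...while the exact same gap with 1,000 per group is (p is roughly 0.001).
print(two_proportion_pvalue(380, 1000, 310, 1000))
```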

Run A/B Tests for a Long Enough Duration

You’ll need to let your email campaigns brew for a bit to capture reliable data. Don’t just run them for one or two days and think you’re done. Let them go for weeks or months and watch the results roll in over time.

Correctly Interpreting Insights from the A/B Test Results

In A/B email testing, it’s easy to fall for the common fallacy that correlation equals causation. Don’t immediately assume that because X happened alongside Y, X caused Y. Look more deeply at your results and verify the relationship before drawing conclusions from them.

Avoid Bias

This relates to "segmenting your list the smart way" above. You need to make sure none of your segments is biased in any way. That means paying close attention to your segmented lists and ensuring that they are as equal as possible.

Use Phone Calls to Bolster Your Results

If you follow up email campaigns with a call, you can use that call to get more data on why they responded (or didn’t respond) to your emails. If they opened it, you could ask, “I’m sure you get hundreds of cold emails daily. I’m curious: why did my cold email reach you? What piqued your interest? Why did you respond to it—what did you like about it?”

This will give you more qualitative results than quantitative, but you can use these anecdotes to fuel new ideas, insights, and strategies that pure numbers can’t provide. Add a "notes" column to your A/B testing results spreadsheet and jot down notes from these conversations for future use.

Email A/B Testing Helps You Learn, Grow, and Get Unstoppable Results

Email A/B testing is like a never-ending science experiment that gradually sharpens your marketing or outbound sales efforts. When valuable outbound sales tools are incorporated, this testing becomes even more impactful, enabling you to continuously enhance your approach.

It’s a true use case of "continuous improvement" that you can build on for greater optimization over time.

Start using email A/B testing as soon as possible to get real, impactful data that can help you grow your business.

→ Need some ideas for your email campaigns? Check out our free Sales Email Sequence Templates and get 47 pre-formatted email sequences you can plug right into your CRM today.
