
Authored by: gaea.connary

From Anne Holland’s fun and informative blog “Which Test Won?”: Sonic Foundry (an Eloqua customer) tested a long versus a short subject line for an upcoming webinar event. I won’t spoil it by telling you whether A or B won, but the winning version outperformed in stunning fashion: it boosted the open rate by 58.9%, click-through rate by 224%, and registration conversions by 279%, all with just a one-word difference between the two subject lines.


The article also highlights that Sonic Foundry tests the subject line of every webinar invite they send. I love that they’ve made testing a regular part of their campaign deployment process: making a habit of testing takes away the “scary” factor and encourages your organization to continuously improve.


A/B testing can be an amazingly powerful tool for growing your email engagement and conversions. Even the best marketers make incorrect assumptions about what appeals to their target audiences. You can see this in the “Which Test Won?” poll results: for each test, a quick quiz lets you guess which version did better. If you think you may be missing opportunities with your emails, A/B testing is a great way to prove (or disprove) your theory.


How A/B testing works

A/B testing is generally executed like this (a rough sketch of the split and evaluation appears after the list):

  1. Pick one element to test. Create two emails or two landing pages: set up one version, then copy it and change only the element you’re testing.
  2. Carve off a random 5-10% of your target list. Pull enough contacts that you’ll feel confident the results of your test are statistically significant.
  3. Divide that smaller list into a random 50/50 split. Send version A to one half and version B to the other.
  4. Wait a bit to collect responses (two business days is generally sufficient), then look at the engagement reports to identify the winning version.
  5. Send the winner to the remaining 90-95% of your list.

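If you like to think in code, here is a minimal sketch of steps 2 through 4 in plain Python. Everything in it is a hypothetical placeholder: the contacts list, the carve_test_pool and two_proportion_z helpers, and the open counts at the bottom are made up for illustration. In practice your marketing platform handles the sends and reports the opens and clicks for you.

  # A minimal sketch of the split-and-evaluate steps, using only Python's standard library.
  import math
  import random

  def carve_test_pool(contacts, fraction=0.10, seed=None):
      """Randomly carve off a fraction of the list and split it 50/50 into A and B groups."""
      rng = random.Random(seed)
      pool_size = max(2, int(len(contacts) * fraction))
      pool = rng.sample(contacts, pool_size)
      half = pool_size // 2
      group_a, group_b = pool[:half], pool[half:]
      pool_set = set(pool)
      remainder = [c for c in contacts if c not in pool_set]
      return group_a, group_b, remainder

  def two_proportion_z(opens_a, sends_a, opens_b, sends_b):
      """Rough two-proportion z-test on open rates; |z| of about 2 or more suggests a real difference."""
      p_a, p_b = opens_a / sends_a, opens_b / sends_b
      p_pool = (opens_a + opens_b) / (sends_a + sends_b)
      se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
      return (p_a - p_b) / se if se else 0.0

  if __name__ == "__main__":
      contacts = ["contact%d@example.com" % i for i in range(10000)]
      group_a, group_b, remainder = carve_test_pool(contacts, fraction=0.10, seed=42)
      # ...send version A to group_a, version B to group_b, wait ~2 business days...
      # Then plug in the open counts your platform reports (the numbers here are made up).
      z = two_proportion_z(opens_a=120, sends_a=len(group_a), opens_b=165, sends_b=len(group_b))
      print("A: %d contacts, B: %d contacts, remainder: %d" % (len(group_a), len(group_b), len(remainder)))
      print("z-score on open-rate difference: %.2f" % z)

Seeding the random split makes the carve-off reproducible, which is handy if you ever need to re-pull the same test groups; and if the z-score comes out near zero, treat the test as a wash rather than declaring a winner.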

What should I test?

If you’re just starting out with A/B testing, we recommend testing just one element. You may have heard of “multivariate testing” (testing multiple elements across multiple messages), but that can be complex for newcomers to analyze. Here are some ideas for what to test:

  • Email subject line content, style and length
  • Body copy length or tone
  • Link placement or style
  • Plain versus highly-graphical
  • Landing page form placement or length
  • Form field or button labels


Produce one version of your email or landing page and then decide what element you want to test. Subject line testing is a good place to start.


But I don’t have time for testing!

I often hear from marketers that they haven’t tried testing because they don’t have time. Admittedly, you’ll need to add a little extra time to your production calendar to wait for the results of the test. You may also need to get buy-in from your peers and managers to start testing. But once you commit to the discipline of testing, as Sonic Foundry has, you can start seeing amazing results. Not every test will deliver a lift in engagement like Sonic Foundry’s; in fact, for some of your tests there may be no discernible difference between version A’s and version B’s results. But the possibility is definitely there!


We encourage you to do a pilot test: pick an upcoming email (perhaps for your own next webinar) and move your email production schedule up by a few days. Write two versions of your subject line, and engage your colleagues by creating your own version of “Which Test Won?” with a quick poll. Then test! Share the results with your team, and get them excited about testing on a broader, ongoing scale.


See what an A/B test looks like

Take a look at Sonic Foundry’s emails and choose the version you think won at “Which Test Won?” by clicking here: http://whichtestwon.com/archives/7353.


Did you get it right?


Now it’s your turn! Share the results of your A/B tests: What did you test? Were you surprised by the results? What suggestions do you have for other marketers starting their own testing programs?
