
Amit Varshneya at Hexaware Technologies won a Marketing Visionary Markie Award in 2010 - here's why:


New leads are often the most coveted prize, but repeat business and cross-selling to existing clients are an overlooked source of revenue. Hexaware, a rapidly expanding IT and process outsourcing services company, was generating most of its repeat business from strong service and delivery referrals, but struggled to manage those referrals without a centralized database or CRM system.


Quickly progressing from “non-existent” marketing to implementing automation software from Eloqua in 2007, Hexaware centralized its database to efficiently integrate with a CRM system. The company went from having little to no market intelligence to having the ability to record and track all online (web and email) and offline (events) touch points.


Check out the rest of the article at DemandGenReport.

From Anne Holland’s fun and informative blog “Which Test Won?”, Sonic Foundry (an Eloqua customer) tested a long versus short subject line for an upcoming webinar event. I won’t spoil it by telling you if A or B won, but the version that outperformed did so in stunning fashion: It boosted the open rate by 58.9%, CTR by 224%, and registration conversions by 279% – with just a one-word difference between the two subject lines.


Also highlighted in the article is that Sonic Foundry tests the subject lines of every webinar invite they send. I love that they’ve made testing a regular part of their campaign deployment process: making a habit of testing takes away the “scary” factor and encourages your organization to continuously improve.


A/B testing can be an amazingly powerful tool to help grow your email engagement and conversions. Even the best marketers make incorrect assumptions about what appeals to their target audiences. You can see this from the “Which Test Won?” poll results – for each test, there’s a quick quiz that lets you decide which test version you think did better. If you think you may be missing opportunities with your emails, A/B testing is a great way to prove (or disprove) your theory.


How A/B testing works

A/B testing is generally executed by doing the following:

  1. Pick one element to test. Create two emails or two landing pages – set up one version, then duplicate it and tweak only your test element.
  2. Carve off a random 5-10% of your target list. Pull enough contacts that you’ll feel confident that the results of your test are statistically significant.
  3. Divide that smaller list into a random 50/50 split. Send version A to one half, and version B to the other.
  4. Wait a bit to collect responses (generally 2 business days is sufficient). Look at the engagement reports to identify the winning version.
  5. Send the winner to the remaining 90-95% of your list.
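The random-sampling steps above can be sketched in a few lines of code. This is a minimal illustration, not anything Eloqua-specific: `ab_split` is a hypothetical helper name, and the 10% test fraction is just the midpoint of the 5-10% guideline.

```python
import random

def ab_split(contacts, test_fraction=0.10, seed=42):
    """Carve off a random test sample and split it 50/50 into A and B.

    Returns (group_a, group_b, holdout): group_a gets version A,
    group_b gets version B, and the holdout (the remaining 90-95%)
    receives the winning version later.
    """
    rng = random.Random(seed)        # fixed seed so the draw is repeatable
    shuffled = contacts[:]           # copy, so the original list is untouched
    rng.shuffle(shuffled)

    test_size = int(len(shuffled) * test_fraction)
    test_sample = shuffled[:test_size]
    holdout = shuffled[test_size:]

    midpoint = test_size // 2        # random 50/50 split of the sample
    return test_sample[:midpoint], test_sample[midpoint:], holdout

# Example with a 1,000-contact list: 50 get A, 50 get B, 900 wait for the winner.
group_a, group_b, holdout = ab_split([f"contact{i}" for i in range(1000)])
```

Because the shuffle happens before any slicing, both the test sample and its A/B halves are random draws from the full list, which is what makes the comparison fair.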


What should I test?

If you’re just starting out with A/B testing, we recommend testing just one element. You may have heard of “multivariate testing” (testing multiple elements across multiple messages), but this can be complex to analyze for newbies. Here are some ideas for what to test:

  • Email subject line content, style and length
  • Body copy length or tone
  • Link placement or style
  • Plain-text versus highly graphical design
  • Landing page form placement or length
  • Form field or button labels


Produce one version of your email or landing page and then decide what element you want to test. Subject line testing is a good place to start.


But I don’t have time for testing!

I often hear from marketers that they haven’t tried testing because they don’t have time. Admittedly, you’ll need to add a little extra time to your production calendar to wait for the results of the test. You may also need to get buy-in from your peers and managers to start testing. But once you commit to the discipline of testing, as Sonic Foundry has, you can start seeing amazing results. Not every test will give you a lift in engagement like Sonic Foundry’s; in fact, for some of your tests, there may be no discernible difference between version A’s and version B’s results. But the possibility is definitely there!


We encourage you to do a pilot test: pick an upcoming email (perhaps for your own next webinar) and move your email production schedule up a few days. Write two versions of your subject line – engage your colleagues by creating your own version of “Which Test Won?” with a quick poll. And test! Share the results with your team, and get them excited about testing on a broader, ongoing scale.


See what an A/B test looks like

Take a look at Sonic Foundry’s emails at “Which Test Won?” and choose the version you think won.


Did you get it right?


Now it’s your turn! Share the results of your A/B tests: What did you test? Were you surprised by the results? What suggestions do you have for other marketers starting their own testing programs?

Ever wonder just how many MQLs you'll need to generate in order to meet the revenue goals of your CEO?


Do you know how many Sales Accepted Leads you’ll need to generate?


These and other questions can be answered in less than a minute using the attached handy dandy Revenue Goal Funnel Calculator. It basically works the funnel backwards for you, so you can see just what it will take to meet your revenue goals at each stage of the funnel.


Simply fill in the fields in yellow and see the results instantly!


To get accurate numbers, you will need the following:


  • The overall revenue goal set by the CEO
  • Your average revenue per deal
  • Your company's conversion rate at each stage of the funnel


Playing with the conversion rates is extremely insightful and makes it easy to pinpoint where in the funnel you should focus your conversion improvements.
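For readers who want to see the arithmetic the calculator performs, here is a rough sketch of working the funnel backwards. The stage names and conversion rates below are illustrative assumptions, not figures from the spreadsheet; the idea is simply to divide each stage's target by the conversion rate into it.

```python
import math

def funnel_targets(revenue_goal, avg_deal_size, conversion_rates):
    """Work the funnel backwards from a revenue goal.

    conversion_rates maps each stage transition to the fraction that
    converts to the next stage, ordered top-of-funnel first, e.g.
    {"MQL -> SAL": 0.5, "SAL -> Opportunity": 0.6, ...}.
    """
    # Closed-won deals needed to hit the revenue goal.
    targets = {"Closed Won": math.ceil(revenue_goal / avg_deal_size)}
    required = targets["Closed Won"]
    # Walk the funnel bottom-up, dividing by each conversion rate.
    for stage, rate in reversed(list(conversion_rates.items())):
        required = math.ceil(required / rate)
        targets[stage.split(" -> ")[0]] = required
    return targets

# Illustrative numbers: $1M goal, $50K average deal.
rates = {"MQL -> SAL": 0.5,
         "SAL -> Opportunity": 0.6,
         "Opportunity -> Closed Won": 0.25}
targets = funnel_targets(1_000_000, 50_000, rates)
```

With these sample rates, a $1M goal at $50K per deal means 20 closed-won deals, which requires 80 opportunities, 134 SALs, and 268 MQLs – and you can see immediately how improving any one rate shrinks the numbers above it.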


Give it a spin! It beats napkin math!


To download the spreadsheet, click on the attachment below my signature.


Steve Kellogg

Eloqua Certified Best Practices Consultant
