As part of the Marketing Automation team, our mission is to be “a catalyst for exceptional performance.” We look to apply our expertise wherever we can – outreach emails, landing pages, forms, nurture campaigns, and integration optimization. To help us achieve our mission, we leverage the Analysis/Action/Repeat model presented in the RPM and Data Management courses.
But when our company merged with another, we had to prove our methodology to a new culture that had a very different approach to marketing automation. We did a deep dive into all of their processes, including messaging, campaign formats, system integrations, and culture, to understand how they operated and where we needed to apply our formula.
How could we get this new company to put more thought into the RPM model of Analysis/Action/Repeat? How could we prove that if they spent a little more time crafting a customer-centric email and incorporated simple best practices (optimizing copy length, subject lines, email format, and targeted segmentation), instead of rushing content out the door, they would see better results?
We had to get them to revisit their emails after launch to see what worked and what didn’t. Sure, we reported the metrics to this new group and provided recommendations, but after receiving multiple requests repeating the same issues we had already addressed in previous report recaps, it was clear our recommendations were not resonating. So how could I get them to use the Insight reports in a way that actually influenced future sends?
Enter the A/B test.
Before proceeding further, it’s important to note that two databases and marketing tools were used in this campaign: our Eloqua instance and an in-house tool built by the newly merged company. The A/B test itself ran exclusively through Eloqua; version A was also sent separately through the in-house email marketing tool. Reporting data was then compared to our running-year benchmarks for this asset type.
Goals to validate success
- Achieve higher engagement activity through simple edits.
- Show the importance of segmentation.
- Prove expertise in marketing automation.
What I needed to get started
- Two emails (versions A and B) in Eloqua, a Simple A/B Test campaign canvas, and Insight reports for each.
- One email (version A) in the in-house tool, plus its report.
- Segmentation filters for inactivity in Eloqua.
Describe the campaign process
First of all, never underestimate the impact of a simple A/B test. I tested everything with this group – subject lines, tone, images, formats, segments, and more. Every time I got a request, I used it as an opportunity to make changes and test. For the purposes of this post, I will discuss one A/B campaign where I tested engagement and segmentation within Eloqua against the metrics retrieved from the in-house tool.
Like most of their outreach requests, the original request was not to run an A/B test. However, after viewing the content, I knew I could make significant changes and improve engagement. Some items that stood out to me can be seen in version A below:
- The CTA is to click to download the five “Pearls of Wisdom,” yet the email copy already outlines all five takeaways, which completely negates the need for the reader to click the CTA.
- I changed the copy to be more of a teaser, generating interest and prompting the reader to click through.
- The images are not adding much value, especially since there isn’t any context. In addition, they are nearly impossible to read.
- I opted to remove the two images and replace them with a single, larger image that is more relevant to the copy.
- Not shown here: the subject line is extremely vague – “Read the Pearls of Wisdom.”
- Although they wouldn’t allow me to change this much, I did add an important element to provide industry context - “Read the Market Access Pearls of Wisdom”. Now, it reads a little less like a fortune cookie and more like a B2B email.
- The copy is lengthy.
- Shortening the copy into a teaser format made the email more manageable for the reader.
- Regarding segmentation, the in-house tool does have targeted lists, but it does not filter on criteria like inactivity. As you can imagine, numerous stale contacts who haven’t engaged with an email in years remain stuck in their database, dragging their numbers down drastically.
- Within Eloqua, I added filter criteria to exclude contacts who had not engaged with any email in the past year (a quick sketch of this logic follows below).
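To make that rule concrete, here is a minimal sketch of the exclusion logic in Python. The contact records and field names are hypothetical; in Eloqua this is simply configured as filter criteria on the segment, not written as code.

```python
from datetime import datetime, timedelta

now = datetime.now()

# Hypothetical contact records; in practice this data lives in the Eloqua
# contact database and the rule is set as segment filter criteria.
contacts = [
    {"email": "recent@example.com", "last_email_engagement": now - timedelta(days=30)},
    {"email": "stale@example.com", "last_email_engagement": now - timedelta(days=900)},
    {"email": "never@example.com", "last_email_engagement": None},
]

# Keep only contacts with an email open or click in the past year.
one_year_ago = now - timedelta(days=365)
active_contacts = [
    c
    for c in contacts
    if c["last_email_engagement"] is not None
    and c["last_email_engagement"] >= one_year_ago
]
# active_contacts now contains only recent@example.com
```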
Once the emails were finalized, I created a “Simple A/B Test” campaign canvas and adjusted the send to a 50/50 split, since I wanted the largest possible sample size to support my findings. My sample size ended up being over 58,000 contacts.
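The Simple A/B Test canvas handles that split automatically; the sketch below, using a made-up contact list, only illustrates the underlying idea of randomly assigning one segment into two equal halves.

```python
import random

def split_ab(contact_emails, seed=42):
    """Randomly assign contacts to version A or B in a 50/50 split."""
    shuffled = list(contact_emails)
    random.Random(seed).shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

# Hypothetical contact list, sized roughly like the real send.
emails = [f"contact{i}@example.com" for i in range(58000)]
group_a, group_b = split_ab(emails)
print(len(group_a), len(group_b))  # 29000 29000
```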
After running the test, a few key metrics came back supporting my hypothesis.
[Table: Benchmarks for Assets – Unique Open Rate, Unique Clickthrough Rate, Opt-Out Rate]
Comparing ELQ-A to IH-A: because both emails are identical, with segmentation being the only major differentiator, ELQ-A outperformed the in-house send with a 260% higher total open rate and a 129% higher clickthrough rate!
Comparing ELQ-A to ELQ-B, the audience opened and engaged much more frequently with ELQ-B. ELQ-B had over 48% more clicks and 11% more opens than ELQ-A. It also had an unsubscribe rate 42% lower than ELQ-A and a more beneficial click distribution.
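For anyone who wants to reproduce these comparisons from their own Insight reports, the percentages above are relative differences between the two versions’ rates. The rates in this sketch are hypothetical, chosen only to illustrate the arithmetic, not the campaign’s actual figures.

```python
def relative_lift(rate_b, rate_a):
    """Relative difference of version B over version A, as a percentage."""
    return (rate_b - rate_a) / rate_a * 100

# Hypothetical rates for illustration only.
print(relative_lift(0.037, 0.025))  # ~48: B saw roughly 48% more clicks than A
print(relative_lift(0.222, 0.200))  # ~11: B saw roughly 11% more opens than A
```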
The results are pretty clear: these simple changes and a more customer-minded approach led to measurably stronger engagement.
Which Oracle courses impacted this campaign
- Revenue Performance Management
- Testing Campaigns and Assets
- Targeting and Segmentation
Closing comments and business impact
I’d like to say that this campaign made the new marketers rethink how they speak to their audience, but I suspect it will take time, and many more A/B tests, before they fully accept and incorporate these ideas. Still, this outreach achieved its goals: our team proved its credibility by increasing opens and engagement through simple email best practices, and we demonstrated our methodology and the importance of Analysis/Action/Repeat. It also confirmed our theory about this group’s in-house database, which needs serious cleaning before it can be integrated with ours.
While these tests are simple, they are powerful tools that are helping us move towards our ultimate goal – complete integration between systems and cultures.