
Background

As part of the Marketing Automation team, our mission is to be “a catalyst for exceptional performance.” We look to assert our expertise wherever we can through outreach emails, landing pages, forms, nurture campaigns and integration optimization. To help us achieve our mission, we leverage the Analysis/Action/Repeat model presented in the RPM and Data Management courses.

 

But, when our company merged with another, we had to prove our methodology to a new culture that had a very different approach to marketing automation. We had to do a deep dive into all of their processes including, messaging, campaign formats, system integrations and culture to understand how they operated and where we needed to assert our formula.

 

 

Marketing Challenge

How could we get this new company to put more thought into the RPM model of Analysis/Action/Repeat? How could we prove that if they spent a little more time crafting a customer-centric email and incorporated simple best practices (optimizing copy length, subject lines, email format, and targeted segmentation), instead of rushing their content out the door, they would see better results?

 

We had to get them to revisit their emails after they were launched to see what worked and what didn’t. Sure, we reported on the metrics to this new group and provided recommendations, but after receiving multiple requests with the same issues already addressed in previous report recaps, we found that our recommendations were not resonating. So, how could I get them to utilize the insight reports in a way that influenced future sends?

 

Enter the A/B test.

 

Before proceeding further, it’s important to note that there were two databases and marketing tools used in this campaign, one being our Eloqua instance and the other an in-house tool produced by the newly merged company. The A/B test was sent out exclusively through Eloqua and version A was also sent through the in-house email marketing tool. Reporting data was then compared to our running-year benchmarks for our asset-type deliveries.

 

 

Goals to validate success

  1. Achieve higher engagement activity through simple edits.
  2. Show the importance of segmentation.
  3. Prove expertise in marketing automation.

 

 

What I needed to get started

  • Two emails – Versions A and B in Eloqua, a Simple A/B Test canvas, and Insight reports for each.
  • One email – Version A in the in-house tool and report.
  • Segmentation filters for inactivity in Eloqua.

 

 

Describe the campaign process

First of all, never underestimate the impact of a simple A/B test. I tested everything with this group - subject lines, tone, images, formats, segments, etc. Every time I got a request, I used it as an opportunity to make changes and test. For the purpose of this post, I will discuss one A/B campaign where I tested engagement and segmentation within Eloqua against the metrics retrieved from the in-house tool.

 

Like most of their outreach requests, the original request was not to run an A/B test. However, after viewing the content, I knew I could make significant changes and improve engagement. Some items that stuck out to me can be seen in version A below:

 

  • The CTA is to click to download the 5 “Pearls of Wisdom”, yet the email copy already outlined the 5 takeaways, which completely negates the need for the reader to click the CTA.
    • I changed the copy to a teaser format to generate interest and prompt the reader to click the CTA.
  • The images are not adding much value, especially since there isn’t any context. In addition, they are nearly impossible to read.
    • I removed the two images and replaced them with a single, larger image that is more relevant to the copy.
  • Not shown, the subject line is extremely vague – “Read the Pearls of Wisdom”.
    • Although they wouldn’t allow me to change this much, I did add an important element to provide industry context – “Read the Market Access Pearls of Wisdom”. Now it reads a little less like a fortune cookie and more like a B2B email.
  • The copy is lengthy.
    • Shortening the copy into a teaser format made the email more manageable for the reader.
  • Regarding segmentation, the in-house tool does have targeted lists, but it does not filter for criteria like inactivity. As you can imagine, numerous stale contacts who haven’t engaged with an email in years are stuck in their database, dragging their numbers down drastically.
    • Within Eloqua, I added filter criteria to exclude contacts who have not engaged with any email in the past year.
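Conceptually, the exclusion filter boils down to a date cutoff. Here is a minimal Python sketch of that logic, purely for illustration - the field name `last_email_activity` is an assumption, and in practice this is configured as segment filter criteria in Eloqua, not code:

```python
from datetime import date, timedelta

def active_recipients(contacts, today, window_days=365):
    """Keep only contacts with some email activity inside the window,
    mirroring the 'exclude inactive for a year' segment filter."""
    cutoff = today - timedelta(days=window_days)
    return [c for c in contacts
            if c.get("last_email_activity") and c["last_email_activity"] >= cutoff]

# Hypothetical contacts: one recently active, one stale, one never engaged.
contacts = [
    {"email": "a@example.com", "last_email_activity": date(2017, 3, 1)},
    {"email": "b@example.com", "last_email_activity": date(2014, 6, 5)},
    {"email": "c@example.com", "last_email_activity": None},
]
fresh = active_recipients(contacts, today=date(2017, 6, 1))
# Only the recently active contact survives the filter.
```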

 

[Image: Version A and Version B emails]

 

Once the emails were finalized, I created a “Simple A/B Test” campaign canvas and adjusted the send to be a 50/50 split, as I wanted to get the largest sample size possible to support my findings. My sample size ended up being over 58,000 contacts.
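Under the hood, a 50/50 split is simply a random assignment of the audience into two equal arms. The Simple A/B Test canvas does this for you; the following Python sketch just illustrates the idea (contact names are placeholders):

```python
import random

def ab_split(contacts, seed=42):
    """Randomly assign contacts to arm A or arm B in a 50/50 split."""
    rng = random.Random(seed)      # fixed seed so the split is reproducible
    shuffled = contacts[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# A sample size comparable to the ~58,000-contact send described above.
audience = [f"contact{i}" for i in range(58000)]
arm_a, arm_b = ab_split(audience)
```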

 

After running the test, a few key metrics came back supporting my hypothesis.

 

 

 

                           Eloqua A    Eloqua B    In-house A    Benchmarks
                           (ELQ-A)     (ELQ-B)     (IH-A)        for Assets

Open Rate                  54%         59%         15%           28%
Unique Open Rate           38%         43%         11%           19%
Clickthrough Rate          2%          3%          1%            3%
Unique Clickthrough Rate   1%          2%          <1%           1%
Opt Out Rate               <1%         <1%         <1%           <1%

 

Comparing ELQ-A to IH-A, where both emails are identical and segmentation is the only major differentiator, ELQ-A outperformed the in-house tool by +260% in total open rate and +129% in clickthrough rate!
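For readers who want to sanity-check the arithmetic, the +260% figure follows directly from the open rates (54% for ELQ-A vs. 15% for IH-A); the clickthrough lift presumably comes from unrounded underlying rates, so only the open-rate lift is verified here:

```python
def relative_lift(test, control):
    """Percentage improvement of one rate over another."""
    return (test / control - 1) * 100

# ELQ-A total open rate vs. IH-A total open rate: 54/15 = 3.6x, i.e. +260%.
open_lift = relative_lift(54, 15)
```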

 

Comparing ELQ-A to ELQ-B, the audience opened and engaged much more frequently with ELQ-B. ELQ-B had over 48% more clicks and 11% more opens than ELQ-A, an unsubscribe rate 42% lower, and a more beneficial click distribution.

 

The results are pretty clear. These simple changes and a more customer-minded approach led to higher metrics.

 

 

Which Oracle courses impacted this campaign

Revenue Performance Management
Testing Campaigns and Assets
Targeting and Segmentation
Advanced Segmentation

 

 

Closing comments and business impact

I’d like to say that this campaign made the new marketers rethink how they speak to their audience, but I suspect it will take time, plus many more A/B tests before they fully accept and incorporate these ideas. This outreach achieved its goals and our team proved its credibility by increasing opens and engagement through using simple email best practices. We proved our methodology and the importance of Analysis/Action/Repeat. It also proved our theory about this group’s in-house database, which needs serious cleaning before it can be integrated with ours.

 

While these tests are simple, they are powerful tools that are helping us move towards our ultimate goal – complete integration between systems and cultures.

How Sales and Marketing can work together!
Non-Standard Lead Scoring for a Non-Standard Industry

 

Chapter 1: Intro

Our brand marketing proposition is complex when compared to many businesses. We are a B2B company and run more than 50 brands in as many vertical markets. When it comes to marketing and data, we use Eloqua, Salesforce and four other data platforms – you can imagine the complexity and size of our database. We also use the same instance of Eloqua for all marketing campaigns for each of the 50 brands. So, in essence, we use one instance of Eloqua for 50 different companies, each with at least 2 different audience groups.

 

Chapter 2: The Challenge

With Eloqua introduced, we were then equipped with the tools to facilitate a key marketing and sales goal: Lead Scoring. Our challenge was to implement multiple, different Lead Scoring models and integrate them into Salesforce.

 

Chapter 3: The Goals

We had one key goal: implement Lead Scoring. I have read about companies who started this journey only to quickly abandon it. I wanted to make sure that wasn’t our story, so I brought our sales and marketing teams, who had no experience in this area, on the journey with us. We focused on the following:

•    Sending MQLs to different sales teams

•    Monitoring performance and refining the model

•    Reporting

 

In order for Lead Scoring to work, it has to generate revenue and, in a very short time, demonstrate to the sales team that it’s not just “another short-lived marketing initiative” but something that will define the future of our business.

 

Chapter 4: The “Before” State

We had ways to prioritize leads, but Lead Scoring ensured that the score was tied to engagement and allowed more timely interaction.

 

Chapter 5: The Actions

For the first 2-3 months, I read every piece of documentation available online and learnt from others’ successes and failures. It’s important to accumulate as much feedback as possible for any marketing initiative.

 

Start Low and Aim High was the key.

One of the advantages of working across multiple brands is that you have a large quantity of data and information about your customers and leads. I knew from the beginning that engagement scoring would need to be revised and adjusted over time, and that finding the right profile for each brand would be a continuous effort. I gathered as much information as possible from the experts in the industry – the people who have been running the brands successfully over the last few years.

   

How to build the correct profile?

There is no easy answer to this, but I do think it’s a combination of the individual and the company. So I used the following as the main tiers for all our models and added more information specific to each brand. Every option was weighted differently and divided into at least 4 sub-tiers to allow a better spread of customers across the profile fit.

 

-    Retention [35%] – company data

-    Products [25%] – company data

-    Job Role [25%] – individual data

-    Location [15%] – individual data
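To illustrate how the tier weights combine into a profile-fit score, here is a hypothetical Python sketch. The sub-tier fit values and the 0-100 scale are invented for the example; in practice each tier is divided into at least 4 sub-tiers inside Eloqua:

```python
# Tier weights from the model above; fit per tier is a 0-1 score.
WEIGHTS = {"retention": 0.35, "products": 0.25, "job_role": 0.25, "location": 0.15}

def profile_score(fit):
    """Weighted profile-fit score in [0, 100]."""
    return sum(WEIGHTS[tier] * fit.get(tier, 0.0) for tier in WEIGHTS) * 100

# A contact with perfect company fit but partial individual fit.
score = profile_score({"retention": 1.0, "products": 1.0,
                       "job_role": 0.5, "location": 0.5})
# 0.35 + 0.25 + 0.125 + 0.075 = 0.8, i.e. a score of 80.
```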

 

As mentioned earlier, our instance of Eloqua supports data for more than 50 brands, so it’s almost impossible to build the profile from Eloqua data alone, especially since we only synchronize contacts from our CRM.

 

The solution was a Custom Data Object (CDO) for each brand containing all the profiling information. This way I was able to bring as much information as needed into Eloqua while maintaining a clear structure for our existing data without affecting other contacts.

 

Below is the structure of our first CDO:

[Image: structure of the first CDO]

Note: The 2 fields which are not visible are specific to our company; we use them as unique identifiers across all the data platforms.

 

And how about the engagement?

As a well-established company in a well-defined industry, we have a well-defined campaign flow and good planning when it comes to marketing communications, so half of the work was already done. Also, every page on our websites is tracked by Eloqua, so the first step was to define a baseline, then analyze and improve as we progressed.

 

The baseline looks like this:

-    Submitted an enquiry form – 100%

-    Visited High Value Pages – 50%

-    Visited the website at least 3 times – 25%

-    Clicked Sales Email – 20%

-    Opened Sales Email – 5%

 

The total is 200% because, in our industry, submitting a form is a clear signal of intent to buy. If the person is not contacted within the next 24 to 48 hours, they could be a lost opportunity. Also, all the main tiers have sub-levels with specific time frames for email frequency, high-value pages, form submission time, etc.
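As a rough illustration, the baseline can be thought of as a weighted sum of observed activities. This Python sketch is only my rendering of the mechanics, not the actual Eloqua configuration; the activity names and the 100-point cap are assumptions:

```python
# Baseline engagement weights from the post. A form submission alone
# maxes out the score because it is a clear buying signal.
ENGAGEMENT = {
    "form_submit": 100,
    "high_value_page": 50,
    "three_plus_visits": 25,
    "sales_email_click": 20,
    "sales_email_open": 5,
}

def engagement_score(activities, cap=100):
    """Sum the weights of observed activities, capped (an assumption)."""
    return min(sum(ENGAGEMENT[a] for a in activities), cap)

hot = engagement_score(["form_submit"])                            # 100
warm = engagement_score(["high_value_page", "sales_email_click"])  # 70
```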

 

The next step was to build the models in Eloqua and pass the results to the sales team.

 

The main challenge: how to pass MQLs to the sales team on a daily basis?

Initially, the best option seemed to be to integrate them into Salesforce and set up the integration so that all MQLs land in a queue from which every salesperson can pick them up. But we hit another wall and realised it would be impossible to manage over 50 marketing queues with over 100 sales users, with one contact potentially being an MQL for two, three, or four brands at the same time.

 

We built interactive reports for each brand, updated daily. We knew that the more systems we asked a person to use, the bigger the chance we would lose them in the process. For this reason, we linked the report to Salesforce contact records, so with one click from the report you can reach the contact’s information in Salesforce. All the information comes from Eloqua and it looks like this:

[Image: interactive lead score report]

The beauty of the report is that it’s interactive and you can easily select what you want to see. For example, if you click on A1 you will only see A1 contacts and their basic information, such as first name, last name, job role, company, email address, phone, and Salesforce ID. Clicking a record takes you to the Salesforce contact record.

 

You can also select multiple scores and the report will filter them accordingly.

 

[Images: report filtered by multiple scores]

All this data is automatically extracted from Eloqua with a recorded time stamp, so we can now build custom reports on how many leads reached a certain level of engagement in a specific time frame. This also helps in improving our marketing communications. For those who are not familiar with Lead Scoring reporting, you should know that this functionality is not available in Insight.
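Conceptually, the score filter behaves like this hypothetical Python sketch. The data shape, field names, and values are invented for the example; the real report is driven by the daily Eloqua extract:

```python
# Hypothetical rows extracted from Eloqua for one brand's report.
leads = [
    {"name": "Ann Smith", "score": "A1", "sfdc_id": "003A1", "extracted": "2017-05-01"},
    {"name": "Bob Jones", "score": "B2", "sfdc_id": "003B2", "extracted": "2017-05-01"},
    {"name": "Cara Lee",  "score": "A1", "sfdc_id": "003C3", "extracted": "2017-05-02"},
]

def filter_by_scores(rows, scores):
    """Mimic clicking one or more score cells in the interactive report:
    only contacts with a selected lead score remain visible."""
    wanted = set(scores)
    return [r for r in rows if r["score"] in wanted]

a1_leads = filter_by_scores(leads, ["A1"])        # single score selected
combined = filter_by_scores(leads, ["A1", "B2"])  # multiple scores selected
```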

 

One of our most important findings was that we did not have enough contacts reaching level 1 of engagement. That helped us tailor the tone of our communications and adjust our engagement thresholds.

 

We also brought Profiler into the mix and fully integrated it with Salesforce for the sales users. This way, once they reach Salesforce, the sales users can see the contact details and select which lead score they want to check. On top of the lead score, they also get a summary of past activities, so they can take a quick scan and have a better overview before accepting or rejecting a lead. Profiler is a must-use tool for any modern sales or marketing professional!

 

Chapter 6: The Results

Ten years ago, I would have said that the only metric that matters is revenue, but times have changed. In 2017, we have so much data available that it’s more a question of how we can make the best use of it.

 

We are now 12 months in, with our 37th Lead Scoring model activated and all of the previous ones well received by the marketing and sales teams. We continue to help transition the sales process from cold calling to hot calling and provide the sales team with the ability to forecast their sales and prioritize their time. Lead Scoring also helped shorten the prospect-to-customer cycle.

 

Chapter 7: The Resources

I could not have done it without the following courses:

 

B2B: Lead Scoring

B2B: Effective Marketing with Custom Objects

RPM: Effective Nurturing

RPM: Lead Quality

B2B: Insight for Reporters

B2B: Closed-Loop Reporting

 

All the courses from the Marketing Academy helped in delivering this project.

 

Chapter 8: Reviewing and Next Steps   

Going forward, I want to switch from contact-based Lead Scoring to account-based Lead Scoring. This way we can build a better profile of our target audience and use the CDOs to bring in the information and profile on it.

 

In addition, a massive thank you to the Topliners community, which is my “go to” when it comes to anything about Modern Marketing and Automation.
