
Clean All The Data

Posted by Gatormain Apr 29, 2019


We have a dirty data problem, and quite a bit of data flows inbound (about 40k leads per month). As a fairly old organization with tons of homegrown systems, we have never had an effort to standardize and normalize across them. Because it is difficult to allocate technical resources to update data at the source and clean up historical data in those systems, we have to evolve: use what we have, fix what we can, and move toward cleaning data in our own system. As we all know, it is hard to personalize and segment when there are disparities. You can't say 'Hi John' if the only name field in your database is the full name, unless you first split the first and last names out of it, which we now do, along with many other updates.
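The full-name split mentioned above can be sketched roughly like this. This is a naive heuristic, not our exact washing-machine logic; real names with suffixes or multi-part surnames need more care:

```python
def split_full_name(full_name):
    """Naively split a full name into (first, last).

    The first token becomes the first name; everything else becomes
    the last name. Empty or one-word inputs degrade gracefully.
    """
    parts = (full_name or "").strip().split()
    if not parts:
        return "", ""
    if len(parts) == 1:
        return parts[0], ""
    return parts[0], " ".join(parts[1:])
```

A rule this simple gets "John Smith" right and keeps multi-part surnames like "van der Berg" intact, which is good enough for a 'Hi John' greeting.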



Our goal is to improve our data quality. Success will first show up as a drop in the number of failed external calls (mostly from values not converted to what the CRM expects). The solution should also be reusable.


Benchmark current state

Over a dozen external calls fail daily with the error "Guid should contain 32 digits with 4 dashes (xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)".
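A format check like the one below, run before the external call fires, would catch this class of failure upstream. This is a hedged sketch; the function name is ours, not an Eloqua feature:

```python
import re

# 8-4-4-4-12 hex digits, i.e. "32 digits with 4 dashes" as the CRM expects.
GUID_RE = re.compile(
    r"^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}"
    r"-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$"
)

def is_valid_guid(value):
    """Return True only when value matches the CRM's expected GUID shape."""
    return bool(GUID_RE.match(value or ""))
```

Records failing this check can be routed to a holding step for repair instead of generating a failed external call.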



Create a program that everything uses. As we grow and find additional items to clean, we can add them to that program, and all inbound contacts will be cleaned in the same way.


Currently, this is what it does:

  • Sets Lead Source Original and/or Lead Channel Original with the value in the Most Recent equivalent if the Original is null (Contact Washing Machine)

  • Populates multiple fields based on other fields for the CRM (Contact Washing Machine)

  • Converts all fields that require a GUID or other specific value (Update Rules with Lookup Tables)

  • Double opt-in process - makes decisions based on current double opt-in status and country - sends contacts to the double opt-in campaign if needed for marketing opt-in (Decisions, Picklist, Shared Filter, Contact Washing Machine, Campaign Canvas)

  • If they are in the USA, we use Zip Code to extract city and state - we created lookup tables that we refresh quarterly (Contact Washing Machine, Update Rules with Lookup Tables)

  • Extract First and Last Name from Full Name - almost every internal system asks for Full Name and not First/Last (Separate Program with Decisions, Contact Washing Machine)

  • If they are pre-defined as being a lead, we send them to the CRM program builder. This is primarily for high quality pages, like the ones with our Contact Us form, asking for a reach out.
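The ZIP-to-city/state step in the list above amounts to a keyed lookup. The sketch below stands in for our quarterly-refreshed lookup tables; the two sample ZIP rows are illustrative only:

```python
# Stand-in for the quarterly-refreshed ZIP lookup table.
ZIP_LOOKUP = {
    "30301": ("Atlanta", "GA"),
    "94105": ("San Francisco", "CA"),
}

def enrich_from_zip(contact):
    """Fill city/state from the 5-digit ZIP, never overwriting existing values."""
    city_state = ZIP_LOOKUP.get((contact.get("zip") or "")[:5])
    if city_state:
        if not contact.get("city"):
            contact["city"] = city_state[0]
        if not contact.get("state"):
            contact["state"] = city_state[1]
    return contact
```

Taking only the first five characters also tolerates ZIP+4 values like "30301-1234" without a separate cleanup step.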


This is what the canvas looks like:

Marketing Cloud Influence

I have used a lot of the courses in my day-to-day, depending on what I'm assisting with. For this implementation, Profile & Target was probably the most crucial. Multiple others have provided a positive impact for this and other initiatives.



After implementing some of our cleaning activities, we now get only a handful of failed external calls per week, which is easier to manage, but we are still working toward a point where no external calls fail. The majority of the remaining failures are "internal errors," which are not caused by data issues. We continue to run business as usual while implementing more and more nurture campaigns to move contacts to different parts of the sales funnel. But our number one priority this year is integrating the data we need to be more targeted and personal for our users. With each integration, we take a hard look at the data mappings and work toward having our data consistent and correct.

Marketers always want detailed information about form submission data, landing page views, and email performance. In our current state I was constantly getting requests to pull these metrics from Eloqua. One day it might be form submission data and the next it might be email performance. This led to many distractions and lost time.
Prior to taking the Luminary coursework I did not use Insight reporting or its capabilities. The initial goal was to create a standard set of reports for each campaign, scheduled for Marketers to receive every day.

The first step in this process was to schedule a 1-hour meeting with the entire marketing team to discuss the universe of reporting they would like to see.
The second step was for me to take this information and build the reports and agents. I found these courses particularly helpful in learning how to navigate Insight and start building reports:
B2B: Insight for Analyzers
B2B: Insight for Reporters
The third step was to enable the reports and schedule a 1-hour follow-up meeting to make sure the agent reports were being sent daily and that all desired metrics were included.



[Image: Landing Page report]

Impact to the Business!!
Implementing this automated reporting has allowed us to be more agile and make quicker changes to our campaigns. This has led to 24% more landing page views, 37% more MQL conversions, and increased email open and click-through rates.
A secondary benefit has been a dramatic increase in communication between marketing and sales. Conversations about what is going well or could be improved are happening almost daily.
A third benefit has been a tighter relationship with our distributor community. We commonly work with our distributors to drive attendance to events, and having near real-time data has helped us organize better events and work more collaboratively.

Going forward, I look forward to digging deeper into the capabilities to further customize reports. One example would be using the Pivot Table function to change the presentation of the data. With some head scratching, I was able to create a report that shows form submissions by month, by form. This single-page view is very helpful for identifying trends and driving meaningful conversations.


[Image: form submissions by month, by form]

Next up will be to create additional holistic reporting that shows metrics across campaigns in one report, rather than creating individual campaign packages.

As a Marketing Automation Consultant, it is my job and my responsibility to have a deep understanding of marketing best practices and marketing automation, specifically Eloqua. I want to be able to guide my clients in the right direction with their marketing initiatives. When you support a number of clients, the challenges can vary drastically depending on the marketing maturity of a particular organization, as well as the type of marketing and audience an organization is targeting. Taking the courses for both the B2B Master and B2B Luminary certifications has allowed me to keep my ideas, practices, and skills fresh. Eloqua is a platform that is always growing in capability, and there are always new things to learn.


I am going to share a recent project example where many of the trainings and topics within the B2B Luminary courses are relevant and helped me execute this project with my client. My client is fairly new to Eloqua. Well, actually, they have had the tool for about 7 years, but they have not used it to its full potential. They had done a lot of batch-and-blast emailing, and they had not really standardized their data, or their way of collecting data in the first place. They were interested in setting up their very first nurture campaign. One of the learnings in the Luminary courses covers Lead Nurturing and the importance of doing a data audit, of using progressive profiling to gradually learn more about your prospects and customers, and of using Lead Scoring to appropriately route contacts both to sales and into the right nurture track. While my client was very focused on setting up the nurture itself (immediately), I had to advise them to take these steps before going down the path of nurture. And these steps took a few months…


While it took a little convincing, we started out with a data audit. What we found was that there were fields that needed to be normalized. They used a lot of open text fields, which are difficult to segment on. We first mapped and cleaned up the existing data into select lists, and then we came up with a better form strategy using those select lists. We used a data washing machine to ensure any new data coming in was normalized.
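Conceptually, that washing-machine normalization is a lookup from free-text variants to the canonical select-list value. The job-role values here are made up for illustration:

```python
# Illustrative mapping from messy free-text entries to select-list values.
JOB_ROLE_MAP = {
    "vp marketing": "VP of Marketing",
    "v.p. marketing": "VP of Marketing",
    "marketing vp": "VP of Marketing",
    "dir of mktg": "Director of Marketing",
}

def normalize_picklist(raw, mapping, default="Other"):
    """Map a raw form value to its canonical select-list value."""
    return mapping.get((raw or "").strip().lower(), default)
```

Anything unmapped lands in a catch-all bucket, which also makes it easy to spot new variants that should be added to the mapping.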



Then we talked about ways to collect those important data pieces that allow us not only to segment properly, but also to nurture properly, WITHOUT bombarding people with long forms that they are very likely to abandon.  We did this through progressive profiling and there is a course in the Luminary track that explains how to do this and the best practices with your approach.




My client actually did not want to take the steps to set up Lead Scoring at that time, but it’s something we have in our parking lot.  And I'm excited to see where we can go with it!


By cleaning up the data and strategizing on the best way to collect and store data in Eloqua we were able to define and create feeders for the first nurture.  The first nurture consisted of 4 tracks, a Welcome track for new visitors to the site, an Awareness track for contacts showing interest and seeking answers, an Education track that offered up ways to meet goals, and an Evaluation track that highlighted the reasons to choose my client.  The content in the Nurture course was helpful in showing us the types of assets to offer in each of these stages.  And throughout the nurture touches, we used progressive profiling to gather the data we needed to ultimately route contacts through the nurture.


With the new nurture campaigns in place, my client actually increased their MQLs by 400%!  We also took historical data on their Open Rate, Click-Through Rates, Conversion Rates and Unsubscribe rates and compared them to those of the nurture recipients.


Here is a snapshot:


AND… this was only the beginning.  The changes my client made with their data and form strategy as well as putting into place real nurturing made a huge impact and continues to make an impact.  I still work with them to analyze how the nurtures are performing, to optimize them every quarter, and to come up with new ideas around data and nurture.


One other big takeaway, especially for my client, was that their segments were dramatically reduced in size. I think this is often something that is hard for marketers to wrap their heads around. Oftentimes we think that in order to hit our numbers, our outreach needs to be big. But it's not the size of the outreach; it's making sure you are sending the right messages to the right people at the right time. Even though the segments were smaller, they still dramatically increased their MQLs.


The need was to inform all customers of new product features and versions within a 4-week span, twice per year, with an integrated campaign. The challenge for our email team was that our best customers might end up receiving 10 versions of each message in the 4-message drip, one for each product for which they were part of the decision unit. With resends, some of our best customers would receive 80 emails. Furthermore, sending the same message to the same address in a short period of time is known to create deliverability issues that we needed to avoid.

Goals to validate success and benchmarking

Target increases of 5% over the previous season's average Send-to-Delivered and Unique Clickthrough rates were set as the team's goals. Send-to-Delivered was significant because we were trying to avoid looking like spam when communicating sale offers to our customers. Unique Clickthrough rate was relevant because it was widely agreed to be an acceptable standard for measuring email content. (It's important to note that because previous seasons included one message per product, the unique clickthrough rate was not an apples-to-apples comparison. A better metric would have been unique responses, but we were unable to build consensus on this metric at the time.)

The Solution

The first step was to limit the number of product offerings per customer. Strategically, the most important factor was potential spend, so a Pareto analysis was developed that let us sort customers by potential revenue for each decision unit. The top 3 decision units for each customer-product pairing were selected for participation in the campaign, resulting in the most potentially profitable decision units for each product.
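That selection step can be sketched as a sort-and-slice per customer. Field names like `potential_revenue` are assumptions for illustration, not our actual schema:

```python
from collections import defaultdict

def top_decision_units(rows, limit=3):
    """Keep each customer's top `limit` decision units by potential revenue."""
    by_customer = defaultdict(list)
    for row in rows:
        by_customer[row["customer"]].append(row)
    selected = []
    for units in by_customer.values():
        # Highest potential revenue first, then slice off the top `limit`.
        units.sort(key=lambda r: r["potential_revenue"], reverse=True)
        selected.extend(units[:limit])
    return selected
```

Customers with fewer than three decision units simply keep everything they have.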


Two solutions were proposed. The first was to wait between sends of the same message to the same customer. "Give it a week," someone said. Well, a quick calendar exercise showed that a program that needs to run twice per year, for 4 weeks per run, would end up running for close to 6 months. A different solution was needed.


A solution was proposed that dynamically inserted the different product offers and messaging into the same email. Ideas like this had been suggested before, but the only prior solution had been to create a unique email, or at least a piece of dynamic content, for each product being promoted. We did not have the capacity to support that kind of build, let alone the subsequent quality control and edits necessary to bring a campaign like this to activation. Luckily, Eloqua had been making great strides in its personalization capabilities, and this use case was perfect for leveraging them.

Example of the concept the team wanted to build in Eloqua.


With some guidelines on the content and some data organization, we could develop a solution that would deliver relevant messages to our customers in a timely fashion, without overwhelming their inboxes, without triggering spam traps unnecessarily, and with a single resource having the flexibility to make last-minute copy changes in hours, not days or weeks.


The team wanted to use 5 data points for personalization: a "special sauce" snippet, three bullets, and an image. They also had three categories of customers within the campaign, and each category would have its own copy, independent of the product personalization. The categories were as follows:

- Legacy customers with no upgrade available

- Legacy customers with an upgrade available

- Upgraded customers with no upgrade available


For any of these categories, we were promoting over 150 products. We couldn't very well create 150+ emails; there was no way to scale when a downstream typo was caught and needed to be edited. But if we stored the unique content in a custom object and pulled it into an email with a field merge and dynamic content, we could change the content on the fly with a simple CO import. We decided on a solution architecture that included 3 emails for each stage of the nurture, one for each category of customer. This decision was driven primarily by requests for reporting and the desire to see differences in response for each of those cohorts.


So, we set guidelines on the content and stored it in Custom Object fields where we could merge it into a common message. Guidelines included:

- All products must be treated consistently

- Content teams must provide generic copy (for products without specific content)

- Must include three bullets per topic

- Bullets must be limited to 225 characters (giving us room for marketers who think it's a soft limit)

- Images must be hosted at the same domain (the hero images for any two products must both come from that one shared domain)

After the categories and products were properly assigned to the customers, the marketers and content team had just about finished the content. All that was left was a simple join on the product identifier to bring the content inline with each customer email address. An additional field was added that included the category and a count of records for the email address, limited to a total of 3.
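A minimal sketch of that join and the 3-record cap, as a pre-processing step outside Eloqua (field names are illustrative):

```python
def build_co_records(customers, content_by_product, limit=3):
    """Join product content onto customer rows, capping records per address."""
    records, counts = [], {}
    for cust in customers:
        email = cust["email"]
        if counts.get(email, 0) >= limit:
            continue  # only the first `limit` products make it into the CO
        content = content_by_product.get(cust["product_id"])
        if content is None:
            continue  # the real build falls back to generic copy here
        counts[email] = counts.get(email, 0) + 1
        rec = {"email": email, "category": cust["category"],
               "slot": counts[email]}  # slot 1..limit drives dynamic content
        rec.update(content)
        records.append(rec)
    return records
```

Sorting the customer rows by potential revenue before calling this ensures the cap keeps the most valuable products, consistent with the Pareto selection described earlier.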


The Field Merge and Dynamic Content configuration is the trick of this. Using the category concatenated with the record count, you can limit the display of content in an email to the customers who have that record. In short, if I have a past purchase of an iPhone, I get Apple content. If I bought Android, I get Google content. If I bought both, I get both, with all the strategic messages for each customer consolidated into a single email asset they can review in one sitting.
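The net effect of those field merges and dynamic-content rules can be approximated like this. This is a conceptual sketch only; in Eloqua the same result is configured declaratively, not in code:

```python
def render_email(contact_records, category_copy):
    """Assemble one email from a contact's CO records (already capped at 3)."""
    ordered = sorted(contact_records, key=lambda r: r["slot"])
    # Category copy appears once; each product record contributes a block.
    blocks = [category_copy[ordered[0]["category"]]]
    for rec in ordered:
        blocks.append(rec["special_sauce"] + "\n- " + "\n- ".join(rec["bullets"]))
    return "\n\n".join(blocks)
```

A contact with one record sees one product block; a contact with three sees three, all inside the same email asset.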


The impact of this program was outstanding. Our send-to-delivered ratio and unique clickthrough rates far exceeded the 5% increase target we set for ourselves, and we ended up setting corporate-wide standards for deliverability. So much buzz was created that the demand for insight into our programs exceeded our capacity to produce it, and the organization began to look at different means of reaching its ends. Ultimately, we worked with an outside firm to develop an extension using cloud content that eliminated the need to build the field merges and dynamic content to create this effect. We were able to drop tokens into the emails directly from the CDO, and control which CDO record was sent a message directly from the Campaign Canvas. It gave us more control over who was sent what messaging, without the cumbersomeness of field merges and dynamic content.
