
Is your webinar program healthy?  To make business decisions related to your company’s webinar program, you need a holistic view of webinar success across all of your company’s business units. If you can quickly identify the characteristics of successful – or unsuccessful – webinars, you can make more informed decisions about which webinar communications to send to specific segments.

 

We have been utilizing the ‘Webex Event Registration’ cloud connector on the Eloqua canvas to register contacts targeted through Eloqua emails. We have been running into issues with different business units and regions wanting to utilize the connector for webinars on the same days, to subsets of the same customer list. To avoid list exhaustion and minimize competition within our own list, we needed to create a holistic snapshot of the webinar program to pinpoint successful campaigns. Once we had this data easily available, we were able to make better-informed business decisions on which types of webinars to prioritize when business units and regions proposed competing communications. To create this snapshot, we wanted to gather who is registering, who is attending, their region, and a description of the campaign or webinar.

 

Our old process was a manual pull of webinar data from the WebEx site. An overview of all webinar campaigns was not available to us in Eloqua. We were already utilizing the Webex Event Registration cloud connector to register contacts for webinars, as well as the Webex Event Attended step – but did not have this data in a digestible format that could easily be shared with the client.

 

To move forward we needed to:

  1. Continue to utilize the Webex Event Registration connector per normal process
  2. Break out attendees and non-attendees for every webinar
  3. Include regional details associated with the webinar snapshot
  4. Include webinar name in snapshot
  5. Provide an easy-to-digest report of registration details for every webinar

 

 

 

Design and Implementation of New Process

To create a snapshot of registration and webinar data, we implemented a change to the canvas build across all campaigns that utilize the Webex Event Registration step. The process change creates a report of the data that can be shared with the client periodically.

 

Normal Process – Contact goes through the Webex Event Registration step to register for the specific webinar. Transactional emails are sent from the form if outlined in the campaign strategy. Contacts are then held in a wait step until 24 hours after the WebEx event.

 

New Process – Create 2 forms that will house all of the data we want to capture: one for contacts who did attend a webinar, and one for contacts who did not. In both forms we captured email address, webinar name, and region.

  1. After 24 hours – the contact is sent to the ‘Webex Event Attended’ step, which sends contacts down a ‘Yes’ (contact spent more than 5 minutes in the webinar) or ‘No’ (contact did not attend the webinar) path.
  2. The two paths are directed to separate ‘Form Submit’ steps that reference the 2 forms we created to capture the details. In the step, we are specifying:
    1. Email Address – Contact Record
    2. Webinar Name – Static (should be the same for both paths)
    3. Region – Static (should be the same for both paths, and depends on the webinar type)
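The ‘Yes’/‘No’ attendance split and the form values above can be sketched in code. This is an illustrative sketch, not Eloqua’s actual step configuration: the field names mirror the mappings listed, and the 5-minute threshold follows the ‘Webex Event Attended’ rule described.

```python
# Illustrative sketch of the attended / not-attended split and the
# payloads each Form Submit step would send. Field names mirror the
# mappings listed above; the dict layout is an assumption, not
# Eloqua's internal format.

ATTENDANCE_THRESHOLD_MINUTES = 5  # 'Yes' path: more than 5 minutes in the webinar

def build_form_payload(contact, webinar_name, region):
    """Map the three values the Form Submit step specifies."""
    return {
        "emailAddress": contact["email"],  # Email Address - Contact Record
        "webinarName": webinar_name,       # Webinar Name - Static
        "region": region,                  # Region - Static
    }

def split_by_attendance(contacts, webinar_name, region):
    """Send each contact down the 'Yes' or 'No' path and collect the
    corresponding form payloads."""
    attended, missed = [], []
    for contact in contacts:
        payload = build_form_payload(contact, webinar_name, region)
        if contact.get("minutes_in_webinar", 0) > ATTENDANCE_THRESHOLD_MINUTES:
            attended.append(payload)  # attended-webinar form
        else:
            missed.append(payload)    # did-not-attend form
    return attended, missed
```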

 

Once the form is submitted for both paths, the contacts will either exit the campaign, be sent a thank you/missed you email, or continue to a separate journey as defined by the campaign strategy.

 

Now that we have the details captured in the forms, we scheduled a Form Submission Data report to deliver to the client on a weekly and monthly basis so they can review performance of the overall webinar program.

 

Example of campaign flow:

 

Business Impact

Capturing the details in an easy-to-digest format has given us a snapshot of overall webinar health that is easy to read and share. After weekly reviews of the snapshot so far, we have been able to pinpoint our most successful webinar overall and write rules based on language to narrow our segmentation.

 

We noticed that a specific weekly webinar series consistently had a 75% rate of attendance. Because this rate was so high compared to webinars targeting the same region, we have given priority to this webinar series on a specific day of the week. This is the only webinar that can happen on the specific day, and any communications that promote the webinar are given priority over competing emails.

 

Through further analysis, we were able to put data behind a business decision to only promote webinars with emails written in the language of the webinar. We noticed that Spanish webinars being promoted with an English email had a lower attendance rate than Spanish webinars that were promoted with a Spanish email.

 

Future State

For future improvements of this process, there are opportunities to capture more contact-level details in the snapshot. By including details like job title, persona, interests, or other fields used often for segmentation, we can have a quick view of the different contact profiles. Through this view, we can write better segmentation rules to target contacts with the webinar topics they consistently register for.

 

There is also an opportunity to create a more automated process by utilizing CDOs that pull in details related to the webinar, rather than using static fields on the canvas steps. Setting up a more automated process will eliminate the opportunity for mistakes inherent in a manual process.

Cloud Courses for Reference

B2B: Fundamentals of the Forms & Landing Pages

B2B: App Cloud

B2B: Basic Event Management

Introduction

 

When you think about it, email marketing is a lot like professional wrestling. Just hear me out. While there may be less spandex, both rely heavily on gimmicks. A pro wrestler’s gimmick is his/her character—how they behave, the clothes they wear, their entrance music, etc. The more committed a wrestler is to their gimmick, the stronger reaction (be it positive or negative) they get from the audience. We, as marketers, try to accomplish the same things with our emails. We want to create a clear image of our company, brand, etc. in order to generate a specific reaction. Looking at my campaigns through this lens has helped me better commit to their gimmicks, which has thus far led to greater engagement levels.

 

Marketing Challenge

 

The campaigns I work on are primarily for top-of-the-funnel leads—people who have, at best, limited exposure to my company’s offerings. At this stage, I’m just trying to get their attention. So I try to anticipate their problems and establish our salespeople as thought leaders by sharing collateral I think will be relevant to them. The gimmick is that these emails are designed to look like they’re coming directly from the salesperson. They aren’t supposed to look mass-produced; instead, they should look like they were written for this specific recipient by a real person. The footer at the bottom is an obvious tip-off that the emails are automated, but by the time the reader gets there, I theoretically have them hooked.

 

Approach

 

I thought I had a pretty solid gimmick, but when I ran the Campaign Analysis Overview report, the engagement rates were much lower than desired. I considered all the reasons individuals might not be engaging, and then a possible solution hit me. Out of sheer habit, I’d been inserting header images with my company’s logo at the top of the emails. But who takes the time to put header images in personal messages? Like a heel (wrestling slang for villain) signing autographs outside the arena before the show, I was conditioning my audience to question the reality I was trying to establish. Recipients immediately knew that these were marketing emails, and I don’t know about you, but I tend to skim, or completely ignore, marketing emails. Not what I was going for from a strategic perspective, and it showed.

As I learned in the Engagement course, A/B testing is a great way to get more people to read your emails. It gives you the opportunity to test a variable, and see which version performs better. It seemed logical for me to start by concentrating on opens; after all, you have to hook the recipient before you can get to those precious clicks. So for my next campaign, I created two versions of each email—one with the header image at the top and one without—and using this blog post as a guide, I set up an A/B test on the campaign canvas.
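As a side note for anyone running a similar test: a two-proportion z-test (a standard statistical check, not a feature of the blog post I followed) can indicate whether an observed open-rate difference is likely real rather than noise. The numbers below are made up for illustration:

```python
from math import sqrt

def open_rate_z_test(opens_a, sends_a, opens_b, sends_b):
    """Two-proportion z-test for the difference in open rates between
    email versions A and B. |z| > 1.96 is significant at the 95% level."""
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# Hypothetical numbers: variant A (no header image) vs. variant B (header image)
z = open_rate_z_test(opens_a=260, sends_a=1000, opens_b=210, sends_b=1000)
```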

 

Results

 

The results backed up my assumption: the emails without the header image got approximately 5-10% more opens. Since the emails are sent so early in the customer’s buying journey, I don’t have any stats to share regarding final impact to the business yet, but my long-term goal is to achieve open and click-through rates that are 10% higher than the standard for my industry (~21% and ~2.5%, respectively). This is an aggressive target, and this single change won't get me there, but I’m optimistic that a barrier to engagement has been removed. And every little bit helps.

 

More generally speaking, another goal is to win the trust of our sales team, which is skeptical of email marketing's capabilities. They want to control every interaction with potential customers, and understandably so. After all, their paychecks are directly tied to the end results. In part, earning their trust comes down to expectation setting. I've found that individuals who don't do email marketing for a living tend to have unrealistic expectations when it comes to engagement rates, so communicating average open and click-through rates (both for my campaigns and our industry as a whole) on the front end is essential. However, if I do my job and maximize the effectiveness of my campaigns, those numbers will appear more attractive during those initial conversations. Especially if I'm beating the industry average by at least 10%. 

 

This exercise helped me re-frame the way that I look at email marketing. Each email presents an opportunity to drive the recipient’s behavior, and if you’re relying on a gimmicky strategy (like I do), you can’t let any seams show. A/B testing affords me the opportunity to identify those seams, and increase the efficacy of my campaigns. That's important because successful email marketing could be a boon to my company, which essentially created the industry in which it functions. While we take pride in this heritage, it also limits our brand image to a certain extent. Potential customers tend to associate us with one service, while in reality we offer a wide variety of solutions. Email marketing gives us the opportunity to share our knowledge, which in turn can change people's perceptions of who we are and what we do. The potential benefits to the business are limitless—not only can we reach new customers, but we can inform existing customers of additional services in which they may be interested.

Issue:

Engage customers of a digital subscription to increase log-ins and video views

 

When launching a new digital subscription platform, we had the challenge of engaging customers in a new and different way. The proposition of this new subscription service is to reach customers on an individual basis with content that is accessible anytime, on any device. To achieve our goals, we needed a campaign that would 1) engage customers and 2) retain customers by increasing their engagement.

 

To develop our strategy, we looked into the best customer loyalty programs from both B2B and B2C companies. Building these tactics into our strategy and campaign from the beginning helped ensure we were building an engaged audience from day one. A specific goal for the customer loyalty program was to increase user logins through a direct email campaign, reaching a 50% user engagement rate for logins and video views. This goal was modeled from initial subscriber data, assuming a viewing ratio of 2+ views per login. Our theory was that increased audience engagement would create advocates at our clients who would support continued subscribership and annual renewals.
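The goal model above boils down to two ratios. A minimal sketch, with hypothetical subscriber counts, of how the 50% login rate and 2+ views-per-login targets can be checked:

```python
def engagement_metrics(subscribers, unique_logins, video_views):
    """Login engagement rate and views-per-login ratio, per the
    50% / 2+ views-per-login goal described above."""
    login_rate = unique_logins / subscribers
    views_per_login = video_views / unique_logins if unique_logins else 0.0
    goal_met = login_rate >= 0.5 and views_per_login >= 2
    return login_rate, views_per_login, goal_met
```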

 

Action:

Execution of a customer loyalty nurture campaign

We did research on various customer loyalty programs and strategies from industry leaders including Amazon, the Wall Street Journal, and Fabletics. Our team attended seminars with speakers from these companies and personally signed up for various subscriptions to see their customer journeys. Key tactics we discovered and successfully implemented included:

  • Personalization of all emails
  • Customized content
  • Use of GIFs and video
  • Customer surveys
  • Gamification
  • A/B testing

 

Since our company was already leveraging nurture campaigns for demand generation, it was easier to develop a canvas for customer loyalty goals. The subscription service provides new content weekly; therefore, we used a rolling four-week period to plan and adjust the canvas. This is an example of a four-week canvas for our new subscriber segment:

Eloqua blog chart 1.png

 

Test and repeat to create user habit

Developing a testing framework is critical. We set a pattern of four week sprints to test various attributes and templates. Monitoring data closely, we adjusted templates and utilized Eloqua features including A/B testing, personalization and dynamic content all designed to create top user habits among our audience. The chart below shows what worked, or did not, based on the data.

Eloqua blog chart 2.png

Key findings:

  • Templates with a GIF and one topic had the best results
  • Dynamic emails need more testing to improve the algorithm
  • 50% engagement reached on one email in 4 weeks
  • 80% of user logins came from Eloqua emails

 

Impact:

Pulling all your levers to get results

By starting with the customer journey and implementing best practices, we achieved the 50% engagement goal and hope to surpass it in the near future. Key levers for our program included:

  • Identify opportunities that are most relevant to your customer journey
  • Create urgency through customized content and market triggers
  • Determine the most optimal email strategy and templates
  • Create habit forming customer behaviors

 

Relevant training:

  • B2B: Engagement
  • B2B: Personalizing Campaigns
  • B2B: Custom Subscription Management
  • Best Practices: Differentiation Nurture Campaigns (WBT)

 

The results

Our results indicate that user activity increased from an average of 20% to 50%, and we believe these numbers will continue to climb as we implement new templates and best practices. This approach to customer loyalty contributed to the successful launch of a new digital subscription. We have been able to retain 100% of our users while adding new users. On average, 50% of our subscribers log in at least 2 times per month. Through our test-and-learn method we hope to further customize experiences and use dynamic features to increase user login metrics. In the first year this program led to $4 million in sales directly tied to subscribers. Eloqua helped us quantify these results and track users throughout the customer journey.

Starting anything new can be scary and daunting.  As we started our first campaign, there were many challenges we did not foresee that we would have to address and overcome.  One thing we had to keep reminding ourselves of is that change can be difficult, but it is a good thing, because it shows that you are growing and learning.

 

The Challenge

We would be running a one-time campaign to get our users to understand a new policy change that would be going into effect and inform them about how this change would affect them.  The challenge would be addressed with a campaign containing a series of emails informing them about the upcoming change, when the new policy would be going into effect and what they needed to do to make sure that they would not lose access to our company’s website program.

 

Goals

Our initial goals were to:

  • Target only current users of our company’s website program
  • Get 100% of users to respond whether they accepted or rejected the new policy within a one-month period
  • Track at what stage users were accepting or rejecting the new policy

 

With the above goals we would then be able to export the list of users and then keep them or remove them from our company’s website program for not complying.

 

Process

Implementation: Before Eloqua, we would have had to have each salesman reach out to their customers one by one to inform them about the upcoming changes.  We would have had multiple people trying to convey the same information in different ways to get an acknowledgement.  This previous approach was extremely ineffective, because we would have to chase down multiple people to get an update on progress.  With Eloqua, it was nice to be able to look at one campaign to get a snapshot of what was occurring.

 

Implementation Lessons Learned:

Before starting a new campaign in the future, we are going to make sure we have a clear direction of what results are expected to come out of the campaign.  As we started this campaign, it appeared that there were many different takes on what the outcome of this particular campaign should be. Moving forward, we will be having a kick off meeting to discuss the details of the campaign including the look of the email, the data results that are expected, and timeline. 

 

 

Segmenting: We built our segment from our company's website program data by selecting all users who had stated they needed access to the program.  We then split this larger group into smaller sections by region, so our people would have a visual of users in their area.

 

Segmenting Lessons Learned:

We learned that our company website did not have the most up-to-date information on our customers. The first email we sent out had a number of hard bounces.  This told us that we needed to keep email addresses and other information up to date and not just allow the information to remain static.  Since we had a lot of bounces in our campaign, we then had to contact the users and get updated information, which really slowed down our progress.  Before starting another campaign, it would be beneficial to do some data cleansing to make sure the information we are feeding into the system is useful.  The bounced contacts really slowed down our campaign; if we had started reviewing the data put into the system from the very beginning, we could have avoided the pitfalls that occurred.

 

 

Email: For this campaign, we sent out 4 emails.  The same policy information was presented in each email, but the email layout was changed and emphasis was added on the deadline for when the policy needed to be accepted.  The emails followed the sequence below.

 

  1. The initial email sent informed them about our new policy with a link to an acceptance page.
  2. The second email sent was a reminder email to accept the policy.
  3. The third email was similar to the second email, but with more emphasis that they needed to accept the policy or they would be removed from the company’s website program by a certain date.
  4. The final email informed them that they would be losing access to the program since they had not complied by accepting the policy.
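The sequence above amounts to a simple rule: which email a contact should receive next, given whether they have accepted and how many emails they have already had. An illustrative sketch, not the actual canvas configuration:

```python
# Hypothetical labels for the four emails described above.
EMAILS = [
    "initial: policy announcement with acceptance link",
    "reminder: please accept the policy",
    "warning: accept or lose access by the deadline",
    "final: access is being removed",
]

def next_email(accepted, emails_sent):
    """Return the next email in the sequence, or None once the contact
    has accepted or the four-email sequence is exhausted."""
    if accepted or emails_sent >= len(EMAILS):
        return None
    return EMAILS[emails_sent]
```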

 

Email Lessons Learned:

When we first started this campaign, we knew that we wanted to send four emails to those who never accepted the policy, so that they would not be able to say that we did not inform them about the upcoming changes.  However, we started to notice after the first email, that the design of the email was not clear in informing customers that they would need to click the email and accept the new policy.  Changes were made to the second and remaining emails to clarify that action was needed to accept the policy.

 

If we were to do this type of campaign again, the change I would make would be to not send the users to a landing page where they had to type in their information before submitting. Instead, I would do a blind form submission once they accepted the policy in the email and then send a confirmation email informing them that they had approved the changes.  Since the email campaign was not clear, we had users waiting until the third or fourth emails to finally submit because the email was too complex to follow.
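For reference, a blind form submit typically works by encoding the response into the email link itself, so one click records the acceptance. The sketch below builds such a link; the endpoint pattern and parameter names follow common Eloqua conventions but are assumptions that should be verified against your own instance:

```python
from urllib.parse import urlencode

# Hypothetical instance endpoint: verify the host, site ID, form name,
# and field names against your own Eloqua instance before relying on this.
FORM_ENDPOINT_TEMPLATE = "https://s{site_id}.t.eloqua.com/e/f2"

def blind_submit_link(email_address, site_id, form_name, accepted=True):
    """Build a one-click link that submits the policy form with the
    contact's data pre-filled, instead of routing them to a landing page."""
    params = {
        "elqSiteID": site_id,
        "elqFormName": form_name,
        "emailAddress": email_address,
        "policyAccepted": "yes" if accepted else "no",  # hypothetical form field
    }
    return FORM_ENDPOINT_TEMPLATE.format(site_id=site_id) + "?" + urlencode(params)
```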

 

Shared List: A shared list was very important to this campaign. It allowed us to quickly review and export the users who had clicked and accepted our policy.

 

Shared List Lessons Learned:

Shared lists were a last-minute addition to this campaign.  In the future, when we are determining the layout of the campaign, shared lists will be used more to break down the groups.  In this campaign, we only used one shared list, to capture customers who had accepted the policy.  It would have been helpful to have thought beforehand to set up multiple shared lists to place users into separate categories. If we had done it properly, we would have also set up shared lists to collect which users had rejected the policy, and which did not respond at all.

 

Form: A very simple form was used, as seen by the user.

  1. Field merges were applied to Email and Company only to keep this data consistent with what was already in the system.
  2. We chose to have users enter their First Name and Last Name because in our last review of the data, we found that many users had entered false data or skipped over these fields.
  3. We decided to use a single checkbox that was already pre-selected to make the user experience as easy as possible.

 

Form Lessons Learned:

Our form was easy to follow, but we just did not do an excellent job of communicating and getting people to this stage.  As was mentioned earlier, in the future we will work on performing a blind form submit instead of having users make their way to this page and then submit.

 

Form Processing: For form processing, there were a few things we wanted to make sure were included.  To start off, we wanted to make sure that the contact data was updated with form data in hopes to help us clean up our outdated data. We also wanted to send an email to the managers in each region when a user in their area had approved or rejected the waiver.  Finally, we wanted to direct them to a landing page to know that their form was submitted.

 

Form Processing Lessons Learned:

Looking back, there were a few things that could have helped with form processing.

We started to notice that our customers kept on resubmitting the form after accepting the policy, since they were not sure if their form submission had gone through.  Going forward, we will not only send them to a landing page informing them that they had accepted or rejected the policy, but also send a confirmation email to their inbox.

One other area we can improve moving forward is sending the emails to each region.  It came to our attention that the customer profiles had missing information for the region they were in, and therefore, in some cases, no one was notified when a user accepted the policy.  Going forward, we will work on cleaning up the data and setting a default user to receive emails for contacts that do not fall into any category.

 

 

Landing Pages: Landing pages were very important to this program. We wanted to make sure that users who unchecked the policy box really understood the consequences of what they were rejecting.  If users accepted, they were good to go!

Landing Pages Lessons Learned:

The simplicity of the landing page made it easy for users to know exactly what was occurring.  This is something that we learned worked well and will continue to move forward with the idea of simplifying the information that we provide and not try to shove everything we can on one page.

 

Conclusion/ Lessons Learned

Overall, for this being the first campaign our company tracked in Eloqua, it was a success!  We had 100% of all the contacts either accept or reject our policy through the campaign canvas.  However, this was only achieved by removing all the bounces and having sales make a handful of calls to customers who did not respond to any of the emails.

 

After completing our first campaign, and reviewing how it went, there was one area that we determined we would need to spend more time on in the future and that is the User Experience. The User Experience was something that we did not place a high priority on, as we were just focused on making sure that we had everything in place so the campaign would run properly.  Going forward, the User Experience will be improved by allowing more time for testing and getting feedback. 

 

As stated, going forward, more time will be spent testing the content.  The Test Content feature was a tool that we absolutely loved within the Eloqua system, but did not utilize fully. We were able to use this feature to test our emails to make sure everything was working, but sending this Test Content out to more people to review would have been beneficial.  We got into the trap of not letting fresh eyes see the campaign and give their feedback on what changes could help.

 

We will also try to utilize A/B testing in the future.  This will help with the user experience because we will be able to send our content to a small group to see which email they respond to best, and then send the rest of our users the clearest content.  This would have been beneficial in our current campaign because our first email was not very well received, and we could have caught this mistake earlier on.

 

Finally, we also found out that users like to receive immediate feedback.  It turns out that when they sent their submission, a landing page was not enough to confirm that their response had been recorded; an email stating the results would have been appreciated.

 

When we started out with this campaign, we had no idea of what we were getting ourselves into.  Now that we have taken the first plunge, the system is no longer intimidating, but easy to navigate and will help as we continue to grow and develop and learn what we can do to best reach out to our customers.

 

 

Useful Resources

B2B: Fundamentals of Emails
B2B: Fundamentals of Forms and Landing Pages
B2B: Fundamentals of the Campaign Canvas
B2B: Advanced Editing and Form Processing
B2B: Personalizing Campaigns
RPM: Targeting and Segmentation

Our Goal:

Re-engage retired leads using a nurture campaign and the Form Submit cloud connector.

 

Introduction:

It is a best practice to re-engage leads who have not shown any engagement for a given period of time. Identifying and re-engaging them keeps those leads active in the system and on the marketing list.

 

Marketing Challenge:

What is the business strategy that this supports?

E-mail re-engagement can be an effective way to reactivate previous leads or customers, particularly in B2C marketing, but also in industry-specific B2B marketing. Of top importance is timing. You don’t want to wait six months after initial contact to re-engage with them. Reactivation marketing should commence in the first thirty, sixty, and ninety days of the last contact, depending on the industry and target customer. Sending them a series of brief, engaging e-mails over a period of time will demonstrate to them that you miss them and care about winning back their business.
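The 30/60/90-day cadence can be sketched as a simple classifier that assigns each lead to a reactivation wave based on days since last contact; the wave names and cutoffs are illustrative:

```python
def reactivation_wave(days_since_last_contact):
    """Assign a lead to a reactivation wave following the 30/60/90-day
    cadence; leads beyond 90 days fall out of the active cadence."""
    if days_since_last_contact <= 30:
        return "wave-30"
    if days_since_last_contact <= 60:
        return "wave-60"
    if days_since_last_contact <= 90:
        return "wave-90"
    return "retired"
```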

 

How will the business benefit from re-engaging inactive leads?

Identifying and re-engaging inactive leads keeps them active in the system and on the marketing list.

 

Implementation:

Nurture campaigning is powerful in Eloqua. We used nurture campaigns, the Form Submit cloud app, a pre-built REST API process, lead scoring models, and Program Builder to create leads to pass on to sales. Two campaign canvases are involved in this process: one is the no-lead-flow campaign and the other is the lead-flow campaign.

 

Execution:

The two empty campaign canvases are available in Eloqua, with the metadata of the campaign codes present in the CRM system. These 2 codes flow into Eloqua with the help of the pre-defined REST API process. As mentioned earlier, one code is for the no-lead-flow campaign and the other for the lead-flow campaign. Now for the execution of the nurture campaign. First, we need to create all the assets, such as landing pages, emails, segments, and forms, to promote the offer in the nurture campaign. At the contact level, we decide whether the contact is a lead with the help of the campaign code, which is part of the contact-level data.

 

A form submission on the landing pages from the no-lead-flow campaign submits the no-lead-flow campaign code to the contact level. Each email should link to its corresponding landing page. For creating the segment, we need to upload the retired leads into Eloqua if they are held outside of Eloqua. If they are within Eloqua, we can pull the retired leads based on a condition such as “All EMEA Region” with the job title “Architect” and “no activity” for a certain period of time. Note: all these details must exist at the contact level or in a CDO to pull the data into a filter.
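The example segment condition (“All EMEA Region”, job title “Architect”, no activity for a certain period) can be sketched as a filter over contact records; the field names below are illustrative stand-ins for the contact-level fields:

```python
from datetime import date, timedelta

def is_retired_lead(contact, today, inactive_days=90):
    """Match the example filter: EMEA region, 'Architect' job title, and
    no activity for the given number of days. Field names are illustrative."""
    cutoff = today - timedelta(days=inactive_days)
    return (
        contact.get("region") == "EMEA"
        and contact.get("job_title") == "Architect"
        and contact.get("last_activity") < cutoff
    )
```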

 

In this case, we have a pre-defined Program Builder program for lead flow and a Form Submit cloud app which feeds it. The form chosen in the app has the email address and campaign code fields to update during the auto-submit.

 

Nurture Campaign steps are as follows:

 

No-Lead-Flow-Image.png

 

Once the nurture campaign is ready with all its assets, it can be activated. Once activated, contacts who are evaluated at the decision steps and move down the ‘Yes’ path (by engaging with the email sent) are moved to the lead-flow campaign, where the Form Submit cloud app performs a form submit automatically and adds the lead-flow campaign code to the contact level, so the contact becomes an active lead.

 

Those who have not engaged at all in the nurture campaign are simply moved to a shared list, which is similar to exiting the campaign.

 

The following flow chart details how the lead-flow campaign can be planned.

Lead-flow-image.png

 

2017-11-24_1648.jpg

 

Conclusion:

 

For retired leads who have not been sent any email or have not engaged with us at all, this is an ideal process to make them active again and flow them to sales. The process is automated, with little manual intervention. In our experience, at least 5-8% of those who were sent the email can be converted into active leads.

 

Helpful Marketing Cloud Courses:

  • B2B Segments
  • B2B Fundamentals and Email
  • B2B RPM
  • B2B Form and Landing pages
  • B2B Engagement
  • B2B Targeting
  • B2B Technology

Our Goal:

Our goal is to automate an event registration process by capturing the registration data in Eloqua with the help of a custom form and a custom data object (CDO).

 

Implementation:

Automating a third-party event registration process: we use an Eloqua landing page to capture registration data and post it to the third-party system with the Post Data to Server form processing step. After the event, the third-party system sends the attendance data back to us. The third party has set up an automated process that posts the attendance data back into Eloqua via a form, and the ‘Update Custom Data Object – with Form Data’ processing step stores it in a CDO. On the campaign canvas, the ‘Compare Custom Object Fields’ decision step then compares the data on a unique field to determine who attended and who did not, and depending on this unique field we send the post-event emails.

 

Execution:

  • We created two custom data objects, one for registrants and one for attendees; the comparison between them is done using a unique field. Using the CDOs, a decision is made as to whether or not a contact has registered.
  • Email assets contain registration links to an Eloqua form; after registration, the data is posted to the third party using Post Data to Server.
  • The first invite segment and email asset are built. Once people register, they move to a wait step; those who have not registered receive a second invite after the evaluation period.
  • The second invite goes to a fresh list of contacts, along with the people who haven’t registered from the first invite.
  • Contacts who still haven’t registered after the second invite are moved to a shared list.
  • Wait and decision steps send the reminder emails as per the requirements.
  • Once the event is completed, the third party sends the attended list back into Eloqua via a form submit.
  • We build a segment with the registered data from the CDO and, using Compare Custom Object Fields, decide which registrants attended; depending on the value of the unique field, the respective follow-up email is sent.
  • In the canvas settings, “Allow contacts to enter the campaign more than once” is selected.
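The attended/not-attended decision at the heart of this flow can be modeled as a simple comparison on the unique field. This is an illustrative sketch, not Eloqua's actual Compare Custom Object Fields step; all record layouts, keys, and email names are invented:

```python
# Illustrative sketch of the attended/not-attended decision made by
# comparing the two CDOs on a unique field.

registered = [
    {"email": "a@example.com", "event_id": "EVT-01"},
    {"email": "b@example.com", "event_id": "EVT-01"},
]
# Unique-field values posted back by the third party after the event:
attended = {("b@example.com", "EVT-01")}

def follow_up_email(record):
    """Return which post-event email a registrant should receive."""
    key = (record["email"], record["event_id"])
    return "thank-you-for-attending" if key in attended else "sorry-we-missed-you"

for record in registered:
    print(record["email"], "->", follow_up_email(record))
```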

 

Canvas.jpg

Conclusion:

  • This process reduces the need for human intervention.
  • Leads and registrations flow faster.
  • Registrants and attendees flow to the third-party system automatically.

 

Helpful Marketing Cloud Courses:

  • B2B Segments
  • B2B Fundamentals and Email
  • B2B RPM
  • B2B Form and Landing pages
  • B2B Engagement
  • B2B Targeting
  • B2B Technology

Events are a big part of a B2B marketing strategy. They've been around for a long time, and they're not going away.

 

From a Marketing Automation perspective, there's something that stands out about an event (versus other digital marketing efforts) - the 'offline' element of the campaign.

 

What do I mean by that? Well, as the Marketing Automation specialist:

  • I can control the invitation journey (Email, Paid Media, Social Media, etc.).
  • I can control the registration process (a beautiful landing page).
  • I can control the reminder emails to ensure people attend the event.
  • I can even control the post-event messaging.

 

With all of this, I can ensure the right people are getting the right message at the right time. I can ensure there is consistency in branding, in tone of voice, in user experience, etc. However, there are things I can't control: what happens on the day of the event. Have the contacts arrived at the event? Which sessions has each of the guests attended? Do the salespeople at the event know when their prospects have arrived?

 

Event.png

 

Now, some of you might have an amazing events team, or a third-party events partner, who has put a process or system in place that allows all of the above to flow seamlessly. But for those of you who don't (or who would like to save yourselves a few bucks), read on.

 

Let me tell you how we addressed the above questions using just Eloqua.

 

The first thing we want to address is the process of checking people in at the event. Not only do we want it to be a seamless process for the user (and for us), but we also want to make sure that each check-in is recorded online in real time. The key to everything here is a QR code. When somebody registers for the event, the confirmation email (and each of the reminder emails) contains a unique QR code that is generated on the fly within the email. This is accomplished using a free, open-source QR code generator found here. The QR code is made unique to each Eloqua user by doing a field merge with their email address. Here's an example of what the email might look like:

 

confirmation.PNG
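Under the hood, the per-recipient QR image can be produced by pointing an `<img>` tag at a QR image API, with the field-merged email address as the data. A minimal sketch, with the free goQR.me API (api.qrserver.com) standing in for whichever generator was actually used; in the live email the address would come from an Eloqua field merge rather than a Python variable:

```python
from urllib.parse import quote

# Sketch of how a per-recipient QR <img> could be assembled.
# api.qrserver.com is one free QR image API; the original post's
# generator may differ.

def qr_img_tag(email_address, size=200):
    """Build an <img> tag whose QR code encodes the recipient's email."""
    data = quote(email_address)  # URL-encode, e.g. '@' becomes '%40'
    url = (f"https://api.qrserver.com/v1/create-qr-code/"
           f"?size={size}x{size}&data={data}")
    return f'<img src="{url}" alt="Your check-in QR code">'

print(qr_img_tag("jane.doe@example.com"))
```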

 

 

 

When the user arrives at the event, they simply display their QR code (on their mobile device or printed on a piece of paper). Our event staff can scan this QR code using any free QR Scanner app on any mobile device (iPhone, Samsung, iPad, etc.).

 

This will pop up a very simple Eloqua landing page which will display the email address of the relevant user - and a few buttons allowing our staff to choose the action. When they arrive at the event, the action will be simply to check them in.

 

checkin.png

 

Clicking this button does a simple form submit to Eloqua. Of course, from here we can do all sorts of fantastic things such as:

  • Updating their lead score (for attending the event).
  • Updating their Campaign Status in our CRM (Salesforce) as 'Attended'.
  • Sending a notification email to the relevant Salesperson to let them know that their prospect has checked in at the event.
  • Sending a welcome email to the user with some relevant info.
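These follow-on actions can be modeled as a small dispatch function. This is a toy stand-in for Eloqua form processing steps, not real configuration; the score increment, field names, and notification format are all assumptions:

```python
# Toy stand-in for the form processing steps triggered by the
# check-in button.

def handle_check_in(contact, action="check-in"):
    """Apply the post-check-in actions and return the side effects."""
    events = []
    if action == "check-in":
        contact["lead_score"] = contact.get("lead_score", 0) + 10  # assumed weight
        contact["crm_campaign_status"] = "Attended"                # CRM sync step
        events.append(f"notify:{contact['owner']}")                # alert the salesperson
        events.append(f"welcome-email:{contact['email']}")         # welcome email step
    return events

contact = {"email": "jane@example.com", "owner": "rep@ourco.example"}
print(handle_check_in(contact))
```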

 

Pretty cool, right? But it doesn't stop there!

 

As you can see from the screenshot above, as a user walks into each session, their QR code can be scanned again (printing the QR code on the attendee's name badge saves them having to take out their phone or ticket every time). This means that we can track which sessions each user actually attends. This can be really beneficial information when sending follow-up content, and it also lets salespeople know which areas a particular user is most interested in.

 

And it doesn't stop there either!!

 

Something that our marketing teams wanted was a real-time snapshot of everyone who was at the event. Thanks to the scanning of the QR codes, this data could be accessed and viewed in real time. We created a simple Eloqua landing page which allows us to see how many QR codes have been scanned for each action, so the marketing team knows exactly what attendance looks like. This could be taken one step further by filtering the numbers by type of attendee (e.g. prospects vs customers).

 

 

What I've outlined above is a very simple and basic example of what could be achieved. There are plenty more things we've done (such as fall-backs for people who arrive without a QR code), plus a simple back-end to add options for the various sessions. The most important things are to make sure that:

  • The process is simple & streamlined ensuring a great user experience.
  • The data is being passed automatically and in real-time.

 

Courses that helped me to work through these ideas included:

  • B2B: Basic Event Management
  • B2B: Events in the Cloud

 

If you've any ideas or questions, I'd love to hear them!

Almost every company is guilty of taking their existing customers for granted.  After all, we got our sale, they got their product, and our product is awesome so everyone is happy now, right?

 

Not so much.  Even the best products have blind spots, customers have learning curves, and failing to meet their ongoing needs is a great way to ensure that their first purchase is also their last.

 

Our company has been as guilty of that as anyone, but this year we decided to change that.  Rather than waiting for the questions, complaints, or system changes, we used training attendance data to develop targeted emails for new users of our most popular products with tips and directions for making the most of their new systems.

 

The Challenge

 

Offer better support to our customers with targeted emails

  • Pull specific training data from registration data of over 100,000 entries
    • Product type (24 products)
    • Job Roles
    • Target only Introductory training
    • Target only recent training
  • Utilize this data for field merges and dynamic content
  • Create nurture paths of 4-5 emails for both leadership and system users

 

The Benchmarks

 

We determined that success would be established by:

Reach – our current benchmark on this was 0.

Response – unique clickthrough rate and unique open rate were established as our response metrics, as they would indicate that people were reviewing and accessing the information we were sending.

Benchmark unique open rate: 10-12%. Benchmark unique clickthrough rate: 1.50%.

Our average automated unique open rate is 25.90%, and our automated unique clickthrough rate is 3.40%.

 

 

The Process

 

The Data – We would be using the attendance data in two ways.  One would be in building our segments.  The other would be for field merges and establishing dynamic content rules.  Our challenge was twofold.

    1. All of the attendance data was being dumped into a single CDO via a CRM sync, regardless of training type, product, or training date.
    2. Some groups were attending multiple training sessions for different products, and we did not wish for the dynamic content to suddenly change mid-nurture.

 

We solved the first issue by creating picklists for the training types and products we wished to single out, filtering on those picklists, and adding a date filter to make sure we didn’t add anyone who had attended a training too long before our nurture launch.

 

We solved the second issue by first determining that nurturing would only occur on a contact's first training, to avoid redundancy. To ensure that no data was overwritten, we used two contact fields for merges and dynamic content, and set the data to write only on the condition that those fields were blank.

 

Contact-Field.png
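The write-only-when-blank behaviour can be illustrated in a few lines. This is a sketch only; in Eloqua it was configured as an update condition, not code, and the field names here are hypothetical:

```python
# Illustration of the "write only when blank" condition that keeps a
# later training from overwriting the first one.

def set_if_blank(contact, field, value):
    """Write value into field only if the field is currently empty."""
    if not contact.get(field):
        contact[field] = value
        return True
    return False

contact = {}
set_if_blank(contact, "first_training_product", "Product A")  # written
set_if_blank(contact, "first_training_product", "Product B")  # ignored
print(contact["first_training_product"])
```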

 

The Assets – We developed two nurture paths for people who attended the training. The first path was for leadership and comprised 4 emails. The second was for users and comprised 5 emails and one landing page. The amount of personalized content varied, from simply swapping out images and product names to making an email almost completely dynamic. We also wanted to make sure we retained our custom branded layouts for consistency.

 

  1. We determined that the simplest way to maintain consistency and the flexibility needed was to create static layouts with “modules” for personalized content.
  2. Field merges were used to populate the program names.
  3. Dynamic content was used to populate everything else.
    1. All emails used at least one dynamic content set, to swap out product logos.
    2. Others used up to 6 dynamic content sets, with everything being completely personalized.
    3. Each set had 24 active rules, one for each product.
    4. We found the most efficient way to build was to set up one dynamic content set first, finalize the rules, then copy it repeatedly. Then all we had to do was switch the content itself.

EMAIL.png

Dynamic-Content.png

The Canvas – Canvases were relatively simple. Only the leadership canvas required decision rules, as some contacts had opted out of the email groups for two of the emails on the canvas. We created a shared filter that pulled all opt-outs of that group and used it to divert the relevant people to the next email in the series. All wait steps were relative, holding people for days or weeks rather than until set dates. Segments were set to feed continuously, refreshing the data every day, so all training attendees received their first email within 48 hours of their training, even though training went on for several months.

Canvas.png
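Stepping back to the dynamic content sets: each one behaves like a simple lookup from product to content, with a default fallback rule. A toy model with invented product and asset names:

```python
# Toy model of one dynamic content set: 24 rules (one per product)
# plus a default. Product and file names are invented.

RULES = {f"Product {i}": f"logo_product_{i}.png" for i in range(1, 25)}
DEFAULT_CONTENT = "logo_generic.png"

def pick_content(contact):
    """Choose the content a rule set would serve for this contact."""
    return RULES.get(contact.get("training_product"), DEFAULT_CONTENT)

print(pick_content({"training_product": "Product 7"}))  # matches rule 7
print(pick_content({"training_product": "Legacy"}))     # falls back to default
```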

 

The Results

 

  1. As a campaign, the leadership canvas averaged a 32.14% unique open rate and a 6.24% unique clickthrough rate, outpacing both our normal emails and our automated emails. The top performing email of the series had a 41.22% unique open rate and an 11.46% unique clickthrough rate.
  2. As a campaign, the user canvas averaged a 32.84% unique open rate and a 6.85% unique clickthrough rate, also outpacing all our benchmarks. The top performing email of the series had a 43.95% unique open rate and a 15.02% clickthrough rate.
  3. We reached out to 150,000+ people with useful information about their new products
  4. Other business units have reached out to re-purpose our data for sending subject-specific newsletters. Metrics for these newsletters again surpassed our benchmarks, and the clickthrough rates of our training follow-up recipients were more than twice those of our other product users.

 

Conclusion

 

Overall, this campaign far exceeded any previous benchmarks and has increased the engagement of our existing users. The campaign is still ongoing, to continue nurturing future purchasers, and there are plans to expand the number of products. In addition, our Demand Strategists are planning to increase the number of dynamic nurture campaigns we deploy in 2018.

 

Useful Courses

 

RPM: Targeting & Segmentation

RPM: Effective Nurturing

B2B: Advanced Segmentation

B2B: Personalizing Campaigns

B2B: Integrating Custom Objects with Campaign Canvas

B2B: Effective Marketing with Custom Objects

How to: Continuously nurture leads with revolving marketing content

 

Marketing Challenge:

Lead nurturing is an evolving strategy that needs to coincide with changes in the market landscape and shift to accommodate newly developed marketing content and offers. Lead nurturing should also avoid over-emailing prospects and should be considered a secondary email strategy that complements more time-sensitive email communications like local marketing event invitations and webinars.

 

Goal:

Build a lead nurturing program that complements other email initiatives like product releases, webinar and event invitations, but doesn’t require any additional maintenance due to email-rate concerns. The program will always need to know whether a user has been emailed recently. In addition, build the lead nurturing program to allow for swappable content and the rearrangement of emails, to allow new emails to be added to or removed from the program, and to allow a new email or series of emails to be added as the next emails all users will receive.

 

Implementation:

Build a looping campaign that controls who is allowed to be in the campaign, tracks what emails have been sent, verifies a contact hasn’t received another communication recently, has time-control measures to determine the length of time between emails, and has an email assembly line that makes it easy to check and optimize the email flow.

 

The flowchart:

2017-11-16_16-37-52.jpg

 

 

Things to Understand

  1. This program checks whether a user has already been sent an email before sending it, to prevent users from receiving the same email twice.
  2. You can update the copy in any email. As long as you don’t replace the email in the campaign with another email, this type of update won’t be considered a new email, and a user who has already received the email will not receive the newly updated version.
  3. If you want to add an email in front of another email, just deactivate your campaign, break your flows, and add the new emails wherever you want.
  4. You don’t have to have a welcome email. If you don’t need it, just remove it from your flow.



Things to Determine

  1. Your email cadence, based on your content initiatives and strategy. The example above uses 14 days between regular emails, waits 7 days if another email went out to a contact in the system, and uses a 14-day delay before the welcome email.
  2. Safe to Send Today filter – again, this is based on your business. The example uses the filter criterion "contact has not received any email in the last 7 days."
  3. Allowed to be in Program filter – another exclusion filter, based on your business, that kicks a user out of the lead nurturing program.
  4. If you run out of emails, you can hold users until newer content gets created. Once you have new content, just push those users back through.
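Putting the cadence rules together, the looping logic can be sketched as a toy model of the flowchart. The 7-day quiet window mirrors the example above; email names are invented and this is not an actual Eloqua program:

```python
import datetime as dt

# Toy model of the looping flow: before each send, confirm the contact
# hasn't received any email in the last 7 days; then pick the next
# unsent email in the assembly line, or hold if content has run out.

QUIET_DAYS = 7

def safe_to_send_today(contact, today):
    """The 'Safe to Send Today' filter: no email in the last 7 days."""
    last = contact.get("last_email_date")
    return last is None or (today - last).days >= QUIET_DAYS

def next_email(contact, email_sequence):
    """Return the next unsent email, or None to hold for new content."""
    for email in email_sequence:
        if email not in contact["emails_received"]:
            return email
    return None

contact = {"emails_received": {"welcome"},
           "last_email_date": dt.date(2017, 11, 1)}
today = dt.date(2017, 11, 16)
if safe_to_send_today(contact, today):
    print(next_email(contact, ["welcome", "nurture-1", "nurture-2"]))
```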

 

Conclusion

With rising CPL acquisition rates, reducing unsubscribe rates and providing valuable content should be the goal of any lead nurturing program. Marketing should never build campaigns in a silo without considering the ramifications for all marketing communications. Knowing that marketing is always dealing with a changing landscape, it's best to assume that unplanned communications will happen and will need to go out regardless of what was planned. By building smarter programs that embrace this reality, your organization can work together as one team and one engine instead of working against itself because of unforeseen challenges.

 

Helpful Marketing Cloud Courses:

  • B2B Engagement
  • B2B Targeting
  • B2B Conversion
  • B2B Analysis
  • B2B Technology
  • B2B Fundamentals and Email
  • B2B RPM

 

 

Bonus items to consider:

  • Wait Steps could be modified based on a Custom Preference Center and user email frequency settings
  • A filter could be added to the program to route users that meet a certain Consideration Stage lead score to a consideration stage email program

INTRODUCTION

 

In our brand, tasks were historically generated in Eloqua (and then integrated into our CRM system) by two different programs, depending on the type of task to be created. Both programs were made of overcomplicated, non-optimised workflows whose end product (the tasks generated) no longer satisfied our business requirements. The Sales team's feedback was that they were overwhelmed by the very high volume of generic, unallocated, unprioritised and sometimes recurring tasks being created in our CRM system. Important tasks went unflagged and unnoticed among the total volume of tasks created, leading to them not being actioned or passed on to the right person, which ultimately had a clear effect in decreasing business revenue.

 

MARKETING CHALLENGE

 

Due to the adverse effect on revenue, both the Marketing and Sales departments considered acting on this feedback a matter of priority and it was clear that our challenge was to streamline and optimise the existing task generation process so tasks:

 

-were generated from a single, simplified and optimized Eloqua program in all cases.

-were created on the basis of task priority (as being determined by the Marketing Department) so only relevant tasks were generated. This was expected to have a positive effect on reducing task clutter.

-were created hierarchically according to their relevance to the Brand’s business objectives and allocated in a way consistent with Brand's existing lead allocation process.

-were correctly integrated into our CRM system and displayed clearly in a decreasing order of priority (from "1" to  "5") to make it clear the order in which they should be actioned by the Sales Department.

 

THE PROJECT

 

We piloted this project, by creating clearly labelled and prioritized tasks to flag any leads in Eloqua that would correspond to:

 

- Key Decision Makers (i.e. any leads with a "Job Level" of "Senior Manager" and above) AND

- displayed a lower level of engagement with our brand (defined as leads with a value of "A3" in the "LS: Combined Rating" field in Eloqua).

 

This group of contacts was identified by both Sales and Marketing as a highly desirable group of prospects who were most likely to convert; they were therefore allocated to specific members of the Sales team with a priority of "1" and a clearly visible description. The word “pilot” would appear in the “comments” section of the task when referring to this group.

 

Step 1  

 

Our first step was to create a new Eloqua field called “HR Priority (text)” that we would map to the “HR Priority” field that already existed in our CRM system.

The values displayed would be identical in both fields, and the mapping would ensure that the priority associated with the tasks created was clearly displayed in our CRM system.

 

Step 2

 

Our next step was to amend all the relevant external calls in Eloqua that were going to be required to create and integrate the "A3 KDM" tasks and to ensure that all the relevant fields in Eloqua including “HR Priority (text)”were adequately mapped to the relevant fields in our CRM system.

The mapping of external calls is updated in Eloqua under: Integration > Outbound > External Calls > (select specific call) > Options > View Field Mapping.

 

Step 3

 

After making the relevant changes above, we visualised how the information in the tasks created in Eloqua would display in our CRM system by testing all the relevant external calls involved in the integration process.

This allowed us to confirm that the tasks would correctly display the relevant information in our CRM system (as per the requirements defined in the briefing phase of the project) and that the tasks were allocated to the designated Sales Agent.

 

 

To carry out the testing, we entered a prospect's email address and clicked the “execute test” button.

 

Step 4

 

Once we were satisfied that all the information displayed in the task would be correctly integrated with our CRM System, we ran a comprehensive, live test including all the possible scenarios to ensure that the tasks display adequately in Salesforce and the allocation is also correct.

 

Below is the result of our test as displayed in our CRM system:

Step 5

 

Once we had checked that our first type of "pilot" tasks had been successfully created in Eloqua and integrated with our CRM system, we set out to replicate the same process to recreate the rest of the prioritized tasks selected in the Marketing briefing. 

 

Step 6

 

The next step in our project was to consolidate the two current task programs used in Eloqua and transform them into a single, optimised, streamlined and fully revised task program that would satisfy the requirements established by the Business, namely, that only high priority tasks would be hierarchically created.

 

It was decided by the Marketing Operations team that, to achieve this in an optimal way, all tasks would be hierarchically grouped by "Priority" within the Eloqua task program rather than by "Type", as had always been the case.

Tasks with the same priority would be captured by specific filters within the corresponding decision points in the program. Those filters would look at the information displayed in the “Task Indicator” field in Eloqua, which was unique to each of the possible actions a prospect could take (i.e. taking a trial).

 

We sought to further optimize the task creation process by leveraging this unique value displayed in the “Task Indicator” field in Eloqua and used it in the "Subject" field of the task, therefore singling out the specific action taken by the prospects that had resulted in a particular task being created.

 

RESULT

 

As a result of the implementation of this new program, Sales reported:

 

- an 87% decrease in the volume of duplicated tasks created

- a 95% decrease in non-descriptive tasks generated

- a 68% increase in the number of highly relevant tasks being worked on in time, due to the prioritization of tasks

- to date, 2 extra deals closed (worth just under £400k) as a direct result of the Sales Reps working high-value tasks

 

CONCLUSION  

 

The strategic planning of the new task creation program took a big chunk of our time during this project, as did ensuring that the new program fully satisfied the requirements specified by Marketing. It was important to arrive at a solution that was simple yet effective, comprehensive enough to cover all required task-creation scenarios, and robust enough to be scaled if necessary.

 

 

As with all processes, we rigorously tested all possible task scenarios involved using newly created testing contacts and integrated them as we would normal leads. In order to complete the testing, we worked with our allocation/integration program to ensure that the testing contacts would successfully appear in our CRM system and that both the tasks and the leads were allocated correctly as per specifications. We drew a lot of expertise from Topliners when dealing with integration queries and issues and also when creating dynamic external calls. The B2B Eloqua course on Integration offered valuable information and know-how.

Introduction: -

It is always better practice to associate a score (A1 through D4) with every lead passed over to Sales; this helps the sales rep know which lead to action first. In Eloqua, this can be achieved through lead scoring models. It is very rare that a lead scoring system is perfect from the moment it is implemented; it is all about improving your scoring models frequently. As long as we continue to make adjustments, the lead scoring system will stay on the right track. However, there will always be challenges in how effectively we can make changes to the scoring models.

 

Profile_Engagement.jpg

 

 

The scope of this blog is restricted to the scoring technique we used to score low-, mid- and high-engagement responses in our Eloqua instance.

 

 

Marketing Challenge: -

When Eloqua was introduced to us, we started scoring customer activities on form submissions, which made scoring pretty straightforward. Scoring form by form was easy in the beginning, since we had few forms and few scoring models in the system. Later, when more forms and multiple scoring models were introduced, changing the models became a herculean task: whenever a new form was created, we had to add new rules to multiple scoring models, and more rules in the models decreased their performance as well.

 

 

Goal:-

We had to come up with a mechanism where form activities are efficiently scored across multiple scoring models. We also had to reduce the manual work done by the team when making changes to the scoring rules.

 

 

Approach:-

We solved this challenge by introducing placeholders for low-, mid- and high-value forms.

 

     1.  We introduced folders for each form category in the Forms section in Eloqua, segregated the forms according to priority (High, Mid, Low), and placed them in the respective folders.

 

Folders.jpg 

 

 

   2. People who were asked to create new forms were instructed to place them in one of these folders (if the form is intended for lead creation).
   3. We added the form folders to our scoring models instead of the actual forms, and gave different weights to low-, mid- and high-engagement activities.

      

           i) In each scoring model, under the Engagement section, we created three Submitted Form rules, one for each form category.

 

         HighMidLow1.jpg

             ii) and added the necessary rules under it.

              HighEngaged.jpg

 

   4. Forms which were not intended for lead creation were placed in a separate folder so that they would not be scored at all.
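Conceptually, the folder-based scoring works like a two-step lookup: form → folder → weight. A sketch with invented form names, folder names, and weights (the real configuration lives in Eloqua's scoring models, not code):

```python
# Sketch of folder-based scoring: the weight comes from the folder a
# form lives in, not from per-form rules.

FOLDER_WEIGHTS = {"High Engagement": 30, "Mid Engagement": 15, "Low Engagement": 5}

FORM_FOLDER = {
    "request-a-demo": "High Engagement",
    "webinar-signup": "Mid Engagement",
    "newsletter-optin": "Low Engagement",
    "internal-test": "No Scoring",  # not intended for lead creation
}

def score_form_submit(form_name):
    """Look up the engagement points a form submission earns."""
    folder = FORM_FOLDER.get(form_name, "No Scoring")
    return FOLDER_WEIGHTS.get(folder, 0)

print(score_form_submit("request-a-demo"))  # scored via its folder
print(score_form_submit("internal-test"))   # unscored folder -> 0
```

The payoff is the same as described above: adding a new form only means placing it in the right folder; the scoring models themselves never change.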

 

 

 

Results: By creating the form folders, we achieved the following:

 

   1. Less downtime: whenever a new form is introduced, we just place it in the appropriate form folder instead of changing form rules in the scoring models, so there is no downtime at all from activating and deactivating the models.
   2. Performance improvement: we reduced the number of rules in each model, which in turn improved model performance.
   3. Faster changes: changing the models became faster, since we don’t have to modify every form submission rule.
   4. Right scoring: whenever a new form is introduced, there is no need to change the scoring models, since the Low, Mid and High engagement sections score the responses correctly across all models.

 

 

Conclusion:

By introducing Low, Mid and High engagement form folders, we have not only reduced the amount of work the team spends modifying the scoring models but also standardized the process for changing them. Through BI dashboards, the Sales and Marketing teams have a good view of how low-, mid- and high-engagement responses are being scored, which also helps stakeholders make business decisions on when to change the scoring rules.

 

 

Useful Courses:-

  • Fundamentals of Forms and Landing Pages
  • Advanced Editing and Form Processing
  • Lead Scoring

Marketing Challenge and Goal

Lead nurturing is more than just piecing together content in a multi-touch email campaign with the intent of getting contacts to purchase. You need to recognize the journey your potential buyers take as they identify their need, learn about and evaluate solutions, and ultimately make the decision to purchase. Our digital communications team has recently been focused on converting dormant contacts from a specific market segment in our Oracle Eloqua database into qualified sales leads. The first attempt at a nurture campaign did not provide the results we were looking for, so we needed to revise our approach. We needed to better guide this segment through the buyer’s journey to get the right content to the right contacts, at the right time.

 

Benchmarking our Initial Efforts

The first iteration of the nurturing campaign for this market segment did not prove to be as successful as we hoped for a couple of reasons:

 

  • We didn’t include any way of soliciting additional data from a contact, which meant they could make it all the way through the campaign and we might still only have their email address.
  • The campaign was set up so the email sends were spread a week apart, so a contact had no way of moving through the campaign more quickly if they were ready to consume the next piece of content.
  • The results were simply not there. We generated fewer than five sales leads from the nearly 500 contacts that entered the campaign, with a 29% unique open rate and a 6% unique clickthrough rate across six emails overall.

 

We needed to improve the campaign and start seeing better results.

 

Designing and Implementing our Revised Campaign

Our team reviewed the overall nurture campaign and associated assets and came up with three improvements that needed to be made:

 

1. Our first step was to revise the email content to better align with the buyer’s journey. We also needed a way to gather more data from the contact. We removed one email and made updates to the five existing emails:

 

    • Email 1 – “Need Recognition”: We introduce the audience to our solution and create the need. This included a free offering that could be obtained by filling out a simple form.
    • Email 2 – “Learn”: We explore the potential cost savings of our solution.
    • Email 3 – “Alternative Evaluation”: We compare our solution to the competitors’ solution and outline how we are superior.
    • Email 4 – “Purchase Decision”: We introduce our tools that can help them determine the appropriate product.
    • Email 5 – “Purchase Decision”: We give the contact the opportunity to book a free consultation with our dealers.

 

2. Next, we needed a lead scoring program that was aligned with the content in the buyer’s journey. The calls-to-action in the revised emails drive contacts back to our website, as this is where we host the majority of our content. Therefore, we devised a lead scoring program that relies heavily on page tags to score contacts based on what content they view on our website.

 

screenshot-1.jpg

 

3. Finally, we needed to revise the campaign to get contacts the right content at the right time. By using the “Shared Filter Member?” decision step with an evaluation period of seven days after each email step, we could immediately move the contact to a different email step in the campaign if their lead score determines they are ready for a piece of content in the next phase of the buyer’s journey. If a contact’s lead score doesn’t change within seven days, they will continue down a linear, time-based track and receive the next piece of scheduled content.

screenshot-2.jpg
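The fast-track behaviour of these decision steps can be approximated as: after each email's evaluation window, if the contact's lead score already qualifies them for a later stage of the journey, jump there; otherwise continue on the linear track. A toy sketch in which the stage names and score thresholds are assumptions, not our actual model:

```python
# Toy approximation of the "Shared Filter Member?" fast-track: jump a
# contact to the furthest journey stage their lead score qualifies
# for, otherwise continue linearly to the next scheduled email.

STAGE_THRESHOLDS = {"learn": 20, "evaluate": 40, "purchase": 60}
STAGES = ("learn", "evaluate", "purchase")

def next_step(score, current_stage):
    """Pick the next email stage for a contact after the wait period."""
    i = STAGES.index(current_stage)
    for later in reversed(STAGES[i + 1:]):
        if score >= STAGE_THRESHOLDS[later]:
            return later                                    # fast-track
    return STAGES[i + 1] if i + 1 < len(STAGES) else None   # linear track

print(next_step(65, "learn"))  # qualifies straight for the purchase stage
print(next_step(10, "learn"))  # stays on the linear track
```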

 

New and Improved Results

We activated the latest iteration of the nurture campaign in late August 2017 and have run nearly 350 contacts through. We have seen encouraging improvement on key metrics:

 

  • Overall unique open rate: 38%
  • Overall unique clickthrough rate: 10%
  • Form submits for free offering: 28 (8% of contacts)
  • Free consultation requests: 16 (4.5% of contacts)

 

Future Improvements for Greater Success

The continual review and revision of this nurturing campaign is critical to ensuring we fully understand the buyer’s journey for this market segment. Here are three areas we need to improve upon in the coming months:

 

  1. We can report on email metrics, form submissions, and contacts sent to CRM; however, we are having a hard time identifying HOW contacts are moving through the campaign. Knowing the path a contact takes through the campaign will give us an even better understanding of the buyer’s journey and will help us revise our email content further.
  2. At this time, we are only sending contacts to CRM if they request a free consultation. We need to continue working with sales teams to identify contacts that may be ready for purchase without requesting a free consultation.
  3. Integrating profile data into our lead scoring program will help us discover key decision makers in the buying process and will add another layer of data we can use to determine if a contact may be ready to purchase.

 

B2B Training Recommendations

While all of the B2B training courses played a role in the development of this campaign, the two I would recommend the most would be: the Effective Nurturing topic in the Modern Marketing: Revenue Performance Management (RPM) Series and the Lead Scoring topic in the B2B: Conversion course. These topics not only provide the technical knowledge of how to execute in Oracle Eloqua, but they get you asking the right questions so you are better prepared for success.

Marketing Challenge and Goals

It’s no secret that the relationship between Sales and Marketing teams within an organization isn’t always the best. After discovering the wonders of Profiler and Eloqua Engage, we knew we had a great opportunity to help better this relationship within our own organization. After learning how to navigate the tools ourselves, we set out to get Sales on board with these new tools.

 

We had a few goals:

  • Work with Sales to create lead scoring models for each of the different business units within the organization to help Sales prioritize leads/contacts
  • Educate Sales on the new tools and how these tools will help them close more deals
  • Strengthen our relationship with Sales to where both teams understand how we are helping each other and why it’s important

 

Benchmarks

When we began this process, we didn’t have many Sales users set up with Profiler and our Engage template library was bare. Since we were starting almost from scratch, we didn’t set out to hit a specific number of new users or templates. Instead, our broader goal was to get more users access to the tools and create as many Engage templates as we saw fit.

 

Planning and Implementation Process

In order to accomplish these goals, we followed these steps:

  • Used the information from Eloqua trainings to learn how to set up scoring models, how to use information within Profiler, and how to navigate the Engage tool
  • Created a PowerPoint presentation to use for training Sales that included:
    • What Eloqua is
    • How Eloqua works with Salesforce
    • Where to find Profiler within Salesforce and what information it provides (lead prioritization based on lead score, digital body language of leads/contacts, view exact emails/landing pages people are interacting with)
    • Where to find Engage and the benefits of being able to track opens and clicks, as well as the ability to send customized emails to multiple people in one send
    • How to set up Alerts and what to expect after enabling them

               ppt.png

  • Scheduled meetings with the Sales Leadership groups within each of our Business Units
    • In these meetings, we worked together to create a tailored Lead Score Model that matches up with the criteria they look for in a lead or contact.

                    leadscore.png

  • Worked with the Marketing Managers for each of the business units as well as our copywriting team to create Engage templates and upload them into the system for Sales to use
  • Scheduled meetings with the different Sales teams and set everyone up with access to the tools
    • During trainings, we used the PowerPoint we created, but as we did more trainings, we realized that live demos within the tool were also super helpful in getting our points across

                    profiler.png

  • Addressed questions and concerns during training sessions and made sure the teams knew that if they ever ran into issues or had more questions, they could reach out to us directly

 

Conclusion

After looking at reports, we have increased the number of emails sent through Engage from 105 in 2016 to 599 in 2017. We have also increased our unique clickthrough rate from 3.85% to 11.51%. We credit the increase in clickthroughs to working more closely with Sales to refine the library with content they want to send out.

 

Throughout the process we have been met with excitement from the Sales teams and have been creating more Engage templates each week. The feedback from Sales has been great: they have truly seen the benefits of using Profiler to see the digital body language of leads and contacts, and the information from these tools has helped them have specific, tailored conversations on their calls. With these tools, Sales has been able to close more deals, which helps us on the Marketing team reach our Marketing Influenced Revenue goals for the year.

 

Eloqua Resources

We regularly turned to Topliners when we needed extra help; the community there is extremely helpful and friendly. The B2B Conversion and B2B Targeting courses also played a big part in our successes. The live training courses are jam-packed with information – I definitely revisited my notes and student guides many times.

 

Future Plans

This initiative isn’t over, but we’re excited that our relationship with Sales is stronger than it was when we started. We still meet with Sales teams for trainings on average about once a month and we continue to create new Engage templates to grow our template library. We remind the teams that we are open to refining the Lead Score Models if necessary and we keep the communication channels open for when questions come up. We hope to continue to increase engagement with these tools in the years to come.

It hasn’t been an easy process, but it’s definitely been a huge win that is worth the effort.

Our strategic initiative was to provide an efficient Global Project Methodology and an adaptable Industry Specific Implementation Framework to support eight Global Business Units in rolling-out Oracle’s Marketing Automation Tool, Eloqua.

 

The challenge for 2017 was to implement Eloqua in eight business units spread globally across Asia, the Middle East and Europe.


The Business Goals

 

A summary of the high-level business goals set to be addressed by the programme follows:

  • Increased capabilities in lead generation, nurturing and conversion
  • Improved alignment and effectiveness of Sales and Marketing
  • Improved skills and experience in customer journey design 
  • Improved consistency and standardisation for best practice sharing
  • Increased intelligence on customer behaviour and engagement
  • Improved customer experience from behaviour based messaging
  • Improved campaign optimisation and effectiveness

 

The business goals are documented in a full and detailed SMART business case, which was approved by the programme steering committee.

 

The Global Approach

 

Step 1: Global Terms.  The first task, prior to commencing implementation, was the consolidation of a set of global terms and policies.  These were developed with the BUs so that Sales and Marketing could establish a common language and business approach.

 

Step 2: Global Campaigns.  With this foundation in place, a set of Global Marketing campaigns were designed with the BUs, IT and Oracle.  In total, 25 global campaign designs across the Customer Journey were produced, which would enable collective company-wide implementation, reporting and refinement of campaign performance.  The below graphic shows an overview of the assets/materials produced as part of the above two steps:

 

Step 3: Global Data Layer.  With the global terms and the campaign designs established, we set about designing a Global Data Layer that would support the associated data and segmentation requirements and that would provide a data integration gateway.  This involved the development of nine Custom Data Objects (CDOs) to support the business needs for multiple Events per contact.

 

Step 4: Global Implementation Framework.  With the designs in place, we needed a repeatable Implementation Framework.  This consisted of documented templates and guidance on Organisational Change, Best Practices, Education, set-up of Eloqua Campaigns, Assets, Data Layer and Integrations, and a menu of Oracle/Partner Implementation Services to support the BUs with the implementation work.  The below graphic shows an overview of the interrelated nature of data integrations through to campaign executions (1-7); below that is a summary of the related Implementation Framework assets/materials (A-F):

 

 

Step 5: Global Implementation Methodology.  Finally, before implementation could commence and with the designs and framework ready for use, a repeatable Project Methodology was required to provide BUs with templates and guidance on project governance, planning, prerequisites, roles & responsibilities and lessons learnt.  The below graphic shows an overview of the project approach:

 

 

Step 6: Local Project Scope.  The initial implementation goal for each Business Unit was to roll out a reduced set of the globally standardized campaigns, making use of the globally standardized data layer and industry-specific Custom Data Objects for a single pilot event.  The recommended global campaigns for the pilot projects were:

      • Visitor Awareness Campaign
      • Visitor Acquisition Campaign
      • Visitor Retention Campaign

 

Step 7: Local Project Support.  Throughout all stages of the projects, the Business Units received the Central Programme Team’s guidance with on-site/remote hands-on support, as well as access to Oracle services, the Oracle University Marketing Cloud and the Topliners community.  BUs received documentation and guidance related to modern marketing, marketing automation and organizational change, as well as data layer design, configuration, integration mapping, and the set-up and testing of the global campaigns and asset configuration in Eloqua.

 

What have we learned

 

In terms of lessons learnt, we have conducted regular reviews of KPIs and campaign performance along with periodic post-implementation project reviews.  These learnings are regularly shared with the global BU network, e.g. through internal bi-weekly Super User group meetings, which are attended by two Super Users from each of the eight BUs.

The key implementation lesson is to ramp up slowly: understand the required organizational and process changes, and use the training and design project stages to learn and plan carefully how you will transition to Eloqua.

Our top 10 Lessons Learnt:

    1. The campaign manager needs both marketing and technical skills
    2. Project members allocated to the project – backfill BAU
    3. Need full support of local IT– backfill BAU
    4. Engage all stakeholders up front on the campaign process
    5. Candidate Events for Eloqua – willing/receptive, capable, supportive
    6. Training must be undertaken as a priority
    7. Pilot should only be one event with rollout one by one
    8. Project planning should start from Registration open date (Visitors)
    9. Add a project buffer to deal with unexpected systems/data challenges
    10. Start with global campaigns and basic segments – optimise from there 

 

The global implementation of Eloqua used separate instances per BU.  This solution design was chosen over a single instance of Eloqua as a result of an evaluation with Oracle, based on the need for global process consolidation, the need for a single global view, and considerations related to implementation and operational complexity.  This approach enabled flexibility and resulted in accelerated implementations of Eloqua.

It should be noted that there are data volume limits for CDOs that will need to be considered as part of initial design evaluations.

 

Campaign Performance (Work in progress)

 

In 2018, the programme shifts focus from Eloqua implementation projects to Eloqua optimisation projects.

For each of the Business Units’ campaigns, a number of KPIs were benchmarked in 2017, i.e. from data available prior to Eloqua and then from Eloqua every two weeks after launch.  Each of the BU implementations is now live and in the process of launching campaigns for the pilot events.  It is therefore planned that in the first quarter of 2018, during the post-pilot-event Eloqua Optimisation phase, campaign performance statistics will be available in the following KPI categories:

 

  • Bounceback Rate (hard and soft)
  • Unique Clickthrough Rate (CTR)
  • Unique Open Rate (OR)
  • Unique Click-to-Open Rate (CTOR)
  • Conversion Rate
  • Unsubscribe Rate

 

 

As an example, early results show that for one of the Business Units, unique open rates for Acquisition and Awareness Campaigns grew by 13% while at the same time delivering more personalised and targeted segmentation, resulting in a 50% reduction in the number of emails sent.

 

Oracle University Marketing Cloud Courses

 

One of the project implementation prerequisites for each BU was for all the nominated Super Users to complete the B2B Masters Accreditation in 2017. B2B Master Requirements

 

For the BU Super Users with technical responsibilities, a technical training path was established, which included completing certain B2B Luminary courses e.g. Database Configurations, Custom Data Objects and Integrations.

 

A 2018 Eloqua business optimisation objective is for each BU nominated Super User to complete the B2B Luminary Accreditation. B2B Luminary requirements

 

The Oracle academy courses of most relevance during 2017 were as follows:

 

  • Academy Course – B2B: Targeting (2-day course)
  • Oracle Eloqua: Data Cleansing
  • Academy Course – B2B: Engagement (2-day course)
  • Academy Course – Database Configuration
  • Academy Course – B2B: Technology (1-day course)
  • Academy Course – B2B: Analysis (1-day course)
  • Modern Marketing: Leadership Metrics
  • Modern Marketing: Email Deliverability

Marketing challenge:

A lot of marketing teams still guess about when is the right time to pass a lead to sales. Is it after the prospect downloads that 17-page e-book? After they click on a few emails? Or after they spend 45 minutes browsing your site and visit your pricing page? We noticed that without a consistent framework in place, our MQLs (marketing qualified leads) will be hit or miss, and our sales team will waste a lot of time pursuing leads that don't convert.

 

Our Goal:

Our goal was to create a lead scoring model to facilitate constant communication, allowing sales to clarify which leads are the most beneficial to them while helping marketing to generate effective content and outreach efforts specifically targeted to those groups. Also, the goal was to make sure the Oracle sales team isn’t spending their time chasing leads that won’t turn into money.


Benchmark:

A lead scoring system can (and should) do a lot of the heavy lifting when it comes to filtering out leads that won’t convert. As a result, it will help shorten the time required to complete the overall sales cycle.

 

Hence, we decided to use Eloqua lead scoring to create our scoring models, based on the agreement signed by both sales and marketing.  Eloqua lead scoring allowed us to assign scores to our marketing leads based on the profile of the contacts, as well as the engagement activity carried out by that contact. This helped us determine which ones are ready to be sent directly to Sales and which ones still need nurturing in Marketing.

 

Implementation:

Before we could get started with any lead scoring strategy, it was imperative that both our marketing and sales teams sit down and establish the criteria for what makes a qualified lead for our company.

For this, marketing and sales came together to determine which demographics, activities and behaviors make a lead more qualified – specifically, what we in Marketing like to call an MQL (marketing qualified lead).

For example:

  1. Which demographics are more likely to buy than others?
    • CEOs or other C-Level executives?
    • Do they represent smaller or larger businesses?
    • What kind of revenue are they bringing in?

 

Once both teams successfully agreed on the point values and marketing activities that qualify a contact as a lead, we further decided what score a lead should reach to be considered sales-ready.

 

 

The whole goal here was:

  • To avoid scaring leads away before they are ready to speak with sales.
  • Make our sales process more efficient by enabling our sales team to work with only the most qualified of leads.

 

Execution:

 

Lead scoring in Eloqua is probably the easiest we have found in any marketing automation platform.

There are two views when deciding on what criteria to be scored:

(1) Profile – this covers information about the contact: the company they work for, their title – simply put, the kind of information you would find on their LinkedIn profile.

(2) Engagement – this covers how the contact interacts with content that marketing has sent out or created. This could be pages on the website people visit, webinars hosted by the company or even trade shows.

 

These two aspects are the backbone to Eloqua’s Lead Scoring. We call it the Lead Scoring Model.

 

So, let’s get into the nitty-gritty of what this is all about…

 

Profile: This is data driven. The data that is hosted in Eloqua can be used to examine if the person is the right person to make the purchase.

Here, we consider Title, Job Function or Job Role, and Industry (this is another great one, because specific industries often have a higher tendency to buy).  Region, Country, or other fields unique to your business also play an important role in scoring the profile. The image below represents creating the profile section of scoring in Eloqua using scoring thresholds.

 

Engagement: This is activity and recency driven. Activity is the interaction with marketing materials that have been sent to the contact or that they have sought out on their own.  While activity is based purely on what the contact did, recency factors in how recently the contact interacted.

For example, if a contact clicks on an email that was just sent but doesn’t interact with anything else, their engagement score will eventually go down because they haven’t interacted with any marketing material within a specific time frame. Some of the most common engagement criteria we look at are email clickthroughs, website visits, form submits, and page tags (there are many others that can be used as well).

The actual score is system-set, meaning you will always have a score generated by Eloqua that combines a letter (A–D) and a number (1–4). The piece that you can configure is the point span each grade covers. The image below shows the sliding scale that indicates what range each letter or number pertains to. You might want D1 to be smaller than the rest, or you might want C/B and 3/2 to be narrow. Either way, the scale is set out of the box as you see in the screen capture, but it is customizable to whatever ranges you deem more appropriate for a score.
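The sliding-scale idea can be sketched as follows. The threshold values are placeholders we made up, not Eloqua’s out-of-the-box ranges:

```python
# Illustrative point spans: adjust the floors to widen or narrow each grade.
PROFILE_THRESHOLDS = [(40, "A"), (25, "B"), (10, "C"), (0, "D")]
ENGAGEMENT_THRESHOLDS = [(40, "1"), (25, "2"), (10, "3"), (0, "4")]

def grade(points, thresholds):
    """Return the first grade whose floor the point total meets."""
    for floor, label in thresholds:
        if points >= floor:
            return label
    return thresholds[-1][1]  # fall through to the lowest grade

def lead_score(profile_points, engagement_points):
    return grade(profile_points, PROFILE_THRESHOLDS) + grade(
        engagement_points, ENGAGEMENT_THRESHOLDS)

print(lead_score(45, 30))  # A2: strong fit, good but not maximal engagement
```

Narrowing a grade is just a matter of moving its floor, which mirrors dragging the slider in the Eloqua UI.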

 

These two dimensions of lead scoring will help Marketing and Sales create a graph for scoring leads.

We use A, B, C, and D to rate a prospect’s fit along the X axis of the graph, and 1, 2, 3, and 4 to rate their engagement along the Y axis. A1 and A2 leads – those with ideal fit and maximum engagement – can go directly to Sales. A3 and A4 leads, which have the right fit but minimal engagement, should be nurtured until marketing detects stronger signs of engagement. C1 or D1 leads, which have high engagement but low fit, might be worth engaging in conversation to see if they’re doing research on behalf of a more senior decision maker.
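Those routing rules can be summarized in a small function. The action names and the default for unlisted combinations are our assumptions:

```python
def route(score):
    """Map a combined fit/engagement score (e.g. 'A2') to a next action."""
    fit, engagement = score[0], int(score[1])
    if fit == "A" and engagement <= 2:
        return "send_to_sales"            # ideal fit, maximum engagement
    if fit == "A":
        return "nurture_until_engaged"    # right fit, minimal engagement
    if fit in ("C", "D") and engagement == 1:
        return "qualifying_conversation"  # engaged, possibly researching for a senior buyer
    return "continue_nurture"             # assumed default for all other combinations

print(route("A1"), route("A4"), route("D1"))
# send_to_sales nurture_until_engaged qualifying_conversation
```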

 

 

What happens next?

Once Eloqua processes a contact, that contact goes through the lead scoring mechanism along with other quality checks. The contact’s information is made available in Eloqua Profiler, so Marketing (or whoever is authorized to view it) can see the entire detail in a single place. This same screen is linked to Oracle Sales Cloud, so Sales can view exactly the same information that Marketing sees in Eloqua. Based on this, Sales can prioritize their calling.

Apart from Eloqua Profiler and the contact view in Eloqua, we make use of various other dashboards and reports available in Eloqua Insight. This helps a lot in understanding the numbers in bulk and in performing various analyses and trend reports.

 

How does Sales view lead information?

The lead information is passed to Sales through Oracle Sales Cloud (OSC), which is linked to Eloqua. It is important to share this information with Sales so they know who the lead is and how it was scored. This accumulated information helps them prioritize which leads to engage first.

 

How has Eloqua lead scoring helped business?

  • By creating lead scoring models in Eloqua and scoring contacts based on profile and engagement, both marketing and sales were able to achieve a lot in terms of conversion.
  • Quality, high-priority leads were sent to sales, making sales follow-up better and faster.
  • The time spent manually identifying high-value leads was reduced.
  • High-profile contacts with low engagement were sent to nurture, so these contacts could be nurtured over time and made sales-ready.
  • Less time invested and more revenue generated.

 

Marketing cloud course influence:

The B2B Luminary requirements taught us modern marketing best practices and how to apply them in Oracle Eloqua. In particular, the B2B: Conversion topic, which has Lead Scoring as a curriculum subject, helped me learn in depth about the scoring mechanism. Other courses that helped are:

      • B2B Engagement
      • B2B Targeting
      • B2B Conversion
      • B2B Analysis
      • B2B Technology
      • B2B Fundamentals and Email
      • B2B RPM

 

And, of course, not to forget the Topliners community. It is a library of topics one can browse, a place to post questions and get answers, with a live chat option for quick questions and much more. It is one of the communities I refer to daily.

 

Conclusion:

Lead scoring is one of the greatest options we have in Eloqua. However, there is so much more we could do in Eloqua. Happy learning, Happy Exploring!!

Introduction:

With any enterprise company, one will have legacy systems and processes. With email automation, it’s particularly precarious when managing legacy campaigns. It is important to apply a consistent and regular governance procedure around automated campaigns, taking particular note of additional functionality in Eloqua, as well as being aware of and creative with data that you have available to you.

 

Situation:

In this example, our legacy campaign had not been updated in about two years. There were a number of issues with the campaign stemming from not maintaining a proper governance review. The result was that there were too many emails going out to contacts. Recipients were feeling overwhelmed and unsubscribing.

 

Implementation:

After refining our segmentation (Fundamentals of Segmentation – we updated and refined the campaign segment) and making sure that we were targeting our audience correctly, we updated the time span over which the campaign executed. For this campaign, we decided that we didn’t want the email to go out to anyone who had been a customer for more than 90 days. Prior to this, the limit was two years. This resulted in more timely and relevant communication and a 30% reduction in email sends.

Luminary picture.JPG
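The tightened rule amounts to a simple date filter. The field names here are illustrative, not our actual Eloqua contact fields:

```python
from datetime import date, timedelta

WINDOW = timedelta(days=90)  # previously effectively two years

def in_segment(contact, today):
    """Include only contacts who became customers within the last 90 days."""
    return today - contact["customer_since"] <= WINDOW

contacts = [
    {"email": "new@example.com", "customer_since": date(2018, 1, 10)},
    {"email": "old@example.com", "customer_since": date(2017, 6, 1)},
]
today = date(2018, 2, 1)
print([c["email"] for c in contacts if in_segment(c, today)])  # ['new@example.com']
```

Shrinking the window is what drove the reduction in sends: long-tenured customers simply fall out of the segment.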

Continued Monitoring:

Following the RPM class, we knew that it was important to continue to monitor the success of this campaign following the update. Having contacts flowing into the campaign continually requires frequent updates to the segment contingent on business concerns.

 

The existing campaign also leveraged dynamic content in the body of the email: dynamic signatures for personalized one-to-one communications, as well as dynamic greetings and logos. Making sure that the right message and call to action were delivered to the right person at the right time was key to the success of this campaign. The resulting message provided a very compelling call to action – so compelling that the campaign was now overwhelming our support team, because contacts were reaching out and setting meetings for low-priority concerns.

 

Second Implementation:

With fewer emails going out to a better segment of contacts, we started getting a much more positive response. Our campaign was more effective than our support team could keep up with, and we needed to focus their time more strategically. For this, we added another dynamic element to the copy of the email: a call to action that was presented only to high-level contacts, leveraging the title contact field, and removed completely for lower-level managers and supervisors. Lower-priority contacts instead received a standard instructional email.
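In pseudocode terms, the dynamic rule looked roughly like this. The title keywords and variant names are invented for illustration, not our production configuration:

```python
# Titles we treat as "high-level" for the meeting call to action (assumed list).
SENIOR_KEYWORDS = ("chief", "ceo", "president", "vp", "vice president", "director")

def email_variant(contact):
    """Pick the email variant from the contact's title field."""
    title = contact.get("title", "").lower()
    if any(keyword in title for keyword in SENIOR_KEYWORDS):
        return "cta_book_meeting"        # high-level contacts see the call to action
    return "standard_instructional"      # managers/supervisors get the standard email

print(email_variant({"title": "VP of Operations"}))   # cta_book_meeting
print(email_variant({"title": "Shift Supervisor"}))   # standard_instructional
```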

 

Results:

Following classes on dynamic content and segmentation, we were able to build a campaign that produced the desired outcome of finding decision makers and connecting our sales team with them to conduct further business. After refreshing the legacy campaign, we saw a 30% decrease in the number of emails going out. Following the dynamic call to action for high-level executives, we attained a much more strategically focused workflow for our support team. In the first week following the campaign update, our support team was able to secure an event.

Marketing Challenge:

Happy and engaged customers are critical to current and future business. Not only do customers pay you, but they also determine future sales through word-of-mouth and their feedback, which helps you to improve your product or service. Customers provide so much untapped value, which is why nurturing your relationship with them is critical. Hence, we decided to nurture the leads who are important to us but not yet ready to buy.

 

Goal:

Our goal was to create a lead nurturing campaign to assist the lead/prospect down the inbound marketing sales funnel and keep them involved and engaged with Oracle. This, in turn, would deliver Marketing Qualified Leads (MQLs) to the sales team.

 

Benchmarks:

Prior to nurture campaigns, normal email campaigns were used to market our products. To be more precise, predesigned emails were sent on a predetermined schedule geared towards education, branding, positioning, or selling of the product, and were distributed to a broad audience. Nurture emails, by contrast, are delivered to a properly targeted audience, mainly qualified contacts – those who have expressed some form of interest in a topic, product or service – with the goal of converting them to the next stage in the sales funnel. Through nurture we followed a systematic process of building relationships with these contacts. The difference between the content in a nurture campaign and that of a normal email campaign is that we knew the buyers’ interests, so the nurture emails could market services or products that speak directly to their needs. Ideally, nurture campaigns are designed to have a long lifespan – 12–18 months at least, assuming they are constantly monitored and optimized over the duration of the campaign – so we had to use messaging, content, and offers that won’t expire anytime soon.

 

Best Practices:

We followed a few best practices to make our nurture effective:

  • Sender Name: Personalize, build brand recognition and be consistent
  • Subject Line: Keep it short, convey value within the first 35-50 characters, and always try to build trust
  • CTA: Keep it short (2 to 5 words) and create a sense of urgency by using action words
  • Images: Keep them relevant to the content and keep total email size within 100 KB
  • Content: Use short sentences and bullets, keeping the total word count within 150 words. Personalize when possible.
  • Timing: Keep global time zones in mind when sending emails
  • Segmentation: Break our list into segments and customize the message based on the segmented list

By doing these we were able to achieve the following:

  • Time to Conversion

If a lead is taking a month to make a purchasing decision, we made sure to spread out the communications to keep them engaged throughout the month. We don’t want to pummel them with a sales pitch, so we started by sending useful, low-pressure information with content-based calls-to-action, then slowly transitioned these calls-to-action towards buying.

  • Level of Activity

  How many website pages do leads usually visit before converting into a customer? Knowing these benchmarks helped us space out our communications appropriately.

  • Open and Click-Through Rates

We kept an eye on the open and clickthrough rates of each of our campaign emails, looking for drop-offs or other anomalies, and tested different messages and subject lines to address them.

 

Implementation:

There are two steps to a great nurturing campaign:

  • Determine the target audience for the nurturing campaign
  • Build the Eloqua nurturing campaign

Building the campaign included:

  • Determine the nurture type based on the target audience, e.g. Pre-MQL, MQL, Re-engagement, etc.
  • Discuss campaign flow architecture
  • Review and complete the Content Inventory Checklist
  • Review segment possibilities using Endeca
  • Develop a content plan for emails and landing pages, including the number of emails for each campaign flow
  • Create MRM Program Codes
  • Work with your Nurture Best Practices Manager - Kick-off Call with Nurture Best Practices Manager, PMO and List Analyst
  • Work with BDC/BDR/BDGs to properly prepare them for the leads they will receive.
  • Review: Campaign purpose and content and finally lead follow-up

 

Execution:

1. The digital team created and assembled the content and basic framework, i.e. the primary campaign creation in MRM.

 

2. Once we had our target audience chosen (we used an Eloqua segment list to pull this information) and our messages were in place (email content), it was time to start building our nurturing campaign, which we built on the campaign canvas.

 

3. This also helped us adjust the frequency of contact (daily, weekly, fortnightly or monthly – or, here’s a radical suggestion: ask people how often they’d like to hear from you) and the days of the week and time of day we wanted our message to be delivered.

 

4. We created a landing page with a progressive profiling form to collect basic contact information.

5. Responses were scored in Eloqua per the lead scoring models, based on profile and engagement.

 

6. Once the nurturing campaign is active, we had to evaluate and measure the performance of the campaign. This was done through CXD BI reporting.

 

Results:

By implementing lead nurture, we found that lead nurturing emails generated an 8% CTR, compared to just 3% for general email sends. On average there was a 20% increase in sales opportunities compared to non-nurtured leads. Management is happy with this performance. We initially focused only on the APAC region; however, given the improvement in the process, we will soon start with other regions.

 

Helpful Marketing Cloud Courses:

B2B Engagement

B2B Targeting

B2B Conversion

B2B Analysis

B2B Technology

B2B Fundamentals and Email

B2B RPM
