Introduction to the Marketing Challenge:

The marketing challenge I was tasked with solving was creating a survey to gather client feedback on recently completed training. As part of the healthcare world, we help hospital systems understand the software solutions they have purchased from our company. Once our clients are trained, we love to hear back about how the training could have been better, or to pass along kudos to the internal folks who did a good job. We decided to use Eloqua to build a survey and send it to clients who had recently completed training. I designed a form with the questions we wanted to ask and inserted it into a landing page. Using the WYSIWYG editor, it was easy to create an email with a link to the survey. We actually ended up making three versions of the survey email to test which got the best response. Every two weeks, the survey is sent from Eloqua to our most recent trainees.



Goals for Solving the Marketing Challenge:

Our goals were to increase our survey response rate and to automate some of the email send tasks. We also wanted to find ways to raise the average overall score trainees gave their webinar or onsite event. We had been using SurveyMonkey to track responses to our survey as well as the scores given, and our response rate had averaged about 30%. Because the structure of the feedback scale changed, it was difficult to translate average survey scores into a baseline.


Project Approach:

The campaigns we executed out of Eloqua were initially very simple: a segment and an email element added to the campaign canvas. Later, as we got more sophisticated, we added a one-week wait step followed by a second email prodding for feedback on the training (pictured: CampaignCanvas.jpg). For privacy reasons I can only share the campaign canvas, but it will evolve even more as we add automation and lead scoring. The decision elements were easy to add and configure, and they direct the logic on the campaign canvas. Our next step is to create shared lists so we can automatically route responders into campaigns that nurture the relationship based on their feedback, and route the un-engaged into a re-engagement campaign. I have also begun building A/B testing using shared filters; as we develop content we want to compare, this will be added to the campaign canvas.
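In Eloqua, this routing lives in decision elements and shared lists on the canvas, but the underlying logic can be sketched as a small function. This is a minimal illustration, not our actual configuration; the contact fields and destination names below are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    email: str
    opened_survey_email: bool
    submitted_survey: bool

def route_contact(contact: Contact) -> str:
    """Mirror the campaign-canvas decision steps:
    responders go to nurture, the un-engaged to re-engagement."""
    if contact.submitted_survey:
        return "nurture"        # follow up based on their feedback
    if contact.opened_survey_email:
        return "reminder"       # opened but never responded: send the prod email
    return "re-engagement"      # un-engaged: move to a re-engagement campaign

print(route_contact(Contact("trainee@hospital.org", True, False)))  # → reminder
```

The same three-way split is what the decision elements evaluate; the advantage of shared lists is that each branch lands contacts somewhere other campaigns can pick up automatically.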


Influential Courses:

The course that was most influential as I set up this campaign was the B2B Engagement (OnDemand) module. Its discussion of lead nurturing, and of identifying and routing contacts based on activity or contact data, was really helpful in managing the flow of these campaigns. We still have work to do to set this up fully, but the course has been a big help in the strategy and planning of this initiative. Our response rate dipped to about 17%, and we are brainstorming ways to bring it back to where it was previously. There are plenty of areas to focus on, and we plan to implement a steady stream of changes that we can test to see how much better we are actually doing as an organization.
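One way to tell whether a change actually moved the response rate, rather than normal send-to-send variation, is a two-proportion z-test. A minimal sketch, assuming hypothetical send counts (the numbers below are made up for illustration, not our actual data):

```python
from math import sqrt, erf

def two_proportion_z(successes_a: int, n_a: int,
                     successes_b: int, n_b: int) -> tuple[float, float]:
    """Test whether two response rates differ more than chance would explain."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)   # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value via the normal CDF, written with math.erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 30% of 200 earlier trainees vs. 17% of 200 recent trainees
z, p = two_proportion_z(60, 200, 34, 200)
print(f"z={z:.2f}, p={p:.4f}")
```

With samples of that size, a drop from 30% to 17% comes out highly significant, which would suggest a real change (a new scale, a new audience, a deliverability issue) rather than noise.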

We are personalizing with more field merges and making our emails friendlier to open. We have also decreased our use of pictures, in hopes that the survey will look as though it came from a manager's Outlook inbox and therefore be more likely to be opened.
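To illustrate the idea behind field merges: substitute per-contact data into an email template before send. Eloqua's actual merge syntax differs; the placeholder style and field names here are just for the sketch:

```python
def merge_fields(template: str, contact: dict[str, str]) -> str:
    """Substitute contact fields into a template (stand-in for field merges)."""
    out = template
    for field, value in contact.items():
        out = out.replace("{{" + field + "}}", value)
    return out

template = "Hi {{first_name}}, how was your {{training_type}} training?"
print(merge_fields(template, {"first_name": "Dana", "training_type": "onsite"}))
# → Hi Dana, how was your onsite training?
```

Even two or three merged fields (name, training type, trainer) make the survey email read like a personal note rather than a blast, which is the effect we are after.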


Leveraging Outside Help:

Topliners is a great place to see where people have run into similar questions and to get the community's very useful support in solving those issues. A/B testing on the campaign canvas is one area where Topliners has been crucial for working out how to test different content: a blog post there has been helpful in building out the shared filters needed to support A/B testing.