

As a senior leader of strategic marketing enablement for a Fortune 1000 company, I led an assessment of my organization's international marketing group.



The purpose of this assessment was to evaluate team skill levels and gaps, understand consistency and process concerns, and determine remedial actions for widespread adoption of best practices and improved processes.



At the conclusion of the assessment, I found that the international team lacked process, standardization, education and an understanding of best practices. They also had no identifiable strategic structure for internal standards, core processes and the cross-functional relationships that impacted marketing system processes.



To advance the marketing maturity of our international team of marketers, I identified a phased approach that included:

  1. Campaign management
  2. Structured templates to provide technology and infrastructure support
  3. Process and education rollout for economies of scale


1. To enable campaign management, I:

  • Met with stakeholders and collected existing process documentation
    • Found that campaign process requirements had primarily been communicated to the team through email, requiring each marketer to collect, create and update their own process documentation as they were notified of changes.
  • Compiled all existing process communications into one cohesive document
  • Authored additional campaign deployment and origination processes focusing on automation, reduction in manual labor and data accuracy
  • Utilized the gap analysis data to determine areas where the team’s adherence to process was lacking and where process was nonexistent, but required
  • Revised existing processes to include explanations of their importance and impact on the organization
  • Devised further procedures to hold team members accountable through Eloqua campaign QAs
  • Optimized the manual to include an index and glossary to further assist in accessibility of required information and ensure language understanding

2. To provide structured templates for technology and infrastructure support, I:

  • Met with stakeholders to understand our business needs and short and long-term goals
  • Utilized data obtained from the gap analysis to further clarify what technology and infrastructure support would be needed to accommodate various skill and education levels for asset creation and to ensure accurate data transmittal for attribution
  • Identified a need for segment, form, campaign, email and landing page templates
  • Built out templates to accommodate our three instances of Salesforce and our various international locations and languages
  • Tested and recalibrated the templates
  • Compiled template process requirements and incorporated within the campaign management documentation


3. To ensure acceptance and widespread adoption of these processes and templates, I focused on process and education rollout to achieve economies of scale. To do this, I:

  • Sought and utilized executive-level buy-in and support throughout the phased rollout
  • Stressed my ongoing desire to support the team through all of the transitions
  • Maintained an open-door approach to questions and concerns, providing well-rounded support, feedback and one-on-one trainings where needed
  • Created an ongoing communication plan to provide updates on the processes
  • Spearheaded and owned the lead educator role of an ongoing monthly educational forum to discuss process, concerns, best practices and successes of the team
  • Recorded and distributed the trainings internationally
  • Collected and utilized monthly survey data from educational forum participants to ensure efficient communication, subject matter understanding and to accumulate future topic requests


Through this phased rollout, I was the primary author of a process manual, created various technical and non-technical templates and checklists for consistency and quality control, and began a monthly Marketing Educational Forum where I am the lead educator.


The impact of this initiative on our organization has been remarkable. As a result of the educational offerings, the email, landing page, form, segment and campaign templates — as well as structured processes that validate the utilization of these efforts and the accuracy of assets prior to deployment — our international efforts are adhering to best practices, are consistent in voice and appearance, and are ensuring that we are collecting and delivering accurate CRM data for attribution. We are seeing 100% utilization of the templates and processes, 100% of our campaigns are deploying without errors, and our analysts are now able to more accurately identify marketing's influence on revenue. Our most recent statistics indicate that marketing has influenced $141 million in closed-won revenue for our organization, whereas just a year ago, before these changes, we were unable to identify marketing's impact on the pipeline at all. I would be delighted to chat with anyone about what I have done and how you might impart these changes for your organization as well. Please reach out to me at




Marketing Cloud Courses for a similar initiative:

  • B2B: Fundamentals Series
  • B2B: Advanced Editing and Form Processing
  • B2B: Personalizing Campaigns
  • B2B: Progressive Profiling
  • Best Practices: Global Demand Center (WBT)

Oracle Maxymiser students can now earn our Master-level accreditation through Oracle University Marketing Cloud's OTO Master title, now available for 2017.


Students pursuing this achievement will be required to complete the master curriculum for this product and pass the OTO Master exam with a score of 80% or higher. Those who achieve OTO accreditation will receive a certificate with the date of completion and recognition on Topliners. View the requirements for this accreditation here: Requirements for the Oracle Maxymiser (OTO) Master Title


This achievement opportunity is available to students with a Maxymiser education pass and may be available with other education purchases from Oracle University Marketing Cloud. Our sales team can provide more information about access.


For more information on Oracle Maxymiser training, contact our sales team at or submit an information request form.  Students with an active training account can view our course schedule in the learning portal under Training Schedule.


Contact the Help Desk at any time with questions.

An Eloqua Admin’s Survival Guide to Building Successful Lead Scoring Models


Building a successful lead scoring model is both an art and a science. The science derives naturally from data-driven metrics and a programmatic evaluation of the digital body language exhibited by individuals. It's an art because there are no definitive answers to the right or wrong approach to lead scoring. The important thing is to be methodical, maintain change management documentation, and have strong communication with stakeholders to drive alignment.


Lead scoring is so much more than just a behavioral calculator; it can be a powerful tool that drives effective team collaboration and builds greater alignment between Sales and Marketing. When I first joined my current company, one of my key initiatives was to migrate our legacy E9 scoring program to the "out of the box" E10 scoring models and create a methodology to scale lead scoring across multiple product offerings.


In this blog, I'll be sharing a few tips I've learned throughout my journey of migrating, adapting and scaling our old lead scoring model to Eloqua's E10 out-of-the-box solution across multiple product offerings.



Tip #1 – Start Small and Focus on Perfecting the One

We started by updating a single yet complex lead scoring program that was originally built by a 3rd-party system integrator using Program Builder. I went through an extensive deconstruction exercise, creating detailed documentation with visuals mapping out each scoring criterion, subsidiary program branches, and integration/update rules. I wanted to make sure that the stakeholders involved had visibility into the current design and could agree on the baseline scoring criteria that I would be prototyping from.


Example of a Lead Score Feeder diagram I created

The goal was to methodically build a lead scoring template with adaptable components to easily replicate and scale across multiple product lines as needed. I intentionally started small and worked on perfecting a single product scoring model through multiple iterations until it was jointly approved by both Sales & Marketing stakeholders. Throughout that process, I established a consistent design methodology and a change management framework that was repeatable and scalable for new product offerings.


I often referred to the Eloqua Help Center and the Self-Guided Lead Scoring Plan Builder whenever I felt lost or just needed some help to get back on track. The Help Center provides detailed instructions on building out the scoring model, while the plan builder asks you a series of questions to help you determine tactical readiness. You can even export all of your answers as an action plan to guide you through the whole process.



Tip #2 – Start Making Friends

Collaboration and appropriate communication are the keys to success in any organization. While lead scoring configuration is a Marketing responsibility, its impact spans multiple departments. Staying in alignment with these teams can help the project go smoothly and ensure suitable resources are appropriated for the project. (A little heads-up to stakeholders can go a long way!)



Sales

One of the best ways to build and verify your customer profile and scoring criteria is by talking to the people who work on the front lines. Salespeople have a wealth of knowledge, and I found myself picking up useful insights and ideas that I might have missed if I hadn't had those conversations. Salespeople are extremely busy (at least the successful ones are), and it can be hard to get their time commitment. I found that staying persistent and showing genuine interest in their feedback would often pay off. Having these conversations is the first step towards building a successful lead scoring model and achieving Sales and Marketing alignment.


IT/CRM Admin

Depending on your organization, you may have an individual or team who administers your CRM system. The CRM administrator is a VIP on my strategic alliance team, and I always make sure that I am in constant alignment with this role throughout the project. The CRM administrator is not only "The Gatekeeper" of the CRM system where salespeople live and breathe, but also the one who configures and approves any system changes that can have a direct impact on your entire project. Not to mention that CRM administrators are usually responsible for configuring the lead assignment workflows that assign the appropriate lead owner. It's super important to have the CRM administrator on your side, especially when things go wrong or you need help with problem diagnosis.



Marketing

Marketing is also one of your internal customers, so you should always make sure that this group is in constant alignment with your efforts. Marketing's complete buy-in is key to building a successful lead scoring model, since they're the system users who create and execute the campaigns that drive scoring. I recommend driving alignment with Marketing beginning with standardizing email, form, and campaign creation workflows. These activities maximize the operational efficiency and data consistency that scoring models need to work properly. (The B2B: RPM course covers this topic in great detail.)


Starting from the customer profile feedback you received from Sales, the next step is to work with the marketing team to establish ideal customer profiles and then map those profiles to the buyer’s journey. I recommend using this Buyer Persona Development Guide as a starting point, along with going through the Buyer Persona Worksheet and Buyer’s Journey Worksheet to establish agreements.


Next is to translate the Buyer Persona Worksheet into Profile Criteria for lead scoring. This Lead Score Card is a great uniform tool for documentation and provides great visual context to shape conversations when meeting with stakeholders.


Tip #3 – There Is No One-Size-Fits-All Scoring Model

During this process, I quickly realized that there is no one-size-fits-all solution when it comes to lead scoring. We currently have 4 scoring models (2 in Program Builder and 2 in E10), and each model has been customized based on the unique aspects of each individual product offering. Taking the B2B: Lead Scoring and B2B: Program Builder Overview courses gave me a solid knowledge foundation that helped me tremendously throughout the design and implementation process.


Picking Program Builder Scoring Model vs. E10 Lead Scoring Model

As I said at the beginning of the article, there are no right or wrong answers to the lead scoring approach. Each approach has its own benefits and limitations, so it largely depends on the specific use cases you're building for. You should explore a variety of scoring approaches to see which fits best with your business requirements. In our case, we currently use both Program Builder and E10 lead scoring in our instance for separate product offerings.


Program Builder – Benefits

  • Customizable – From forms to websites to webinar registration, pretty much everything on the scoring program is customizable to your liking.
  • Additive Point System – While the E10 scoring model limits scoring to a maximum score of 100%, Program Builder scoring is based on a point system with no maximum score. This flexible point system, in conjunction with its customizability, allows fine-grained control over individual lead score values in relation to how and when a sales-ready lead is triggered.
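To make the contrast concrete, here is a minimal Python sketch of an additive point system of the kind Program Builder enables. All activity names, point values, and the sales-ready threshold below are hypothetical examples, not values from any actual Eloqua configuration:

```python
# Sketch of an additive point system: points accumulate per activity
# with no upper cap, unlike E10's 100% ceiling.

SALES_READY_THRESHOLD = 100  # hypothetical trigger point

# Hypothetical point values per activity type.
ACTIVITY_POINTS = {
    "form_submit": 30,
    "webinar_registration": 25,
    "website_visit": 5,
    "email_click": 10,
}

def score_contact(activities):
    """Sum points for each recorded activity; totals are unbounded."""
    return sum(ACTIVITY_POINTS.get(a, 0) for a in activities)

def is_sales_ready(activities):
    """A lead is sales ready once its total reaches the threshold."""
    return score_contact(activities) >= SALES_READY_THRESHOLD

activities = ["form_submit", "webinar_registration", "email_click",
              "website_visit", "form_submit"]
print(score_contact(activities), is_sales_ready(activities))
```

Because the total is unbounded, you can tune exactly how many and which activities trip the sales-ready trigger, which is the "finite control" trade-off the bullet above describes.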

Program Builder – Limitations

  • Not for Everyone – Program Builder is extremely powerful and can perform extensive tasks. It’s usually configured by advanced administrators or vendors who possess extensive Program Builder knowledge. This means any minor tweaks will also need to go through them, which can potentially create bottlenecks and expenses.
  • Scoring Can be Slow – The fastest time Program Builder can score someone is about 5 minutes, and that’s if the program is running in Priority mode. Realistically, scoring programs usually run in bulk mode due to the combination of complex evaluation steps, feeder programs, and integration rules that are running simultaneously. Not to mention Bulk mode runs every 2 hours which can make lead scoring incredibly slow.
  • Program Dependencies – A byproduct of customizability is the unintended, complex dependencies that can be created in Program Builder. In our case, the scoring program grew so big and complex that it prohibited minor scoring changes without an extensive change review. Extensive documentation can help in this scenario, but in our case even with extensive documentation it became a bear to handle.


E10 Scoring – Benefits

  • Fast Scoring – The out-of-the-box E10 scoring model is fast! From my timing tests, contacts usually get scored within 6-10 minutes after a form fill in our instance.
  • Automatic Rescoring – I love the fact that the E10 scoring model automatically rescores everyone every 24 hours! This feature is so convenient and saves me the time and resources of building yet another program for cadenced re-scoring.
  • Co-Dynamic Score – E10 splits Profile and Engagement score values into letter-and-number pairs, e.g., A1, B2, etc. The letter grade represents the Profile score and the number represents the Engagement score. In general, A1 would be your ideal profile with the most engagement (sales ready), while D4 would be your least ideal profile with the least engagement. You can easily create segments based on these scores, and you can automate nurture tracks for efficient targeting. Sales also benefits greatly from this extra bit of information, as co-dynamic scoring helps them prioritize their lead follow-up routine.
  • Folder Level Scoring – One of the biggest improvements I see using E10 scoring is the flexible options for how form submissions can be scored. You can score submissions from Any Forms, Specific Forms, Any Forms in Folder, and Any Forms in and under a specific Folder. I found folder-level scoring extremely helpful for managing the hundreds of forms that need to be scored. Marketing users simply place their forms within dedicated scoring folders for submissions to be scored. This saved so much time by avoiding the need to set up and define every single form in each scoring model, especially when new forms are constantly being created.

  • Flexible Scoring Criteria – Another great E10 scoring feature is the ability to score on different objects, including Accounts, Contacts, Shared Lists, and even related CDOs!

  • Rapid Prototyping – One of the reasons we decided to switch to E10 scoring models was the ease of configuration. I love using the simple graphical interface, drop-down menus, and sliders to adjust the weighting of each scoring criterion. With E10 scoring, I can spin up multiple prototypes within hours and compare results using the preview function to validate different scoring criteria simulations.
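To illustrate the co-dynamic score idea from the list above, here is a small Python sketch that combines a Profile letter (A-D) and an Engagement number (1-4). The band boundaries and the sales-ready set are illustrative assumptions, not Eloqua's actual internals:

```python
# Sketch of an E10-style co-dynamic score: Profile -> letter grade,
# Engagement -> number grade, concatenated into e.g. "A1" or "D4".

def band(value, labels):
    """Map a 0-100 score into one of four bands, best label first.
    Cutoffs (75/50/25) are hypothetical."""
    if value >= 75:
        return labels[0]
    if value >= 50:
        return labels[1]
    if value >= 25:
        return labels[2]
    return labels[3]

def co_dynamic_score(profile_pct, engagement_pct):
    letter = band(profile_pct, ["A", "B", "C", "D"])   # Profile score
    number = band(engagement_pct, ["1", "2", "3", "4"])  # Engagement score
    return letter + number

print(co_dynamic_score(90, 80))  # ideal profile, high engagement
print(co_dynamic_score(10, 5))   # weak profile, low engagement

# Hypothetical sales-ready segment built from the combined grades.
SALES_READY = {"A1", "A2", "B1"}
print(co_dynamic_score(90, 80) in SALES_READY)
```

Segments like the `SALES_READY` set above are what make the paired grade useful for nurture-track automation and sales prioritization.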


E10 Scoring – Limitations
Disclaimer: By no means am I a "fan boy" of the E10 scoring model, and I'm sure there are limitations I've missed in this article. I must admit that the benefits of E10 scoring vastly outweighed the limitations I experienced, and I'm optimistic that it will only get better with the immense support from Eloqua. That being said, I do have a complaint. Since these limitations mainly reflect our specific use cases, take them with a grain of salt.

  • Less Flexibility to Override Score – Our CRM is configured so that Sales only sees leads that have reached or exceeded a sales-ready threshold – we call it MQL (Marketing Qualified Lead). For certain high-value form submissions (e.g., a Contact Me form) we needed a "cut-through" mechanism that would override the existing score and trigger an MQL for immediate sales follow-up. While we can easily do that with Program Builder using update rules, building it in E10 is challenging. With E10 scoring, the score fields live on a separate but related table that can only be updated by the lead scoring engine. This means everything needs to go through scoring, and it's not easy to override score values and manually trigger MQLs. We did find a way to achieve the same cut-through mechanism in the E10 model, but at the expense of sacrificing a significant portion of our total available percentages. This also affected the proportions of our engagement threshold and limited the score weight available for additional engagement criteria.
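The cut-through logic described above can be sketched in a few lines of Python. The form names and threshold are hypothetical; in Program Builder this kind of override is done with update rules, while in E10 the score fields are owned by the scoring engine, so equivalent logic has to be expressed inside the scoring criteria themselves:

```python
# Sketch of a "cut-through" MQL trigger: a high-value form submission
# (e.g. "Contact Me") forces an MQL regardless of the current score.

MQL_THRESHOLD = 80  # hypothetical sales-ready score
CUT_THROUGH_FORMS = {"contact_me", "request_a_quote"}  # hypothetical

def should_trigger_mql(score, submitted_forms):
    """MQL if a cut-through form was submitted OR the score qualifies."""
    if CUT_THROUGH_FORMS & set(submitted_forms):
        return True  # high-value form overrides the normal threshold
    return score >= MQL_THRESHOLD

print(should_trigger_mql(35, ["contact_me"]))   # cut-through fires
print(should_trigger_mql(35, ["newsletter"]))   # below threshold
print(should_trigger_mql(85, []))               # scored through normally
```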



Our lead scoring projects have placed me in a dynamic cross-functional role privileged to work with different departments. Building the actual scoring model is the easy part, but doing all the work that leads up to the build is where the real challenge lies. Fortunately, Eloqua makes it easy to implement scoring by offering abundant tools, training courses, and resources. These capabilities have helped me and will help others kick start their own lead score journey.


One last thing I learned from this experience is that lead scoring will never be perfect. In a constantly changing business landscape, the only constant is striving towards perfection with gradual improvements through iterations -- which is why lead scoring to me is truly a balancing act between art and science.


Helpful Courses: B2B: Conversion (Lead Scoring), B2B: Program Builder Overview, B2B: Fundamentals Series (2-Day Training)

The Challenge

Our client is a leading international software and hardware solutions provider that wanted marketing-attributed revenue and, for the first time, to utilize omni-channel partner marketing, including paid search, email marketing, and trade shows.

The client wanted a bigger picture of their marketing efforts and to go beyond emails and static assets. They pursued virtual interaction by using Ion to power an assessment that gave the audience an opportunity to tell the client what types of solutions and services they were seeking and where they were in the buying cycle.


Useful Courses

  • B2B: Targeting
  • B2B: Engagement
  • RPM: Targeting & Segmentation
  • Fundamentals of Emails
  • B2B: System Integration




The client had modest goals for this campaign. They wanted to be able to attribute their marketing efforts to won opportunities and, ultimately, revenue. The client also wanted to attribute opportunities to lead sources in order to justify their ad spend for paid-partner marketing. Identifying which tactics and sources were the most effective would allow the client to become more efficient with their ad spend and focus on what works, rather than applying their budget to tactics that may not be as effective. Achieving these goals would allow the client's marketing department to gain the trust and confidence of the sales department, resulting in stronger marketing-sales alignment.

  • See one opportunity that can be attributed to marketing efforts.
  • Validate the value of using specific paid-partner marketing tactics.



Prior to this initiative the client had a very basic approach to digital marketing. They did not track the lead source of opportunities, making it impossible to attribute revenue to their marketing efforts. The client's primary strategy was batch and blast, sending a generic message to as many people as possible. They did not use segmenting or targeting to deliver the right message to the right audience at the right time. Their KPIs were built around email metrics rather than the opportunities and revenue contribution obtained from their marketing efforts. They had not created any true nurture programs to guide their leads and contacts along the buyer's journey and ultimately deliver Marketing Qualified Leads to their Sales teams.



The campaign was an omni-channel effort that funneled net-new leads gathered from trade shows and events, along with existing leads from Sales, into nurture campaigns, with the goal of contacts filling out an Ion assessment so the client could gain insight into which products or services each lead is interested in. The assessment generates a buyer readiness score that indicates where in the buyer's journey the lead currently resides. The lead's interest and buying-readiness information is then passed on to their SFDC instance.
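To show the shape of the idea, here is a hedged Python sketch of turning assessment answers into a buyer readiness score and stage. The answers, weights, and stage cutoffs are hypothetical illustrations; the actual assessment was built in Ion:

```python
# Sketch: map assessment answers to a readiness score and a stage
# in the buyer's journey. All values below are hypothetical.

# Each answer contributes a 0-3 readiness value (hypothetical scale).
ANSWER_VALUES = {
    "just researching": 0,
    "comparing vendors": 2,
    "ready to buy": 3,
}

# Stage cutoffs, checked highest first (hypothetical).
STAGES = [(7, "Decision"), (4, "Consideration"), (0, "Awareness")]

def readiness(answers):
    """Sum answer values and pick the first stage whose cutoff is met."""
    score = sum(ANSWER_VALUES.get(a, 0) for a in answers)
    stage = next(name for cutoff, name in STAGES if score >= cutoff)
    return score, stage

score, stage = readiness(["comparing vendors", "ready to buy",
                          "comparing vendors"])
print(score, stage)
```

The resulting stage is what would route a contact into the appropriate nurture track or flag it for direct hand-off to Sales.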


Tools Used:

  • Campaign Canvas
  • Program Canvas
  • Form Report
  • Form Submit Cloud App
  • Forms
  • Email
  • Landing Pages
  • Segments
  • Shared Lists
  • Custom Data Objects
  • SFDC integration
  • Ion Integration




Below is the approach utilized to achieve the client's goals:



Using this omni-channel approach, our team was able to begin the process of refining the client's digital marketing strategy. By encompassing multiple channels and tracking the lead source through Eloqua, the client is able to differentiate sources and attribute revenue to each. The contact's interest and buyer readiness score advised the client on which nurture campaigns the contact should be added to, or whether the contact should be passed directly to Sales. This also helped Sales tailor their conversations with leads based on interests and buying readiness, allowing them to build relationships and convert leads to opportunities. By backtracking each opportunity's lead source and campaign membership, the client was able to justify their ad spend. With this effort, the client identified the sources that produced the most marketing qualified leads and adjusted their resources accordingly for future marketing efforts.




The goal of the campaign was to achieve one marketing-attributed active opportunity; the result was three. The client now has better insight into which original sources drive the best quality leads. They are also able to have better-educated conversations about budgeting and marketing spend that will result in won business. To capitalize on this knowledge, the client has begun to host targeted webinars for each of their "awareness tracks" to further educate their leads on the available offerings and encourage them to move through the pipeline. By aligning Sales and Marketing, there is a joint effort between both teams, and Sales is now comfortable and eager to provide leads for this initiative.



The campaign process is now enhanced by this push. Marketing is able to provide training on what the leads are and when and how to follow up on them based on the results of the Ion assessment. Marketing now knows what messages to send, and Sales knows which conversations to have, based upon where the opportunities are in the buying cycle. With the Ion setup we were able to capture originating traffic sources for each referring page, allowing the client to better understand which banner ads drove the most qualified leads.



Is there anything you would have done differently? Or elements you would change as you go forward?


To further increase the value of the campaign, we would have set up a hidden field to capture the Eloqua lead source, as well as the Ion source, within Eloqua for a more holistic reporting view. This would have allowed for less manual data manipulation. Currently the results from the Eloqua campaign and the Ion campaign must be combined and formatted within a spreadsheet.
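The manual spreadsheet step could also be scripted. Here is a hedged Python sketch that joins the two exports by email address; the column names and sample rows are hypothetical stand-ins for the actual Eloqua and Ion campaign reports:

```python
# Sketch: merge Eloqua and Ion campaign results by email address,
# replacing the manual combine-and-format spreadsheet step.
import csv
import io

# Hypothetical CSV exports (in practice, report files from each tool).
eloqua_csv = """email,lead_source
ada@example.com,Trade Show
bob@example.com,Paid Search
"""

ion_csv = """email,buyer_readiness,interest
ada@example.com,Decision,Hardware
bob@example.com,Awareness,Software
"""

def rows(text):
    """Index CSV rows by the email column."""
    return {r["email"]: r for r in csv.DictReader(io.StringIO(text))}

def merge(eloqua_text, ion_text):
    """One combined row per Eloqua contact, enriched with Ion data."""
    eloqua, ion = rows(eloqua_text), rows(ion_text)
    return [{**eloqua[e], **ion.get(e, {})} for e in eloqua]

for row in merge(eloqua_csv, ion_csv):
    print(row)
```

In practice the `rows` helper would read the exported report files instead of inline strings, but the join-by-email logic is the same.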
