As marketers, we test everything. From subject lines to images to CTA placement, we are constantly looking for ways to improve conversion and "stickiness". We measure all of it through opens, clicks, bounces and unsubscribe rates. The A/B test functionality in Eloqua lets us quickly and easily test any one of these elements at the email level. You select the two emails you want to test, the segment you want to send to, the percentage of the list you want to use for the initial test, and the metric that declares the winning email to be sent to the remainder of the list. Easy peasy.
But what if you wanted to test something outside of the email composition itself? What if you wanted to test time of day, within a single day, to evenly split segments?
Our client, a leader in the financial services industry, sends a weekly newsletter to a subscriber base of 104K. With falling open and clickthrough metrics, we rolled out a series of benchmark tests over the course of six months in an effort to reach our targets of 25% open and 6% clickthrough:
- New template design
- A/B testing subject lines
- First name personalization
While each of these steps brought us closer, our targets were still just out of reach. The next phase of testing was around the deployment itself. With such a large audience, we wanted to test 3 different time-of-day deployments. As deployment testing required more than the built-in A/B testing function provided, we looked to Program Builder to split the list into 3 equal segments.
Before setting up a program to split the source list, we created some supporting assets:
- The full segment for the mailing
- 3 shared lists, one for each time of day we wanted to test
- A program feeder to feed the Program Builder asset (this is only necessary if your list is already in Eloqua; otherwise, you can upload the contacts to Program Builder directly)
We then created the following program, using Program Builder:
- Step 100 Add List
- This step is fed by the program feeder we created
- Step 200 Split 66/34
- Isolates the first 34% of the list and passes it to Step 400
- Step 300 Split 50/50
- Takes the remaining 66% and splits it into two equal segments, passing one to Step 600 and the other to Step 500
- Steps 400, 500 and 600
- Push each segment through an action step that updates the corresponding shared list
- Steps 700, 800 and 900
- Push the contacts out of the program
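The arithmetic behind the two split steps can be sketched outside Eloqua. This is a hypothetical illustration only (Eloqua's split step may allocate contacts differently under the hood); it simply confirms that a 66/34 split followed by a 50/50 split yields three near-equal thirds of the list:

```python
def split_list(contacts):
    """Mimic Step 200 (66/34 split) followed by Step 300 (50/50 split)."""
    cutoff = round(len(contacts) * 0.34)
    segment_a = contacts[:cutoff]    # 34% -> Step 400
    remainder = contacts[cutoff:]    # remaining 66% -> Step 300
    half = len(remainder) // 2
    segment_b = remainder[:half]     # 33% -> Step 500
    segment_c = remainder[half:]     # 33% -> Step 600
    return segment_a, segment_b, segment_c

# Subscriber base size from this campaign
contacts = [f"contact_{i}" for i in range(104_000)]
a, b, c = split_list(contacts)
print(len(a), len(b), len(c))  # 35360 34320 34320
```

The three segments land within about 1% of each other, which is close enough for a fair time-of-day comparison.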
The following steps enabled us to execute the deployment at 3 different times to equally sized lists:
- Create 3 separate segments, each based on one of the shared lists populated by Program Builder
- Save 3 versions of the single email to be tested, so that stats can be tracked separately
- Create a complex campaign with 3 streams, one for each time being tested
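The three-stream setup above can be summarized as a simple mapping, each shared list paired with its own copy of the email and its own send time. This is a hypothetical sketch; the asset names and the noon slot are placeholders (the post only confirms that 2 PM and 4 PM ET were among the times tested):

```python
from datetime import time

# Placeholder names; only the 2 PM and 4 PM ET slots are confirmed in the post
streams = [
    {"segment": "Newsletter List A", "email": "Newsletter v1", "send_at": time(12, 0)},
    {"segment": "Newsletter List B", "email": "Newsletter v2", "send_at": time(14, 0)},
    {"segment": "Newsletter List C", "email": "Newsletter v3", "send_at": time(16, 0)},
]

for s in streams:
    print(f"{s['segment']} gets {s['email']} at {s['send_at']:%I:%M %p} ET")
```

Saving a separate email version per stream is what lets opens, clicks, bounces and unsubscribes be compared cleanly across the three send times.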
We ran this same process for 3 consecutive Fridays, testing the same 3 times of day. We found that 2 PM ET consistently saw the highest open and clickthrough rates, while 4 PM ET saw the highest bounce and unsubscribe rates. Armed with this knowledge, we changed our deployment approach and moved the needle closer toward our conversion goals.
Testing is never a 'one and done' deal. It's a process. As marketers, reaching your audience requires getting granular in your testing approach. Whether it's creative elements or deployment strategies, identifying what works is a marketer's top priority. Finding innovative use cases for the automation tools available to you? That's where all the fun is. Happy Testing!