Four Things You Can Do With Inconclusive Split Tests


Teens don't care, much like our search traffic.

Sometimes, our visitors don't care enough to choose what they want.

There is a certain sound a teenager makes when confronted with a choice they aren’t interested in making.

It is a sonic mix of indecision, ambivalence, condescension and that sound your finger makes when you pet a frog.

It is less committal than a shrug, less positive than a “Yes,” less negative than a “No” and is designed to prevent any decision whatsoever from being reached.

It comes out something like, “Meh”

Parent: “Would you like tacos, pasta or steak and lobster for dinner tonight?”

Teen: “Meh”

It is a word so flaccid that it doesn’t even deserve any punctuation. A period would clearly be too conclusive.

If you’ve done any testing at all, you know that your search traffic can give you a collective “Meh” as well. We scientists call this an inconclusive test.

It occurs when you put two or three good options out for a split test, drive search traffic to these options and — meh — none of the choices is preferred by your visitors.
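To put a number on that “meh”: many testing tools call a winner with something like a two-proportion z-test, and a test is inconclusive when the p-value never drops below your significance threshold. Here is a minimal sketch in Python, using entirely hypothetical visitor and conversion counts.

    # A minimal sketch of the two-proportion z-test many split-testing tools
    # run behind the scenes. All visitor and conversion counts are hypothetical.
    from math import sqrt, erf

    def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
        """Return (z, two-sided p-value) comparing two conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return z, p_value

    # 5,000 visitors per variation, nearly identical conversion rates.
    z, p = two_proportion_z_test(conv_a=150, n_a=5000, conv_b=158, n_b=5000)
    print(f"z = {z:.2f}, p = {p:.2f}")  # p is far above 0.05: a collective "meh"

Until that p-value (or your tool’s equivalent confidence figure) clears your threshold, neither variation has actually won.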

Whether you’re testing ad copy, landing pages, offers or keywords, nothing will deflate a conversion testing plan more than a series of inconclusive tests. This is especially true when your optimization program is young. Here are some things to consider in the face of an inconclusive test.

Add Something Really Different To The Mix

Subtlety is not the split tester’s friend. Your audience may not care whether your headline is set in a 16-point or an 18-point font. If you’re getting frequent inconclusive tests, one of two things is going on:

  1. You have a great control that is hard to beat, or
  2. You’re not stretching enough

Craft another treatment, something unexpected, and throw it into the mix. Consider a “well-crafted absurdity” a la Groupon. Make the call to action button really big. Offer something you think your audience wouldn’t want.

Segment Your Test

We recently spent several weeks of preparation, a full day of shooting, and thousands of dollars on talent and equipment to capture some tightly controlled footage for video tests on an apparel site. This is the sort of test that should be “too big to be inconclusive.” After all, video is currently a very good bet for converting more search traffic.

Yet our initial results showed that the pages with video weren’t converting at a significantly higher rate than the pages without video. Things changed, however, when we looked at individual segments.

New visitors liked long videos while returning visitors liked shorter ones. Subscribers converted at much higher rates when shown a video recipe with close-ups on the products. Visitors who entered on product pages converted for one kind of video, while those coming in through the home page preferred another.

It became clear that, when lumped together, one segment’s behavior was cancelling out gains by other segments.
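Here is a sketch of how that cancellation can look, with purely hypothetical numbers: each segment clearly prefers one version, yet the pooled totals come out dead even.

    # Hypothetical per-segment results for a video vs. no-video page test.
    # Each segment has a clear preference; the pooled numbers are identical.
    segments = {
        # segment: (video conversions, video visitors, control conversions, control visitors)
        "new visitors":       (100, 2000, 60, 2000),   # video wins with this group
        "returning visitors": (60, 2000, 100, 2000),   # the control wins here
    }

    totals = [0, 0, 0, 0]
    for name, counts in segments.items():
        cv, nv, cc, nc = counts
        print(f"{name:>18}: video {cv / nv:.1%} vs. no video {cc / nc:.1%}")
        totals = [t + x for t, x in zip(totals, counts)]

    cv, nv, cc, nc = totals
    print(f"{'all traffic':>18}: video {cv / nv:.1%} vs. no video {cc / nc:.1%}")  # 4.0% on both sides

Run the overall numbers and you see a tie; run the segments and you see two wins pointing in opposite directions.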

How can you dice up your traffic? How do different segments behave on your site?

Your analytics package can help you explore the different segments of your traffic. If you have buyer personas, target them with your ads and create a test just for them. Here are some ways to segment (a sketch of slicing test data along these lines follows the list):

  • New vs. returning visitors
  • Buyers vs. prospects
  • Which page did they land on?
  • Which product line did they visit?
  • Mobile vs. desktop
  • Mac vs. Windows
  • Members vs. non-members
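If your analytics or testing tool can export visitor-level data, a simple group-by gives you per-segment results. The sketch below assumes a hypothetical CSV export (test_results.csv) with one row per visitor and columns for the variation shown, a few of the segment dimensions above, and whether the visit converted.

    # A sketch of slicing split-test results by segment, assuming a hypothetical
    # per-visitor export (test_results.csv) with columns:
    # variation, visitor_type, device, entry_page, converted (0/1)
    import pandas as pd

    df = pd.read_csv("test_results.csv")

    # Conversion rate for each variation within each segment.
    by_segment = (
        df.groupby(["visitor_type", "variation"])["converted"]
          .agg(visitors="count", conversions="sum")
    )
    by_segment["conv_rate"] = by_segment["conversions"] / by_segment["visitors"]
    print(by_segment)

    # Swap "visitor_type" for "device" or "entry_page" to see whether a flat
    # overall result is hiding a winner inside one slice of your traffic.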

Measure Beyond The Click

In an email test we conducted for a major energy company, we wanted to know if a change in the subject line would impact sales of a smart home thermostat. Everything else about the emails and the landing pages was identical.

The two best-performing emails had very different subject lines but identical open rates and click-through rates. However, sales for one of the email treatments were significantly higher. The winning subject line had delivered the same number of clicks but had primed the visitors in some way, making them more likely to buy.

If you are measuring the success of your tests based on clicks, you may be missing the true results. Yes, it is often more difficult to measure through to purchase, subscription or registration. However, it really does tell you which version of a test is delivering to the bottom line. Clicks are only predictive.
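If your platform reports clicks and downstream conversions separately, the same kind of significance test can be pointed at both. Here is a sketch using SciPy’s chi-square test on a 2×2 table, with hypothetical email numbers in the spirit of the test above: the click comparison is flat while the sales comparison is decisive.

    # Hypothetical numbers for two email subject lines: near-identical clicks,
    # very different sales. Judging by clicks alone would call this test a tie.
    from scipy.stats import chi2_contingency

    def compare(label, success_a, success_b, n_a, n_b):
        table = [[success_a, n_a - success_a],
                 [success_b, n_b - success_b]]
        _, p, _, _ = chi2_contingency(table)
        print(f"{label}: A {success_a / n_a:.2%} vs. B {success_b / n_b:.2%} (p = {p:.3f})")

    sends = 20_000                                 # emails delivered per subject line
    compare("clicks", 1_000, 1_010, sends, sends)  # flat: inconclusive on clicks
    compare("sales",     90,   140, sends, sends)  # a clear winner on revenue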

Print A T-shirt That Says “My Control Is Unbeatable”

Ultimately, you may just have to live with your inconclusive tests. Every test tells you something about your audience. If your audience didn’t care how big the product image was, you’ve learned that they may care more about changes in copy. If they don’t know the difference between 50% off and $15.00 off, test offers that aren’t price-oriented.

Make sure that the organization knows you’ve learned something, and celebrate the fact that you have an unbeatable control. Don’t let “Meh” slow your momentum. Keep plugging away until that unexpected test gives you a big win.

Photo courtesy duchesssa



About the author

Brian Massey
Contributor
Brian Massey is the Conversion Scientist at Conversion Sciences and author of Your Customer Creation Equation: Unexpected Website Formulas of The Conversion Scientist. Follow Brian at The Conversion Scientist blog and on Twitter @bmassey.
