3 Reasons Not To Run Conversion Tests


Testing is a hot topic. But before you decide what to test and which kind of test to run, step back and ask an even more basic question—whether to bother testing at all. In this article, I’ll take a quick look at three situations where the answer to “should I test?” is most likely a big, fat “no.”


The Current Visual Design Is Hopeless

This is more of a rant than a point: I think I’ve looked at one too many ugly web sites this week. But may I just throw this out there for consideration—why bother tweaking an obviously substandard page? One of the reasons to run tests is to discover whether a hypothesis is true. But what if there’s no doubt as to the outcome? What if a design is so bad there’s no question a simple design update would improve performance?

I’m talking about design that ignores basic best practices, such as this actual home page:

[Screenshot: a home page that ignores basic design best practices]

We have to admit to ourselves that some designs are just not ready for tweaking and testing. In the example above, where would you start? Would you try different colors for the content box borders? Would you change the word “find” to “search for” in the nearly invisible call to action? Add a big red action button? None of those changes would fix this page’s fundamental issues. But a solid redesign could.

An experienced visual designer will, just by virtue of training, lay the foundation of a well-optimized web page. They’ll create a professional, credible appearance, with clear structure and good visual hierarchy. They’ll align content to a grid and add content-rich images. Immediately, things will be better—and there’s no need to test it.

If you’re not sure whether your page or site falls into the “too ugly to test” category, ask a designer you trust to be brutally honest with you, or do a quick comparison of your competitors’ web sites. Take care of the basics first. Thank you.


You Can’t Make The Indicated Changes For Improvement

Many conversion optimization projects end at an unexpected time—during the presentation of successful test results. We’ve been there. Unaware that the project is already over, we’ve floated on a euphoric 50% conversion rate improvement, thinking that by sheer strength of numbers the proven changes will be fast-tracked and implemented. Then we learn that there are no people and no budget for implementation—only for testing.
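(As a quick aside: before championing a lift that size, it's worth checking that it isn't statistical noise. Below is a minimal sketch of a two-proportion z-test; the code and the figures are hypothetical illustrations, not numbers from the project described above.)

    // Two-proportion z-test: is the gap between two conversion rates
    // bigger than random noise would explain? (Illustrative sketch.)
    function zScore(convA: number, nA: number, convB: number, nB: number): number {
      const pA = convA / nA;                        // control conversion rate
      const pB = convB / nB;                        // treatment conversion rate
      const pooled = (convA + convB) / (nA + nB);   // pooled rate under "no difference"
      const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
      return (pB - pA) / se;
    }

    // Example: 2% control vs. 3% treatment, a 50% relative lift.
    const z = zScore(200, 10_000, 300, 10_000);
    console.log(z.toFixed(2)); // ~4.53; |z| > 1.96 is significant at the 95% level

With ten thousand visitors per variation, that lift clears the bar easily; with a few hundred per variation, the very same 50% lift usually wouldn't.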

Too few companies are both open and able to change. For some it’s a simple matter of inertia; as Paco Underhill put it in The Science of Shopping, “Not every organization welcomes data, especially when it may disagree with long-held beliefs and traditions.”

For other companies, a long deployment cycle or conflicting political goals stop conversion improvements in their tracks. In those cases, it’s unfortunate that test results don’t have any super powers of their own—if only numbers could stop speeding naysayers and leap tall corporate silos with a single bound!

One solution to dead-end testing is to plan for implementation at the same time you plan a test. As you write your hypothesis and create your test plan, ask, “If this blue button wins, how will I get it pushed to the production site? If this new page template wins, who needs to be involved with updating the CMS? Can I make the change myself, or do I need to hire someone?”

Your test plan may change or shrink as a result. You might strip out some nice-to-have options. But the overall effectiveness of your conversion optimization will be improved. No more dead-end tests!


There’s Only One Person On The Test Team: You

Conversion tests shouldn’t be limited to the skills of a single individual. I find it odd that while other, more mature industries acknowledge the need for a variety of roles in executing complex tasks, the web industry continues to hold up the solo jack-of-all-trades hacker or webmaster as an ideal. Doctors aren’t expected to perform surgery, run the front desk, and write their own scheduling software—why is the expectation so different for internet-related work?

The “do it all yourself” mindset is encouraged by some test platforms, too, with marketing that emphasizes their ease of use. Google Website Optimizer, for example, touts itself as requiring “no special software,” letting you “begin your first experiment in minutes.” A competitor, Visual Website Optimizer, uses the tagline “world’s easiest A/B testing software.”

Now, don’t get me wrong—both of these tools are useful and worth checking out. But the implication in these and similar claims is that if you have the right test platform all the rest—from strategy to design and results interpretation—will somehow fall into place on its own.
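To make concrete what the "easy" part actually covers, here is a minimal sketch of the two mechanics such platforms automate: stable variant assignment and conversion logging. The function names and hashing choice are my own illustrative assumptions, not either product's actual API.

    // What an A/B testing platform automates: give each visitor a variant
    // (consistently, so they never flip between versions mid-test) and
    // record conversions against that variant. Hypothetical sketch.

    type Variant = "control" | "treatment";

    function assignVariant(visitorId: string, treatmentShare = 0.5): Variant {
      // djb2 string hash; real platforms use stronger, salted hashing.
      let hash = 5381;
      for (const ch of visitorId) {
        hash = ((hash * 33) ^ ch.charCodeAt(0)) >>> 0;
      }
      const bucket = hash / 0xffffffff; // normalize to [0, 1]
      return bucket < treatmentShare ? "treatment" : "control";
    }

    function trackConversion(visitorId: string, goal: string): void {
      const variant = assignVariant(visitorId);
      // In practice this would be sent to an analytics endpoint.
      console.log(`goal=${goal} variant=${variant} visitor=${visitorId}`);
    }

    console.log(assignVariant("visitor-123")); // same input, same variant, every time

Everything outside those two functions (the hypothesis, the design, the sample size, the interpretation) is exactly the work the marketing copy glosses over.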

It’s a mindset that cuts two ways. On one hand it helps businesses overcome inertia and get the optimization ball rolling (finally!). That’s a good thing. On the other hand, though, it oversimplifies and even marginalizes the testing process. Expectations are set—testing can be easy and quick, therefore it should be easy and quick. And you should be able to run it.

But eventually you’ll run out of easy and quick tests to run. After all, there are only so many different button colors and headings you can try. And for projects that are more complex—and worthwhile, in my opinion—than an image swap, you’ll need more at your end of the field than a testing tool and your wits. You’ll need a team.

Who should be on your team? As an example, our larger projects typically include these core roles:

  • Analyst: gathers and analyzes performance and financial data
  • Test strategist: plans and manages the testing roadmap
  • Project manager: keeps individual roadmap initiatives moving forward
  • Usability researcher: gathers and analyzes audience intelligence
  • Visual designer: interprets test objectives into visual elements and layouts
  • Developer/interactive designer: interprets test objectives into interactivity; supports complex test installations and tracking requirements

Of course, your team might start out smaller. Roles can overlap. But the goal is to extend your reach so you aren’t restricted to only what’s easy and quick. Try new, difficult, and worthwhile ideas. Stretch. That’s when the real fun starts.




About the author

Sandra Niehaus
Contributor
