Advanced testing: What 4,000 A/B tests can teach you
Contributor Jason Puckett summarizes a presentation from SMX Advanced 2017 on lessons learned from 11 years of conversion optimization. In it, speaker Ayat Shukairy explains what's critical for the success of any A/B testing program.
SMX Advanced has always featured the industry’s top talent discussing what they’re passionate about: search marketing. Last week’s conference in Seattle was no different.
Ayat Shukairy, co-founder and vice president of client solutions at Invesp, shared nine lessons from 11 years of A/B testing experience. The presentation aimed to highlight the many nuances that go into a successful A/B testing plan. After 4,000 successful conversion rate optimization (CRO) implementations, Shukairy has developed quite a knowledge base.
Below I have listed Shukairy’s nine key takeaways, along with some additional insights and commentary:
1. We often focus on the wrong thing
Shukairy advised us to stop focusing on the short term. We need to keep our long-term testing plans in mind when executing our strategies.
Additionally, Shukairy reminded the audience to be skeptical of giant lifts in performance; if you see a huge lift in conversion rates (CVR), then check to make sure that (a) you’ve run the test long enough to produce meaningful results, and (b) this test is not an outlier.
2. Create a compelling website narrative
There are three types of A/B testing: element-level testing, page-level testing (layout) and visitor flow testing. Shukairy notes that page-level testing can yield a 7 to 9 percent improvement in CVR, while visitor flow testing tends to produce larger gains (successful implementations can show 16 to 18 percent improvement).
However, even the best testers will reach a threshold without a compelling narrative. Define your narrative and “wow” your visitors.
3. Don’t assume you live in a vacuum
Your website experience is only a single touch point for customers, so it's important to consider your website within the context of your entire brand. Look at your brand from the perspective of potential customers and ask yourself, "How does my brand look to my audience?" Understand every touch point and its impact on the consumer, since each one can affect conversion rates.
4. Don’t assume you know everything about customers
Listen to customers through qualitative research. Conduct polls and surveys to uncover motivations before launching A/B tests on your web experience. Well-thought-out market research is essential to developing a good hypothesis.
5. Obtain buy-in from everyone
Make sure you get buy-in from all client stakeholders regarding the philosophies of CRO. You need people across departments to be invested in order to succeed. The more information, discussion and engagement you receive from different departments, the more successful your CRO efforts will be.
6. Embrace failure
Shukairy noted that only 13 percent of A/B tests report significant uplifts — which means that 87 percent do not. Let this fact change the way you think about CRO. More often than not, your tests will fail to drive meaningful growth. That means you must push through and innovate on your failure, using failed A/B tests to generate research opportunities.
7. Align your website objectives with overall business KPIs
A testing tool is only as useful as the actionable insights obtained from it. So, when measuring the results of your A/B tests, ask yourself, “What is the dollar impact?” If you can’t answer this question, you may need to update your reports to ensure that you are measuring KPIs (key performance indicators) that most closely align with business objectives.
Use data to uncover problems that are hindering growth, and make sure the problems you solve have a direct impact on the KPIs that matter most.
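To make the dollar question concrete, here is a hypothetical back-of-the-envelope calculation. The traffic, conversion rate, lift and order-value figures below are illustrative assumptions, not numbers from the presentation; the point is simply how a CVR lift translates into revenue language stakeholders understand.

```python
# Hypothetical example: translating a CVR lift into a projected dollar impact.
# All inputs are assumed values for illustration only.
monthly_visitors = 200_000      # assumed traffic to the tested page
baseline_cvr = 0.025            # 2.5% conversion rate before the test
observed_lift = 0.08            # +8% relative lift from the winning variation
average_order_value = 60.00     # assumed revenue per conversion

incremental_conversions = monthly_visitors * baseline_cvr * observed_lift
incremental_revenue = incremental_conversions * average_order_value
print(f"Projected incremental revenue: ${incremental_revenue:,.0f}/month")
# 200,000 * 0.025 * 0.08 = 400 extra conversions -> $24,000/month
```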
8. Understand A/B testing statistics
There is a key statistic that most testers underutilize: statistical power. Statistical power is the probability that a statistical test will detect an effect when there is, in fact, an effect to be detected. A high statistical power reduces the likelihood of making a Type II error (concluding there is no effect when there actually is one).
A high confidence threshold guards against false positives (Type I errors), while high statistical power guards against false negatives, so always collect a sufficient sample size before you declare a result.
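As an illustration of how baseline conversion rate, minimum detectable lift, confidence and power interact, here is a minimal sketch (my own, not from Shukairy's talk) that estimates the per-variation sample size using the standard two-proportion z-test approximation:

```python
# A sketch of pre-test sample-size planning for an A/B test on conversion rate,
# using the common two-proportion z-test approximation.
from statistics import NormalDist

def sample_size_per_variation(baseline_cvr: float,
                              minimum_lift: float,
                              alpha: float = 0.05,
                              power: float = 0.80) -> int:
    """Visitors needed in EACH variation to detect a relative lift.

    baseline_cvr : current conversion rate, e.g. 0.03 for 3%
    minimum_lift : smallest relative improvement worth detecting, e.g. 0.10 for +10%
    alpha        : two-sided false-positive rate, i.e. 1 - confidence
    power        : probability of detecting the lift if it truly exists
    """
    p1 = baseline_cvr
    p2 = baseline_cvr * (1 + minimum_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for 95% confidence
    z_power = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_power) ** 2 * variance) / (p2 - p1) ** 2
    return int(n) + 1

# Example: a 3% baseline CVR and a +10% relative lift needs roughly 53,000
# visitors per variation at 95% confidence and 80% power.
print(sample_size_per_variation(0.03, 0.10))
```

Running the numbers this way shows why "run the test long enough" matters: detecting a modest lift on a low-conversion page can require tens of thousands of visitors per variation before the result is trustworthy.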
9. Be careful with pollution of your A/B testing results
Run tests longer, and don’t make changes while the test is running. Any change you make to the test environment during a test will pollute your results. This includes trimming off under-performing variations, changing the allocation percentage of traffic and so on.
If you absolutely need to make a change to the environment, you will need to restart your tests and begin with a clean sample.
Thinking differently about CRO
All in all, I thought Ayat Shukairy gave a wonderful presentation. After it was done, I asked her, “What overarching piece of advice would you give to A/B testers?”
She replied, “Change the way you think about conversion rate optimization.”