Marketing gurus tout A/B testing as a critical tactic for improving results. Marketers frequently conduct A/B tests to find the best-performing email subject lines, landing pages, advertising copy, other types of content, and pricing. About two-thirds of brand marketers use A/B testing, according to research by Econsultancy and Red Eye. PR sometimes uses A/B testing for copy on pitches to journalists and on headline variations for news releases.
Yet most A/B tests fail to produce statistically significant results. Most lack a clear conclusion that either option being tested is better than the other. Fewer than 20 percent of 3,900 marketers surveyed by UserTester reported that their A/B tests produce significant results 80 percent of the time, according to eMarketer.
A previous analysis by Appsumo, a web app deals website, concluded that only one of every eight A/B tests leads to significant change. Other research also indicates that most A/B tests generate inconclusive results.
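To see what "statistically significant" means in practice, here is a minimal sketch of the standard two-proportion z-test often used to evaluate A/B results. The function name and the example numbers are illustrative, not from the research cited above; the sketch assumes a simple two-variant test with conversion counts and visitor counts for each variant.

```python
import math

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for a simple A/B test.

    conv_a, conv_b: conversions observed in variants A and B.
    n_a, n_b: visitors shown variants A and B.
    Returns a p-value; values above roughly 0.05 are conventionally
    treated as inconclusive.
    """
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# A 5% -> 6% lift on 1,000 visitors per variant looks promising but
# yields a p-value well above 0.05, i.e. an inconclusive test:
p = ab_test_p_value(50, 1000, 60, 1000)
```

Running the same numbers with a much larger lift or far more traffic pushes the p-value below 0.05, which is why small tweaks on modest traffic so often end inconclusively.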
But experts urge marketers not to abandon A/B testing. Eliminating A/B testing would be irresponsible, argues John Donahue, chief product officer of programmatic platform Sonobi. A/B testing has been known to save marketers 40 percent of their advertising budgets on ad platforms like Facebook.
“The benefits of A/B testing are undeniable,” Donahue told eMarketer. “Developing any creative project there are a lot of assumptions. A/B testing allows you to remove those assumptions.”
How to Get Conclusive A/B Test Results
Experts offer these recommendations to design A/B tests that provide meaningful results:
Create boldly different choices. Many tests attempt to measure choices that customers don’t care about or differences customers barely notice. Perhaps website visitors don’t care about the color of the call to action button. Small differences can produce meaningful results for major companies with enormous traffic, but for most businesses the tweaks cause no noticeable improvement.
Be patient. Realize that obtaining conclusive results may require a few thousand website visits or a couple of weeks of traffic. Test substantial changes to avoid spending time waiting for small improvements.
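The advice to test substantial changes can be made concrete with a back-of-the-envelope sample-size estimate. This sketch uses the standard formula for comparing two proportions at 95% confidence and 80% power; the function name and example rates are illustrative assumptions, not figures from the article.

```python
import math

def visitors_per_variant(base_rate, relative_lift,
                         alpha_z=1.96, power_z=0.84):
    """Rough visitors needed per variant to detect a relative lift.

    base_rate: control conversion rate (e.g. 0.05 for 5%).
    relative_lift: improvement to detect (e.g. 0.10 for a 10% lift).
    Defaults (z = 1.96, 0.84) correspond to 95% confidence, 80% power.
    """
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((alpha_z + power_z) ** 2 * variance / (p2 - p1) ** 2)

# Detecting a small 10% relative lift on a 5% base rate takes roughly
# 31,000 visitors per variant; a bold 50% lift needs under 1,500.
small_change = visitors_per_variant(0.05, 0.10)
bold_change = visitors_per_variant(0.05, 0.50)
```

The contrast between the two estimates is the whole point: bold variations need a fraction of the traffic, which is why small sites in particular should test big differences.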
Remain persistent. Accept inconclusive results as part of marketing analysis. Some managers may consider a test that shows inconclusive results a failure. It’s not. It shows that what was tested has little influence, and that’s valuable information.
Test for sensitivity. Test a specific hypothesis and identify which elements impact results and which elements don’t “move the needle” and change consumer choices, explains Claire Vo, vice president of product management at Optimizely. Stay disciplined and keep an organized list of what impacts results.
Consider segmenting data. Examine test results across segments like devices, traffic sources and other factors, suggests Brian Massey at Conversion Sciences. Keep in mind that segments need to have sufficient sample size to produce conclusive results. Beware of implementing changes for segments that don’t drive significant revenue or leads.
If tests don’t reveal a definite winner, keep the original version (the control). That’s simpler and conserves resources. Or feel free to pick your favorite version and use it as the control going forward.
Bottom Line: Experts say A/B testing is the secret to greatly increasing marketing results. The real secret is that most A/B tests show inconclusive results and rarely lead to increased conversions. But don’t abandon A/B testing. These tips can help marketers conduct tests that deliver meaningful insights.
William J. Comcowich founded and served as CEO of CyberAlert LLC, the predecessor of Glean.info. He is currently serving as Interim CEO and member of the Board of Directors. Glean.info provides customized media monitoring, media measurement and analytics solutions across all types of traditional and social media.