3 A/B Testing Mistakes You Might Be Making

Posted on March 6, 2015 | Updated on June 13, 2023

Want to avoid A/B testing mistakes? User experience is one of the most important factors in website design, yet it is usually approached with pleasing aesthetics in mind rather than treated as something data-driven. This is where A/B testing, or split testing, comes into play.

A/B testing is a process that lets you show alternative versions of a conversion tool, landing page, or product page to members of your target demographic and then determine which performs better. While mistakes are perfectly acceptable, the most dangerous ones are those you don’t realize you’re making.

The following article will identify and describe three common problem areas where the margin for error is high, and provide actionable solutions. For starters, we’ll look at something we’ve already hinted at.

1. Focusing on Design Instead of Solving Conversion Problems

Your website’s ultimate purpose is to convert its visitors. To do this, you need to become familiar with what makes your audience follow through on their purchase intent and at which point they abandon the checkout process.

The Problem:

Designers are all too aware that their websites have only five seconds to capture a visitor’s attention. This leads them to believe visual design is of paramount importance and that copy plays a less significant role in conversion. Not true.

When Alhan Keser of Widerfunnel.com first began conducting A/B testing, he assumed as much, and the results of his design- and usability-related tests were underwhelming. The problem with presenting two entirely different designs is that your test subjects respond to each on instinct, and roughly half navigate away from the page either way – it really is like tossing a coin.

This does not address the issue at hand – namely that sales are driven by reassurance. You’ll want to look at how bold headlines affect consumer motivation and how your tests – yes, multiple tests – can be carried out across other marketing channels.

The Solution:

Create A/B tests that display different calls to action and keep track of your analytics. Make sure you understand the basics of color theory in conversion so you can gain a much clearer picture of what’s going on. Then target areas specifically related to the checkout process.
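As a rough illustration, here is a minimal sketch of how a call-to-action test might be wired up, with each visitor bucketed deterministically so they always see the same variant. The variant copy, the in-memory tallies, and the record_* helpers are hypothetical stand-ins for whatever analytics tool you actually use.

```python
import hashlib

# Hypothetical CTA copy for the two variants under test.
VARIANTS = {"A": "Start your free trial", "B": "See plans and pricing"}

def assign_variant(visitor_id: str) -> str:
    """Hash the visitor ID so the same person always lands in the same bucket."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Simple in-memory tallies; a real test would log these to your analytics platform.
impressions = {"A": 0, "B": 0}
conversions = {"A": 0, "B": 0}

def record_impression(visitor_id: str) -> str:
    variant = assign_variant(visitor_id)
    impressions[variant] += 1
    return VARIANTS[variant]  # the CTA text to render for this visitor

def record_conversion(visitor_id: str) -> None:
    conversions[assign_variant(visitor_id)] += 1
```

Deterministic bucketing matters here: if a returning visitor sees a different call to action on each visit, you can no longer tell which version actually nudged them toward checkout.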

2. Not Distinguishing Between Mobile and Desktop Users

Almost everybody with access to the Internet will be familiar with the frustration of navigating websites that are not optimized for mobile devices. In many instances, the user will decide that even if they really want what your site is selling, it simply isn’t worth the hassle.

The Problem:

When your efforts in A/B testing pull from test groups that are exposed to content on different devices, your end results will be skewed. As a study by Pew Internet Research found, 34 percent of U.S. Internet users shop online mostly using their mobile phones. If you cannot control for this and target specific platforms, you will run into trouble.

The mistake is in believing that variations in screen size do not impact the results of your tests. Your initial data might look promising and prompt you to implement changes earlier than you otherwise would have, and this further compounds the distortion.

When sample sizes are restricted, you might be inclined to think your data is more accurate. However, even if your insights point toward positive results and report a 95 percent chance to beat the original, they may not be replicable.
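For context, that “95 percent” figure typically comes from a significance test on conversion counts. Below is a minimal sketch using a two-proportion z-test; the visitor and conversion numbers are invented purely for illustration.

```python
from math import sqrt, erf

def chance_to_beat_original(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """One-sided probability that variant B's conversion rate beats variant A's."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    std_err = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / std_err
    return 0.5 * (1 + erf(z / sqrt(2)))  # normal CDF of the z-score

# A 5% vs. 6% conversion rate looks convincing with 4,000 visitors per arm...
print(chance_to_beat_original(200, 4000, 240, 4000))  # ~0.975
# ...but the same lift over only 400 visitors per arm is close to a coin flip.
print(chance_to_beat_original(20, 400, 24, 400))      # ~0.73
```

The same observed lift can clear 95 percent on a large sample and fall far short on a small one, which is why an early “winner” from restricted data often fails to replicate.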

The Solution:

Take steps to ensure the users you target are actually exposed to the test variations before you introduce any additional variables, and track their behavior throughout.

Test your responsive designs by restricting the audience to mobile-only users, or establish a separate mobile-optimized website and conduct new tests accordingly. Just be aware of the situations that can arise when your data is gained from broad sources. Do your best to cater to specific browsing devices and consumer groups.
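One way to do this, sketched below, is to gate enrollment on the device type before assigning a variant. The user-agent keyword check and variant names are simplified assumptions; a production setup would lean on your testing platform’s built-in device targeting or a maintained user-agent library.

```python
import hashlib
from typing import Optional

# Crude keyword-based check; good enough to illustrate the idea.
MOBILE_HINTS = ("Mobi", "Android", "iPhone", "iPad")

def is_mobile(user_agent: str) -> bool:
    return any(hint in user_agent for hint in MOBILE_HINTS)

def maybe_enroll(visitor_id: str, user_agent: str) -> Optional[str]:
    """Enroll only mobile visitors, so desktop traffic can't skew the results."""
    if not is_mobile(user_agent):
        return None  # desktop users see the unchanged page and stay out of the data
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return "responsive_v1" if int(digest, 16) % 2 == 0 else "responsive_v2"
```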

3. Testing Micro-Conversions and Expecting Big Wins

Designers often fall into the mindset that one small change is all it would take to dramatically improve conversion rates. The idea is nice, but it simply isn’t a sound basis for conducting your tests.

The Problem:

While micro-conversions are important, they do not take into account the bigger picture. Thinking otherwise may lead you in a direction that ultimately doesn’t yield the results you expect in practice.

In the event this happens, you will have wasted hours of your time and will be left with unusable data. It may then take you some time to revise your testing strategies and come up with something that will work.

The Solution:

If small changes provided significant and sustainable gains over the long term, there would be no real need for anyone to conduct any A/B testing.

Focusing on macro-conversions can help maximize positive outcomes. As such, you should focus on more than just changing and testing the color of a button.

The Practical Aspects of Conducting A/B Testing

On a final note, remember to double- and triple-check that your outcomes are accurate before you make any changes to your designs or content.

Hopefully you have gained a few new insights into how you can carry out your A/B testing campaigns more effectively. As long as you learn from your past mistakes, your website will get to where it needs to be.

Want to start testing your site today? Right now, JotForm has partnered with Unbounce to help you A/B test your landing pages and web forms. They are offering 50% off of Unbounce for the first three months after your free trial. Click here to read more about the partnership and get the promo code.

About The Author

Eleanor Hecks is the Editor-in-Chief of Designerly Magazine, an online publication dedicated to providing in-depth content from the design and marketing industries. When she's not designing or writing code, you can find her re-reading the Harry Potter series, burning calories at a local Zumba class, or hanging out with her dogs, Bear and Lucy.
