Are You a Victim of Your Own A/B Test’s Deception?

A/B testing is all the rage, and for good reason. If tweaking your home page a bit can get you 25% more signups, who wouldn’t try it?

The best thing about A/B testing is the awesome selection of tools. Optimizely provides a live editing tool that puts page tweaks and goal tracking in the hands of marketers. Visual Website Optimizer offers a suite of interesting measurement tools, including behavioral targeting, which allows you to show different variations depending on a visitor’s actions.

Even with such great technology available, there are a few things to watch out for. The first is statistical significance, which has been written about enough (here, here and a mini-site here if you’re interested).

Another is the common mistake of assigning a goal that measures the short-term effect of a test rather than the long-term effect on your business. We made this mistake at Segment.io, and that’s the story I’ll be sharing in this article.

The Winning Variation is Wrong

Usually the goal of an A/B test is to get people to take a single action on a single page. Common actions include clicking the signup button on your home page or joining an email list. Those actions are great vanity metrics, but the fact is that more visits to your signup page or a bigger email list aren’t very sound business goals.

The problem with the single-action approach is that it assumes a single action provides value to your business, which it usually doesn’t. Most A/B tests are done at the top of an acquisition funnel, long before visitors have proven their worth.

The goal of an A/B test should be to move the visitors who are most likely to become high-value customers from the top of your funnel to the bottom of your funnel.

How We Messed This Up

I’ll share a super simple experiment we did at Segment.io that illustrates my point. We recently ran an A/B test on our shiny new home page. Our test was simple: we created two variations of the signup button text. The control version read “Get Started,” and the variation we tested against it read “Create Free Account.”

Before long, “Create Free Account” beat “Get Started” with a 21% increase in conversions. Time to call our developers and make it permanent, right? For most people that would be the next step. But, being an analytics company, we always have an abundance of nerds around ready to dig deeper into our data.

To make analysis easier, we tagged each tested visitor with the variation they were shown. And, since Segment.io automatically sends Optimizely variations through to KISSmetrics, Customer.io and Intercom, we were able to segment out visitors who saw each variation in all of our tools.
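
If you’re using Segment’s analytics.js on the page, this tagging step is small. Here’s a rough sketch of the idea; the `getVariationName` helper is a placeholder for however your Optimizely snippet exposes the active variation (it’s not a real Optimizely API), and the property name simply mirrors the one we used:

```typescript
// Rough sketch: tag each tested visitor with the variation they were shown,
// so downstream tools (KISSmetrics, Customer.io, Intercom, ...) can segment on it.
// Assumes Segment's analytics.js snippet is already loaded on the page.

declare const analytics: {
  identify: (traits: Record<string, string>) => void;
  track: (event: string, properties?: Record<string, string>) => void;
};

// Placeholder: however your Optimizely setup exposes the active variation name.
function getVariationName(): string {
  return (window as any).homePageCtaVariation ?? "Get Started";
}

const variation = getVariationName();

// Attach the variation to the visitor's profile as a trait...
analytics.identify({ "Experiment home page CTA": variation });

// ...and to the funnel events, so funnels can be split by that property later.
analytics.track("Viewed Home Page", { "Experiment home page CTA": variation });
```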

How We Found the Real Winner

First, we looked at the immediate “next step” for visitors after they clicked on the call to action. KISSmetrics funnels were our tool of choice for this analysis. We used a simple funnel of Viewed Home Page > Viewed Signup Page > Signed Up, and split out the funnel by the “Experiment home page CTA” property. Half of the visitors at the top of the funnel had the value “Get Started” and the other half were tagged as “Create Free Account.”

It turned out that the visitors who clicked “Create Free Account” were less likely to complete the signup form. This drop in form completions effectively wiped out the 21% gain that button had made in our home page A/B test, which meant there was no longer a clear winner between “Create Free Account” and “Get Started.”
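
To see how that can happen, here’s a back-of-the-envelope calculation with made-up numbers (illustrative only, not our actual traffic): a 21% lift in clicks combined with a lower form-completion rate produces essentially the same number of signups.

```typescript
// Illustrative numbers only: a 21% click lift cancelled out by a lower
// form-completion rate one step down the funnel.

const visitorsPerVariation = 10_000;

const control = { clickRate: 0.100, formCompletionRate: 0.80 }; // "Get Started"
const variant = { clickRate: 0.121, formCompletionRate: 0.66 }; // "Create Free Account" (+21% clicks)

function signups(v: { clickRate: number; formCompletionRate: number }): number {
  return Math.round(visitorsPerVariation * v.clickRate * v.formCompletionRate);
}

console.log("Get Started:         ", signups(control)); // 800 signups
console.log("Create Free Account: ", signups(variant)); // ~799 signups — the click gain is gone
```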

But there was one last thing to examine: the pricing plan people ultimately chose. It turned out that visitors who clicked “Create Free Account” were much less likely to sign up for a paid plan than those who clicked “Get Started.”

So, in the end, the real winner for us was “Get Started.”

How to Avoid This in Your Business

Watch all of your results! Be especially wary of optimizing for a single click or action. Remember, a single click usually does not provide direct value to your business. Long-term gains are always more important than short-term conversion wins.

Don’t decide an A/B test based solely on an increase in clicks, opt-ins or signups. Tag visitors with the A/B test variation they saw and watch for unintended consequences of the tests you run. A full-page opt-in form might grow your email list, but what if it erodes the value of your user base?

Here’s a checklist to help you find the real winner in your A/B tests:

  1. Save test variations to user profiles with a tool like KISSmetrics.
  2. Watch the effect of each test variation all the way through your acquisition funnel.
  3. Look for unintended consequences of tests, like poor user engagement or lower referral rates.
  4. After a few months have passed, check the lifetime value and churn rate of users for each variation (see the sketch after this list).
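
For that last step, once the variation is saved on each user’s profile, the comparison itself is a simple group-by. Here’s a rough sketch, assuming you can export user records (from your analytics tool or data warehouse) with the saved experiment property, revenue to date, and a churned flag; the field names are hypothetical:

```typescript
// Rough sketch: compare average lifetime value and churn rate per test variation.
// Assumes exported user records with hypothetical fields: the saved variation,
// revenue collected so far, and whether the user has churned.

interface UserRecord {
  variation: string;      // e.g. "Get Started" or "Create Free Account"
  revenueToDate: number;  // proxy for lifetime value so far
  churned: boolean;
}

function summarizeByVariation(users: UserRecord[]) {
  const groups = new Map<string, UserRecord[]>();
  for (const user of users) {
    const group = groups.get(user.variation) ?? [];
    group.push(user);
    groups.set(user.variation, group);
  }

  const summary = new Map<string, { avgLtv: number; churnRate: number }>();
  groups.forEach((group, variation) => {
    const avgLtv = group.reduce((sum, u) => sum + u.revenueToDate, 0) / group.length;
    const churnRate = group.filter((u) => u.churned).length / group.length;
    summary.set(variation, { avgLtv, churnRate });
  });
  return summary;
}

// Example with made-up records:
console.log(summarizeByVariation([
  { variation: "Get Started", revenueToDate: 120, churned: false },
  { variation: "Create Free Account", revenueToDate: 0, churned: true },
]));
```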

If you have questions about how to set up any of this, I’ll be watching the comments on this post.

About the Author: Jake Peterson leads customer success at Segment.io, helping thousands of customers choose and set up analytics and marketing tools. If you’re looking for free advice, check out their Analytics Academy. Segment.io is a single, simple integration that gives you access to 70+ analytics and marketing tools with the flick of a switch. Check it out here.

