Over the years, I've talked to a lot of companies who seem to struggle with A/B testing. They test idea after idea, but nothing they try really improves their conversion rate.
Worse still, sometimes their conversion rates actually decrease over time.
They read A/B testing case studies and do their best to apply what they learn to their variants, but even that seems to fail them.
Sure, they might have the occasional token success, but by and large, their conversion rate optimization (CRO) efforts fall flat.
What's the problem?
It's frustrating, but unfortunately, only 1 in every 7 A/B tests produces meaningful results. With statistics like that floating around, it almost seems like CRO is just a numbers game—run enough tests and eventually you'll end up with a winner.
However, the key to successful A/B testing isn't more tests; it's strategic testing.
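Part of the problem is that many "winning" tests are declared before the results are statistically meaningful. As a rough sketch (the traffic and conversion numbers below are hypothetical), a standard two-proportion z-test shows how easily an impressive-looking lift can fail to clear the significance bar:

```python
from math import sqrt, erf

def ab_test_significance(conv_a, visits_a, conv_b, visits_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, p_value

# Hypothetical test: 1,000 visitors per variant
p_a, p_b, p = ab_test_significance(50, 1000, 61, 1000)
print(f"A: {p_a:.1%}, B: {p_b:.1%}, p-value: {p:.3f}")
```

Notice that a 22% relative lift (5.0% vs. 6.1%) on 1,000 visitors per variant comes back with p ≈ 0.28, nowhere near significant. Running more random tests doesn't fix that; running deliberate tests with enough traffic does.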
For example, at Disruptive, we recently improved a client's conversion rate by 22%.
At first glance, you might think, "What a great case study! How did you get those kinds of results?"
And, if you really want to know what we did, the answer is fairly simple: we got rid of the testimonials.
Isn't that exciting? Testimonials reduce conversion rates!
If you haven't tested getting rid of your testimonials yet, you should go out and try eliminating the testimonials from your site. After all, your testimonials could be reducing your conversion rate by 22% or more...right?
Well, not really.
Generally speaking, testimonials improve conversion rates. In fact, when we first started working with this client, if you had told me their testimonials were reducing their conversion rate, I probably would have argued the point with you.
So, if we were pretty sure that their testimonials were improving their conversion rate, why did we try getting rid of them?
Well, it wasn't just a random guess. We tested a lot of other hypotheses first...
As you can see from this GIF, we worked through over a dozen versions of this page before we finally tried eliminating the testimonials.
Not every test improved the client's conversion rate, but every test taught us something about our target audience and helped us uncover what our traffic really wanted out of their site experience.
Effective A/B testing teaches you something with every test. But, if you want to learn something from every test, you can't just test random ideas—you need to test strategically.
Sure, strategic testing takes some extra planning and documentation, but in the long run, it will save you a lot of time and failed tests.
There are four basic parts to a solid testing strategy:

1. Know your audience
2. Define your goals
3. Form your hypotheses
4. Document and learn from every test
Before you even start to brainstorm testing ideas, you need to create a detailed buyer persona.
Essentially, your buyer persona gives you a structure for defining your testing hypotheses and a framework for understanding your results.
At a minimum, your buyer persona should cover the following:
You might need to talk to your current customers or your sales team to get this information, but knowing your target audience can help you produce great testing results much more quickly.
Thanks to some detailed discussions with this well-informed client, we knew a lot about the audience they were targeting. On average, our target audience was middle-aged men and women with money to invest.
Our audience wanted to be smart with their money, but they also recognized that they didn't know enough to invest their money wisely on their own.
With that knowledge in hand, we were ready to define some goals for our tests.
If you don't know what you're trying to achieve with your test, it's hard to build variants that produce meaningful results.
Now, you probably have a good idea of what you want to achieve with your tests, but let's spell out a few specifics.
Defining your overall goals and the specific steps your potential customers need to take on the path to reaching those goals will give you clear insight into what sorts of changes you should be testing.
In this client's case, we defined success as increased sales (see, I told you "more sales" was the best answer). To achieve that goal, however, we needed the client's website to produce more qualified leads for their sales team.
Additionally, the client had a lot of different offers that potential clients could choose from, so the page needed to identify which option or options potential clients were the most interested in.
Now that we knew who we were targeting and what our goals were, it was time to come up with some hypotheses.
At this point, your job is to hypothesize which factors on your site are preventing your target audience from doing what you want them to do, and how you can eliminate or reduce those factors.
Here are some of the hypotheses we came up with:
Armed with these (and a variety of other hypotheses), we were ready to start testing.
Your testing strategy doesn't end when your tests start. You need to document everything and use what you learn to develop new hypotheses and tests.
Depending on how you like to do things, your documentation can be fairly simple or quite complex, but your approach needs to be methodical—each test needs to teach you something that you can use to produce better results from your next test.
For example, here's how you might track the results of a series of call-to-action tests:
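A spreadsheet works fine for this, but even a simple script can keep the log consistent. Here's a minimal sketch (the field names and test entries are illustrative, not a standard format) that records each test's hypothesis, variant, results, and the lesson it taught:

```python
import csv

# Hypothetical log of CTA tests; field names are illustrative
FIELDS = ["test_id", "hypothesis", "variant", "visitors",
          "conversions", "conv_rate", "lesson_learned"]

tests = [
    {"test_id": 1,
     "hypothesis": "A benefit-driven CTA outperforms a generic 'Submit'",
     "variant": "'Get My Free Assessment' button",
     "visitors": 2400, "conversions": 96,
     "lesson_learned": "Audience responds to concrete benefits"},
    {"test_id": 2,
     "hypothesis": "Letting visitors choose an offer increases form fills",
     "variant": "Offer-selection dropdown above the form",
     "visitors": 2500, "conversions": 130,
     "lesson_learned": "Audience wants to stay in the driver's seat"},
]

with open("cta_test_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    for t in tests:
        # Compute the conversion rate so every row is logged the same way
        t["conv_rate"] = f"{t['conversions'] / t['visitors']:.2%}"
        writer.writerow(t)
```

The exact tool matters far less than the habit: every row should connect a hypothesis to a result and a lesson, so the next test builds on the last one instead of starting from scratch.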
Yes, it takes some extra effort to document everything, but it is both easier and more effective than simply testing new ideas at random. Plus, if anyone ever asks you why your page is set up the way it is, you'll have a solid, data-driven answer.
Once we started testing, we discovered that our audience converted best when we put them in the driver's seat. A strong sell was a major turnoff to this audience.
As a result, we ultimately ended up with a simple CTA that allowed our audience to indicate what they were interested in before we asked them to submit their information.
We also discovered that our target audience wanted something easy, so we made sure our headline and body copy emphasized how easy and quick the client's services were.
After a certain point, however, there was only so much that we could say to make it clear that working with our client's business was easy and simple.
So, we tried eliminating page elements to see if streamlining the landing page would help reinforce the idea that our client was completely focused on creating an easy, painless experience for their customers.
As a result, our conversion rate improved by 22%—but that was hardly a surprise.
After learning so much about what our target audience wanted from our site, we knew that a simple, focused page would convince our client's potential customers to convert...and it did.
Successful A/B testing isn't the result of luck; it's the result of great strategy.
With a great testing strategy, it doesn't matter if you don't always get case study-worthy results with every test. Over time, you'll be able to figure out exactly what your audience wants and create a user experience that dramatically improves your conversion rate.
You’ve heard my two cents, now I want to hear yours.
Do you agree with this approach? How do you set your testing efforts up for success?