
Why Your A/B Tests Are Failing

In a 2013 study by eConsultancy & RedEye, which surveyed almost 1,000 client-side and agency marketers, 60% of respondents rated A/B testing as “quite valuable” to their business.

Yet, only just over a quarter (28%) report being satisfied with their conversion rates.

[Chart: Econsultancy/RedEye, “Most Valuable Methods for Improving Conversion Rates,” November 2013]

What’s interesting is that another study, by VWO, found that only 1 in every 7 A/B tests produces a statistically significant winner. In our own research at Convert.com, we analysed 700 experiments and again found that only 1 out of 7 experiments run (14%) made a positive impact on the conversion rate, across all clients that were not agencies.

But here’s where it gets interesting: for the clients who did use an agency that specialized in conversion optimization, 1 out of every 3 tests (33%) drove a statistically significant result.

Yes, you are reading this right: conversion optimization agencies have almost triple the chance of a positive A/B test outcome compared to other clients using the same A/B testing tool.
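
To make “statistically significant” a little more concrete, here is a minimal sketch of a two-sided, two-proportion z-test on hypothetical visitor and conversion counts. It is a generic textbook calculation, not the statistics engine used by Convert.com, VWO or any other testing tool.

```python
import math

def two_proportion_p_value(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided, two-proportion z-test on observed conversion counts."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis that A and B convert equally well
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / std_err
    # Convert |z| into a two-tailed p-value using the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical counts: control converts 200/10,000, variant converts 245/10,000
p = two_proportion_p_value(200, 10_000, 245, 10_000)
print(f"p-value: {p:.3f}")  # below 0.05 is what most tools report as "significant"
```

A low p-value only tells you the observed difference is unlikely to be random noise; as the editor’s note below explains, it says nothing about whether the variation actually makes you more money.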


What makes these agencies win? I’ll be sharing everything we learned from our research.

(Editor’s Note: As Matt Gershoff points out in his article about P-Values, just because something is statistically significant does not necessarily mean it is an actual winner.

It simply means that the probability of more people taking the desired action is greater, if more people are exposed to that specific variant.

Peep has also said that “conversion” on its own is a silly metric: if you have a variation that offers 90% off full price, of course the variation will “win,” but if it does not result in extra revenue, then the metric on its own is bullshit.

That said, in light of the research I discovered for my recent article on conversion testing frameworks, showing that only 13% of companies are basing their testing on extensive historical data, I think there is still a ton you can learn from the rest of this article. Ok, now back to Dennis.)

A/B Testing is Just 20% of the Conversion Process

As CEO of Convert.com, I get every single stat from every single test.

One thing I can tell you quite honestly is that the customers who come to our tool (or any other, for that matter) expecting an A/B testing platform to do the magic for them barely ever see the success they were hoping for.

A/B testing in and of itself is the last part of the process, not the first. So before you become a client of an A/B testing platform, please at a minimum also have the following:

  • some kind of usability testing tool
  • heat mapping software
  • some level of understanding of your analytics
[Image: heatmap of a pricing page]

It’s only once you understand behavior that you stand a chance of implementing more successful tests. Here is what we see the best clients of Convert do with their time:

  • 60% of the total conversion optimization project time goes to managing the project and navigating client internal politics.
  • 20% of the time is focused on understanding the problems on a page and getting insights about the visitor behavior.
  • 20% is for designing, developing, testing and reporting.

The last 20% of the time involves A/B testing and the actual test setup. In actuality, the development of challenger pages might only be 10% of the total project time. So please don’t get carried away when people like Peep or myself give one of these “stand-up” landing page evaluations at a conference or on Page Fights.

These are fun, but only scratch the surface of what goes into the actual conversion optimization process. Critiques and expert opinions are there to get clients interested in the topic of conversion rate optimization, but they work against us all once the process actually starts, since it’s not nearly as glamorous as we make it sound.

CRO is a process, most of which involves staring at analytics, looking at heatmaps and watching user sessions. This is what watching sessions looks like using a tool like Mouseflow.

Exciting right? Here’s what I see our most successful agencies do to get such great results:

Understanding Your Analytics

When MarketingSherpa released their E-commerce Benchmark Study, one graph that drew my attention was the one below.

[Chart from the MarketingSherpa E-commerce Benchmark Study]

Of the 1,657 companies surveyed, five out of the six groups of top revenue-improving businesses roughly doubled their chance of optimization success by using extensive historical data rather than relying on intuition. Yet the same MECLabs study showed that, even though the likelihood of success is far higher with historical data, the majority of companies rely on intuition and best practices for optimization: only 13% use extensive historical data as part of their testing strategy. There are tons of great examples on this blog of companies using their analytics data to formulate successful tests.

One that stands out is from Casey Armstrong’s recent article on reducing churn, where Groove used what they call “Red Flag Metrics” to identify churn behavior and proactively win back customers as they started to show the same signs as those who previously canceled.
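
Groove’s actual red flag metrics are their own; the sketch below only illustrates the shape of the idea, with made-up field names and thresholds, so you can see how observed churn behavior turns into a proactive win-back list.

```python
from dataclasses import dataclass

@dataclass
class AccountActivity:
    account_id: str
    avg_session_minutes: float   # average session length over the last 30 days
    logins_last_30_days: int

def is_red_flagged(a: AccountActivity) -> bool:
    # Hypothetical thresholds; in practice they come from comparing the
    # behavior of customers who canceled against those who stayed.
    return a.avg_session_minutes < 1.0 or a.logins_last_30_days < 4

accounts = [
    AccountActivity("acct-101", avg_session_minutes=0.4, logins_last_30_days=2),
    AccountActivity("acct-102", avg_session_minutes=7.8, logins_last_30_days=19),
]

win_back_list = [a.account_id for a in accounts if is_red_flagged(a)]
print(win_back_list)  # ['acct-101'] -> candidates for a proactive check-in email
```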

Takeaway: For the agencies that run more successful tests, that success is primarily attributed to the test being a direct response to observed data within specific segments of traffic on the site.
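
As a rough illustration of what “observed data within specific segments” can look like, here is a minimal, standard-library-only sketch that rolls an exported list of sessions up into per-segment conversion rates. The field names and numbers are hypothetical.

```python
from collections import defaultdict

# Hypothetical rows, e.g. parsed from an analytics CSV export
sessions = [
    {"source": "organic", "device": "mobile",  "converted": 0},
    {"source": "organic", "device": "desktop", "converted": 1},
    {"source": "paid",    "device": "mobile",  "converted": 0},
    {"source": "paid",    "device": "mobile",  "converted": 1},
]

visits = defaultdict(int)
conversions = defaultdict(int)
for s in sessions:
    segment = (s["source"], s["device"])
    visits[segment] += 1
    conversions[segment] += s["converted"]

for segment in sorted(visits):
    rate = conversions[segment] / visits[segment]
    print(f"{segment}: {rate:.1%} conversion over {visits[segment]} sessions")
```

A segment that clearly underperforms the rest of the site is a far better starting point for a hypothesis than “the whole site converts poorly.”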

For more on finding those analytics, check out:

Understanding Your Market


You will be surprised how many companies don’t understand who their clients are. Sure, you may understand the keywords that got them to you, but do you know:

  • Why they chose you over the competition?
  • What it is about your value proposition that made you unique in their mind?
  • Why your most loyal customers keep coming back?
  • Why your new customers leave?

Furthermore, have you mined social media to discover if the problem you’re trying to solve actually exists? Are you looking at any & all publicly available information to see how potential buyers in your market describe the problem? Jen Havice covers that quite extensively in this article.

This is where digging deep to find that product/market fit is great, but it’s also a way to formulate test hypotheses that aren’t coming out of nowhere.

Takeaway: Many of the most successful agencies using our platform spend an extensive amount of time researching consumer behavior to understand what is most important to the customer & why certain behaviors take place.

This way, when they conduct a test, there’s already a pretty clear understanding of what will work & why, rather than blindly shooting off a test & hoping there is some kind of positive outcome. For more on understanding your market check out:

Call Customers & Ask “Why?” 5 Times

[Image: a “5 Whys” lean manufacturing example]

I’m a fan of Eric Ries’ Lean Startup movement and Running Lean (the book by Ash Maurya) as tools for understanding clients. An important part of the lean methodology is to ask “why” five times to understand the problem and see if you have a fitting solution. This was adapted from a methodology developed by Taiichi Ohno of Toyota.

In this article on FastCo, Eric shows an example of how this works:

“When confronted with a problem, have you ever stopped and asked why five times? It is difficult to do even though it sounds easy. For example, suppose a machine stopped functioning:

  1. Why did the machine stop? (There was an overload and the fuse blew.)
  2. Why was there an overload? (The bearing was not sufficiently lubricated.)
  3. Why was it not lubricated sufficiently? (The lubrication pump was not pumping sufficiently.)
  4. Why was it not pumping sufficiently? (The shaft of the pump was worn and rattling.)
  5. Why was the shaft worn out? (There was no strainer attached and metal scrap got in.)”

Applying this “5 Whys” framework to a conversion project, I suggest you pick up the phone, meet clients in person, and schedule Skype calls to get actual face time with your clients and learn directly why they picked your service over the competition.

Something else you can do is mine your customer service calls (if you have them) to explore the “why” behind cancellations and purchases.

Takeaway: Similar to the previous section, many of the agencies that use our tool go out of their way to get face time with real clients & dig to the root of the problem.

But what’s interesting about face-to-face meetings is that you may also notice similar posturing, or body language, that is specific to your customers.

When taken into account, these extra little cues can be built into the overall tonality of a site. This kind of emotional design can cover everything from copy, images, layout and more.

For more on understanding customer behavior:

Using Surveys To Better Formulate A Hypothesis

[Image: customer survey email, “Help us out for a chance to win a $500 gift certificate”]

Because customer interviews are difficult to scale, it’s also wise to send out customer surveys that use open-ended questions.

Ott wrote an excellent article on collecting qualitative feedback, and one of the things I enjoyed most is how he recommends codifying the answers to get an “at a glance” view of the feedback.
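
The coding scheme itself is up to you (Ott’s article covers that part); the sketch below only shows the mechanical step, tagging each open-ended answer with hypothetical codes and counting them for that at-a-glance view.

```python
from collections import Counter

# Hypothetical codes and the keywords that map answers onto them
CODES = {
    "price":     ["expensive", "price", "cost"],
    "trust":     ["too good to be true", "scam", "reviews"],
    "usability": ["confusing", "couldn't find", "checkout"],
}

responses = [
    "Honestly it seemed too good to be true at first",
    "Checkout was confusing on my phone",
    "A bit expensive compared to the others",
]

counts = Counter()
for text in responses:
    lowered = text.lower()
    for code, keywords in CODES.items():
        if any(keyword in lowered for keyword in keywords):
            counts[code] += 1

print(counts.most_common())  # e.g. [('trust', 1), ('usability', 1), ('price', 1)]
```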

Beyond that though, a well-designed customer survey will provide insights into questions you may not have known the answers to before. For example, Conversion Rate Experts client TopCashback.co.uk found that many potential customers thought the offer was “too good to be true” – which, of course, ended up being addressed in a test variation.

Takeaway: Reading potential customers’ hesitations about signing up helps you formulate tests that directly respond to their concerns. For more on customer surveys, check out:

Creating The Wireframes


After collecting and analyzing as much research as possible, many of the agencies I’ve spoken with about their process will create wireframe mockups to quickly give the client a sense of what the challenger pages are going to look like.

In some cases, those mockups will also be shown to customers, who’ll be taken through some basic usability testing. By walking through mockups with actual customers, the agency can find potential problems with their challenger design & iterate before ever getting into a far more costly design and development process.

Takeaway: Instead of wasting time & resources implementing a design solution that may not work, smart agencies test their hypotheses early. Wireframing is a faster & more flexible way to do this, and it gives everyone involved in the process something tangible to evaluate.

For more on wireframing:

A/B Testing


It’s only after collecting and analyzing as much research as possible, and doing some basic hypothesis testing with wireframes, that agencies get into the actual A/B testing process.

There is a lot to be said about running a proper A/B test, but that’s an article all on its own.
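
One basic worth sorting out before launch is how many visitors the test actually needs. Here is a rough sketch of the standard two-proportion sample-size formula, with a hypothetical 2% baseline conversion rate and the smallest lift you would care to detect; dedicated calculators and your testing tool’s own statistics will be more precise.

```python
import math
from statistics import NormalDist

def visitors_per_variant(rate_a, rate_b, alpha=0.05, power=0.80):
    """Approximate visitors needed per variant to detect a change in
    conversion rate from rate_a to rate_b with a two-sided test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_power = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    pooled = (rate_a + rate_b) / 2
    numerator = (z_alpha * math.sqrt(2 * pooled * (1 - pooled))
                 + z_power * math.sqrt(rate_a * (1 - rate_a) + rate_b * (1 - rate_b))) ** 2
    return math.ceil(numerator / (rate_b - rate_a) ** 2)

# Hypothetical: 2% baseline, and we only care about lifts that reach at least 2.5%
print(visitors_per_variant(0.02, 0.025))  # roughly 13,800 visitors per variant
```

Knowing up front that a test like this needs well over 25,000 visitors in total keeps you from calling a winner after a few days of thin traffic.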

For now, check out:

Conclusion

We have found many more reasons why agencies get better conversion rates, but in this post I wanted to focus on the preparation that happens before the experiment even starts.

There is no secret formula in conversion rate optimization, and there is no magic big data tool you can just plug in. It’s hard work: lots and lots of research up front, and then A/B testing to validate it all.
