No single color is better than another. Ultimately, what matters is how much a button's color contrasts with the area around it.
There’s nothing that always works and pretty much nothing that never works either. Websites are highly contextual.
That being said, there are tests that tend to have a very high win rate. These are the test ideas that, while they don’t work 100% of the time, work more often than not.
Naturally, everything depends on the specific implementation — a good idea implemented poorly will not yield any results.
The following 20 testing ideas come from our own client-based research done over the years.
If you've ever run a highly trustworthy and positive A/B test, chances are you'll remember it and be inclined to try it again in the future – rightfully so. Testing is hard work, with many experiments failing or ending up insignificant. It makes sense to exploit any existing knowledge for more successes and fewer failures. In our own practice, we started doing just that.
You run an A/B test, and it's a winner. Or maybe it's flat (no difference in performance between variations). Does that mean the treatments you tested didn't resonate with anyone? Probably not.
If you target all visitors with the A/B test, it merely reports overall results – and ignores what happens within individual segments of your traffic.
A/B testing tools like Optimizely or VWO make testing easy, and that's about it. They're tools for running tests, not for post-test analysis. Most testing tools have gotten better at this over the years, but they still can't match what you can do with Google Analytics – which is nearly everything.
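To make the segmentation point concrete, here is a minimal sketch of segment-level post-test analysis. The segment names and numbers are hypothetical (the kind of breakdown you might export from an analytics tool), and the two-proportion z-test is just one common way to check whether a lift within a segment is statistically meaningful:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: z-score for variation B's conversion
    rate versus variation A's, using a pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical per-segment results:
# (segment, A conversions, A visitors, B conversions, B visitors)
segments = [
    ("desktop",   420, 10000, 440, 10000),
    ("mobile",    180,  8000, 260,  8000),
    ("returning", 300,  5000, 290,  5000),
]

for name, ca, na, cb, nb in segments:
    z = two_proportion_z(ca, na, cb, nb)
    lift = (cb / nb) / (ca / na) - 1
    # |z| > 1.96 corresponds to p < 0.05 (two-tailed)
    print(f"{name:>10}: lift {lift:+.1%}, z = {z:+.2f}")
```

In data like this, the overall result can look flat while one segment (here, mobile) shows a strong, significant lift – exactly the kind of signal an aggregate-only report hides.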
A value proposition is the #1 factor that determines whether people will bother reading more about your product or hit the back button. It's also the main thing you need to test – if you get it right, it will be a huge boost.
Chances are, you’ve heard of Google Optimize by now. It’s Google’s solution for A/B testing and personalization. It launched in beta in 2016 and left optimizers around the world waiting in line to try it out.
Now that it’s out of beta, you can give it a try without the wait.
But what can you expect? How do you configure it properly? How do you run your first experiment?
Nothing works all the time on all sites. That's why we test in the first place: to let the data tell us what is actually working.
That said, we have done quite a bit of user experience research on ecommerce sites and have seen some trends in what generates positive experiences from a customer perspective.
This post will outline 16 A/B test ideas based on that data.