Every industry is plagued by myths, misunderstandings, and half-truths – and conversion optimization is no different.
This is especially true in any marketing-related field – partly because there are no universal practices, and partly because content marketers are rewarded for producing highly shareable and linkable content (which isn’t always 100% accurate).
In any case, some myths are more poisonous than others. They can create misunderstanding among practitioners and confusion for beginners. With the help of some of the top experts in conversion optimization, here is a list of the 7 most poisonous conversion optimization myths:
1. “CRO is a List of Tactics and Best Practices”
This may be the most pervasive myth in conversion optimization, and it makes sense why: it’s so easy (and effective) for a blogger to write a post of 101 conversion optimization tips or 150 A/B tests to run right now. Of course, these articles are bullshit. They make it seem like conversion optimization is a checklist: run down the list, try everything, enjoy insane uplifts. Totally wrong.
Let’s say you have a list of 100 ‘proven’ tactics. Let’s venture to say they’re ‘proven’ by science or psychology. Where would you start? Would you implement them all at once? Then your website would look like a Christmas tree.
Some changes will work; some will hurt your site. Together, they’ll likely cancel each other out – or make things worse overall. Without a rigorous, repeatable process, you’ll have no idea what had an impact, so you’ll miss out on the most important part of optimization: learning.
No Room For Spaghetti Testing
Even if you tried to test the 100 tactics one by one, you’d find you’re wasting time. The average A/B test takes ~4 weeks to run, so it’d take you roughly 7.7 years to run them all sequentially.
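For the skeptical, here’s a minimal back-of-the-envelope sketch of that math (the ~4-week average is the assumption above, not a universal constant):

```python
# Back-of-the-envelope: how long would 100 sequential A/B tests take?
tests = 100
weeks_per_test = 4                   # ~4-week average assumed above
years = tests * weeks_per_test / 52
print(f"{years:.1f} years")          # -> 7.7 years
```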
The key to optimization is having a good process to follow. With a good process, you can prioritize work and know which ‘tactics’ to try and which to ignore. You’ll have a good indication of where the problems are because you did the research – the heavy lifting.
Conversion optimization – when done right – is a systematic, repeatable, teachable process.
What About Best Practices?
You’ll inevitably come across these tactics, tips, and best-practices blog posts because they’re highly shareable and package easy-to-digest opportunities to lift conversion rates (and make more money).
They’re based on “best practices,” which are where you should start – not where you should end up. And of course, best practices can fail, too.
The good news is that for all of the articles you read on best practices, there are some biting back – debunking a few common ‘best practices’ or questioning tropes like social proof and the 3-click rule.
Just remember: results are always contextual. The goal is to figure out what will and won’t work for your specific customers on your specific website. Leave the 101-best-practices articles to the amateurs.
2. “Split Testing = Conversion Rate Optimization”
Most people equate conversion rate optimization with an A/B split test. Winners, losers, and a whole lot of case studies around button color tests.
Optimization is really about validated learning. You’re essentially balancing an exploration/exploitation problem as you seek the optimal path to profit growth. As Tim Ash – CEO of SiteTuners, author of the bestselling book Landing Page Optimization, and chair of the international Conversion Conference event series – puts it, split testing is just a tiny part of the optimization process. Here’s what he had to say on the topic:
Also, if you don’t have traffic for testing, you can still optimize. How? You can use things like:
- Heuristic analysis.
- User testing.
- Mouse tracking.
- User session replays.
- Customer and prospect interviews.
- Site walkthroughs.
- Site speed optimization.
3. “If thy testing worketh not in the first months, thy site is not worthy to be optimized.”
Brian Massey, founder of Conversion Sciences and author of Your Customer Creation Equation: Unexpected Website Formulas of The Conversion Scientist, says that one of the biggest myths in optimization deals with expectations of results. Often, companies throw in the towel if results don’t appear immediately. Here’s what Brian had to say:
Another related myth is that conversion optimization is widely understood and appreciated by businesses. Even though the discipline has been around for years, that’s simply not the case. This lack of understanding leads to a lot of the problems Brian mentioned above. Here’s how Paul Rouke from PRWD put it:
4. “Test to validate opinions and hypotheses”
As humans, we’re all irrational, so a large part of optimization is trying to mitigate natural cognitive biases in order to reach more objective business decisions. Andrew Anderson has written at length about overcoming biases and has some great thoughts on this issue:
In other words, you don’t always have to be right. Sometimes – much of the time, even – what you thought would work didn’t, and vice versa. As Andrew explains, being wrong is where the learning occurs, so don’t be afraid of it:
5. “When In Doubt, Copy The Competition”
The internet is brimming with conversion optimization case studies, so it’s tempting to fall into the trap of stealing others’ test ideas and creative efforts.
First off, be skeptical of case studies. Most don’t supply full numbers, so there’s no way of analyzing the statistical rigor of the test. My guess is these case studies are littered with low sample sizes and false positives. That’s just one of the reasons most CRO case studies are BS.
The other reason: even if a test was based on good statistics, you’re ignoring context. Your competition has different traffic sources, different branding, and different customers. What works for them might tank your conversion rates. Here’s what Andrew Anderson wrote about the contextuality of A/B test results:
“In order to be the best alternative, you need context of the site, the resources, the upkeep and the measure of effectiveness against each other. Even if something is better, without insight into what other alternatives would do, it is simply replicating the worst biases that plague the human mind.”
Stephen Pavlovich from Conversion.com explains really well why you shouldn’t just copy another company’s creative:
Finally, if you’re spending your time copying competitors (and case studies from WhichTestWon), you’re not spending it on validated learning, exploration, or customer understanding.
On the same note, stop worrying about “what’s a good conversion rate?” What would you do with that information if you had it? Whether you’re under or over, you’d still (hopefully) be working on your specific site. Worrying about industry averages and other people’s results is simply a waste of time.
Peep summed it up well in a recent post:
6. “Understand statistics? Nah, I’ll just wait until my tool says the test is done.”
As for why you need to know statistics to run A/B tests, Matt Gershoff put it best when he wrote, “how can you make cheese when you don’t know where the milk comes from?!”
Knowing some basic statistics allows you to avoid type I and type II errors (false positives and false negatives), and it allows you to avoid imaginary lifts. It would take a few articles to write everything you need to know about stats, but here are some heuristics to follow while running tests:
- Test for full weeks.
- Test for two business cycles.
- Make sure your sample size is large enough (use a calculator before you start the test – see the sketch after this list).
- Keep in mind confounding variables and external factors (holidays, etc.).
- Set a fixed horizon and sample size for your test before you run it.
- You can’t ‘spot a trend’ early on – regression to the mean will occur. Wait until the test ends to call it.
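To make the “use a calculator” point concrete, here’s a minimal sketch of the arithmetic a sample-size calculator runs – the standard two-proportion z-test approximation, using only Python’s standard library. The baseline rate, expected lift, significance level, and power below are illustrative assumptions, not numbers from this article:

```python
# Rough pre-test sample-size estimate for a two-proportion A/B test.
from statistics import NormalDist

def sample_size_per_variant(p1: float, p2: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate visitors needed per variant (two-sided z-test)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for significance
    z_beta = z.inv_cdf(power)           # critical value for statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return round((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

baseline = 0.03                 # assumed: 3% conversion rate today
expected = baseline * 1.10      # assumed: hoping to detect a 10% relative lift
print(sample_size_per_variant(baseline, expected))  # ~53,000 per variant
```

Even a modest relative lift on a low baseline demands tens of thousands of visitors per variant – which is exactly why calling tests early is so tempting, and why it produces so many imaginary lifts.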
7. “CRO is the same as SEO, social media, email, etc. If you have budget, you can buy some; if you have less budget, you just cut down a little”
It may seem easier to cut conversion optimization costs because, essentially, it looks like you’re freezing your ROI instead of cutting traffic, acquisition, or revenue. However, optimization is about more than a 10% lift somewhere in the funnel. It’s about decision optimization – learning where and how to change or remove elements to improve your UX and boost revenue per visitor (RPV). Here’s what Ton Wesseling from Testing.Agency has to say on the topic:
For the integrity of an industry, it’s important to call out destructive myths and mistruths. That way, those beginning to learn will have a clearer (and less confusing) path, and businesses new to optimization won’t get discouraged by disappointing (or non-existent) results.
Here’s the thing, though: there are many more myths that we probably missed. What are some destructive ones that you have come across in your work?