You like to think that you’re a completely rational person making completely rational decisions, right? It’s nice to think that you haven’t made any major life decisions based on how you were feeling. Well, you have. Many times.
Do you remember when Slack launched last year? At the time, I was a diehard HipChat fan. Needless to say, I wasn’t interested in trying Slack. I considered it nothing more than a passing trend. Now? I use it for an average of 10 hours a day for personal and professional reasons. (Sorry, HipChat.)
You may be wondering, “Why should I make my own visualization of my A/B test results?”
After all, the A/B testing tools on the market already provide all the necessary tables and graphs, right? They tell you when an A/B test is significant and what the expected uplift is. So why bother?
Reality check: the level of statistical literacy in the CRO world is pretty poor. A major portion of your test results are probably invalid.
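You don’t have to take your tool’s “significant” badge on faith. The standard way to check a conversion-rate comparison is a two-proportion z-test, which you can do yourself in a few lines. Here’s a minimal sketch in Python (the function name and the visitor/conversion numbers in the usage example are made up for illustration):

```python
import math

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B test, using the pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool both variations to estimate the standard error under the null
    # hypothesis (no real difference between A and B).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    uplift = (p_b - p_a) / p_a
    return z, p_value, uplift

# Example: 200/10,000 conversions on A vs. 240/10,000 on B.
z, p_value, uplift = ab_test_significance(200, 10_000, 240, 10_000)
```

Run the example and you’ll see a 20% observed uplift with a p-value just above 0.05 – exactly the kind of result a tool might present as a winner that a manual check tells you to treat with caution.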
While testing tools are getting more sophisticated, blogs are brimming with ‘inspiring’ case studies, and experimentation is becoming more and more common for marketers – statistics know-how is still severely lacking.
Stop being one of “those guys”, and get your act together. It’s actually not that complicated. If you don’t know basic statistics, you won’t be able to tell whether your tests suck.
We all know that you have to work your butt off to get anywhere. Same with conversion optimization. In order to get results you can be proud of, you need to put in the effort and hours. But how are you spending that effort?
Here’s an uncomfortable truth about conversion rate optimization: lots of people are running bad tests without even knowing it. They’re making decisions based on false positives, they’re acting on inconsistent data, they’re avoiding the issue of sample pollution – I could go on.