If you’re doing it right, you probably have a long list of A/B testing ideas in your pipeline. Some good ones (data-backed or the result of careful analysis), some mediocre ideas, some that you don’t know how to evaluate.
We can’t test everything at once, and we all have a limited amount of traffic.
You should have a way to prioritize all these ideas so that you test the highest-potential ones first. And the stupid stuff should never get tested to begin with.
How do we do that?
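One common approach is to score each idea on a few dimensions and sort by the result. Here's a minimal sketch assuming an ICE-style model (Impact, Confidence, Ease, each rated 1–10) – the framework, the scales, and the example ideas are illustrative assumptions, not the article's prescribed method.

```python
def ice_score(impact, confidence, ease):
    """Average the three 1-10 ratings into a single priority score."""
    return (impact + confidence + ease) / 3

# Hypothetical pipeline of test ideas with illustrative ratings.
ideas = [
    {"name": "Rewrite headline",     "impact": 8, "confidence": 6, "ease": 9},
    {"name": "Redesign checkout",    "impact": 9, "confidence": 7, "ease": 3},
    {"name": "Change button color",  "impact": 2, "confidence": 2, "ease": 10},
]

# Highest-potential ideas rise to the top; the low scorers at the
# bottom never need to reach the test queue at all.
ranked = sorted(
    ideas,
    key=lambda i: ice_score(i["impact"], i["confidence"], i["ease"]),
    reverse=True,
)

for idea in ranked:
    score = ice_score(idea["impact"], idea["confidence"], idea["ease"])
    print(f"{idea['name']}: {score:.1f}")
```

Any scoring model like this is only as good as the honesty of the ratings, but even a rough one forces the "stupid stuff" to justify itself before it consumes traffic.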
Throughout his life, architect Bill Caudill wrote down notes on this and that, and organized those thoughts under the title This I Believe. When Tom Peters turned 60, he decided to scribble down 60 thoughts, one for each year, that seemed to capture his professional and personal journey.
I turn 36 in a couple of days (June 24). Much younger than the folks named above, but I decided to do the same. I wrote down 25 TIBs – things I believe to be true. Keep reading »
We’ve been in this industry for many years – doing optimization work, teaching, consulting. The core problems we’ve observed among optimization people still seem to boil down to these:
- Not knowing what to do in general (resulting in using random hacks and tactics)
- Not knowing what to test (and testing silly things that make no difference instead)
- Not finding the time to improve skills (everyone’s so busy, and the half-life of digital marketing skills is only ~2.5 years)
We all know that you have to work your butt off to get anywhere. Same with conversion optimization. To get results you can be proud of, you need to put in the effort and the hours. But how are you spending that effort? Keep reading »
Lots of people on the internet are running tests – can’t I just copy their winning tests? Let other people do the failing; I’ll just test (or implement) the winning stuff. Good idea, right? Keep reading »