What Tools You Need When You Start a Testing Program
One of the great truths that people ignore when it comes to optimization is that you can fail with any tool. It’s only when you are trying to succeed that differences in tools really matter.
There are many, many, many lists of conversion optimization best practices, and some are treated as sacrosanct.
These practices often come from broad trends observed across many experiments, and they highlight what usually works. Often, they're tapping into a kernel of persuasion wisdom.
All testing programs, no matter how great or awful, think they are doing pretty well and can get better.
Another one bites the dust. Here are our most-read articles for 2014. Make sure you didn't miss any.
As more and more people look to testing and conversion optimization as a consistent, meaningful tool for their marketing and other initiatives, it's important to realize that optimization as a discipline is not just a bolt-on to existing work. Testing, when done correctly, can and should be by far the number one driver of revenue for your entire site and organization. And yet, according to three of the major tools on the market, the average testing program sees only 14% of its tests succeed.
As of next week, I, Tommy Walker, will be leaving my post as editor of CXL.
I’ve learned so much in this past year, and I owe so much to this blog. If you’ll allow, I’d like to share the 6 major lessons I’ve learned as the editor of this blog.
When most beginners start with conversion rate optimization, they get carried away by the rosy picture of A/B testing. Let’s test button colors! Or maybe call to action text! That should get us a win of at least 30-50%…I think.
Unless you’re a giant like Amazon, you need to go beyond the random let-us-change-button-color-today kind of tests to move the needle through A/B testing. It’s not about tactics. It’s about the reaction your design creates in the mind of your visitors.
Here's something scary: according to MECLabs & Magento's 2014 eCommerce Benchmark, only 13% of those studied base their testing on extensive historical data.
David Skok, a must-read for all startups, explains how, as your SaaS company grows, the size of your subscription base (your customers or users) increases, so any churn against that base becomes a large absolute number. This equates to a loss of revenue, which requires more and more bookings (think signups) from new customers just to replace what you are losing every month.
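Skok's point can be sketched with a few lines of arithmetic. The 3% monthly churn rate and the base sizes below are hypothetical illustrations, not figures from his work; the shape of the result is what matters: a constant churn *rate* against a growing base means ever-larger absolute losses, so new bookings must keep growing just to stand still.

```python
def customers_lost(base_size, churn_rate=0.03):
    """Customers churned in one month from a base of `base_size`,
    assuming a flat monthly churn rate (3% here, purely illustrative)."""
    return base_size * churn_rate

# As the subscription base grows 10x and 100x, the number of new
# customers needed each month just to replace churn grows with it.
for base in (1_000, 10_000, 100_000):
    lost = customers_lost(base)
    print(f"base={base:>7,} -> must replace {lost:,.0f} customers/month")
```

At 1,000 customers you need 30 replacement signups a month; at 100,000 customers the same churn rate demands 3,000, which is why churn quietly becomes the dominant growth constraint.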
Have you ever dreamed about learning in advance which products your customers are most likely to buy?
How great would it be if you could determine the highest price a customer would pay for a product? What if you could optimize customer service to resolve concerns proactively before they become issues?