Andrew Anderson

Andrew specializes in building optimization and data programs into world-class, efficient revenue producers. He has 14 years of experience in conversion optimization and has worked with over 300 different organizations.

All thoughts and opinions are his own and do not reflect on any organizations with which he is affiliated.

Read his personal blog for great optimization insight.

Your Organization Really Doesn't Want Optimization To Succeed

Here’s something not many people talk about: no one at your organization really wants optimization to succeed – at least not in the way that is most powerful and revenue-impacting.

Let that sink in.

Keep reading »

The Narrative Fallacy in Optimization (and How To Avoid It)

Show an A/B test case study to a group of 12 people and ask them why they thought the variation won. It’s possible you could get 12 different answers.

This is called storytelling, and it’s common in the optimization space.
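The numbers themselves are surprisingly quiet on the “why.” As a rough illustration (with hypothetical visitor and conversion counts, not data from any real test), a basic two-proportion z-test like the sketch below can tell you whether a variation’s lift is likely real, but nothing in the calculation explains why it won; that part is pure story.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical counts -- not from any real test.
visitors_a, conversions_a = 10_000, 520   # control
visitors_b, conversions_b = 10_000, 585   # variation

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b

# Pooled proportion and standard error for a two-proportion z-test.
p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se

# Two-sided p-value from the standard normal distribution.
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"Control: {p_a:.2%}, Variation: {p_b:.2%}")
print(f"z = {z:.2f}, p = {p_value:.4f}")
# The output tells you *whether* the lift is likely real --
# it says nothing about *why* the variation won.
```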

Keep reading »

5 Tactics to Changing How Your Organization Thinks About Optimization

When I look back at the work I have done with my current company, or with any of the more than 300 other websites I have worked with, by far the most important work is changing how groups think and operate. So much of optimization is about asking people to go past their comfort levels and their inherent biases and to act rationally. At least 90% of the time and effort I put in goes to dealing with these much harder and more deceptive factors in the success of a program.

Keep reading »

The Discipline Based Testing Methodology

This is the methodology I have developed over 12 years in the industry, working with over 300 organizations. It is also the methodology behind a near-perfect test streak (6 test failures in 5.5 years), even if most people do not believe that stat.

Keep reading »

Lies Your Optimization Guru Told You

Before you get out your pitchforks, I want to stress that this article does not represent Peep’s views.

The easiest lies to believe are the ones we want to be true, and nothing speaks to us more than validation of the work we are doing or of what we already believe. Because of this, we naturally become defensive when someone challenges that worldview.

The “truth” is that there is no single state of truth, and that all actions, disciplines, and behaviors can and should be evaluated for growth opportunities. It doesn’t matter whether we are designers, optimizers, product managers, marketers, executives, or engineers: we all come from our own disciplines and will naturally defend them to the death if we feel threatened, even in the face of overwhelming evidence.

Keep reading »

Lies Your Designer Told You (or Data vs Design)

Designers versus data more than ever deserves its place in the pantheon of great conflicts: the Hatfields vs. McCoys, Android vs. iOS, Social Media Marketing vs. Results, Athens vs. Sparta, the Doctor vs. Daleks, Auburn vs. Alabama, and Fox News vs. reality.

We make this out to be some great collision of disciplines when in fact they are not opposites and they can and should work together.

Keep reading »

What Tools You Need When You Start a Testing Program

One of the great truths that people ignore when it comes to optimization is that you can fail with any tool. It’s only when you are trying to succeed that differences in tools really matter.

Keep reading »

The Psychology of a Successful Testing Program

All testing programs, no matter how great or awful, think they are doing pretty well and can get better.

Keep reading »

The Greatest Factors Limiting Your Testing Program

As more and more people look to testing and conversion optimization as a consistent, meaningful tool for their marketing and other initiatives, it is important to realize that optimization as a discipline is not just an add-on to existing work. Testing, when done correctly, can and should be by far the number one driver of revenue for your entire site and organization, and yet according to 3 of the major tools on the market, the average testing program sees only 14% of its tests succeed.

Keep reading »