The Endless Suck of Best Practice and Optimisation Experts

Unicorns and Optimizers

So what’s this all about? Unicorns, Useful Best Practices and Optimisation Experts — all rather rare and mystical beasts.

And this quote encapsulates everything I’m going to talk about today:

“Stop copying your competitors — they may not know what the f*** they are doing either” — Peep Laja

So what’s the problem? The uselessness of ‘best practice’ as a tool for correctly predicting what the outcome of a test or change will be.

So let me explore this using a UX situation. Let’s say you’re a UX designer and you’re testing a process that involves persuasion, elements of friction (a form), lots of data entry and interaction points inside each page.

We all know the inspection methods you’re likely to use — right? Usability testing, interviews, diary studies, session replay, voice of customer; imagine these are all in the mix. You’ve observed lots of people struggling and found the interaction points in the process that really hit the exit rates for this process. High five everyone!

But wait a minute — we haven’t solved these problems yet. We’ve maybe quantified them (a rarity for most UX designers) using sources like analytics and we’ve got the qualitative feedback from the user testing and voice of customer.

Fixing these should be a piece of cake, eh?

And that’s where the problem comes in. My failing as a UX practitioner when I first started out was to fall into this trap and to think my guessing was expertise.

Are we all guessing then?

There are phenomenal numbers of people just guessing out there. Marketers, CEOs, business owners, small companies, big corporates, international brands, hip startups — it’s an equal opportunity game.

[Image: HiPPO (Highest Paid Person’s Opinion)]

So surely they hire CRO or UX experts (or other types of experts) to save them from guessing? Usually it’s for another reason but that’s a longer story about the psychology of organisations and people!

For those experts that actually have good experience, surely all that counts for something? Surely thousands of hours observing, fixing and testing products makes you better at finding solutions?

And yes — of course it does. But not in the way some people think.

The Pros and Cons of Experience

So let me illustrate three problems I found on a form recently:

1. When people enter their postcodes, they sometimes don’t enter a space (for example, “SE136DH” instead of “SE13 6DH”). The form rejects this.

[Image: the postcode field on the form]

2. People click the continue button on the form rather than the ‘search postcode’ button, and nothing happens. They think they’re doing a search.

[Image: the postcode form’s continue button]

3. People are missing how to fill out a different delivery address (from billing), so their payment could get rejected.

For this third part, there was no option to tell the website whether this was the billing address or not. Later in the form, people simply got asked for their billing address — but without being able to say ‘yeah, the same one I put in earlier, dumb website’.

In the first example on this form, where the postcode failed validation because it expected a space, this is just a really stupid moment. We simply fix and test the postcode validation before making it live. I looked at the retailers who were called out on this simple practice and they’ve all fixed the problem. However, I’m constantly amazed at how people implementing new interfaces on the web fail to leverage the good work done before; companies do this all the time.

We might measure the impact of fixing this kind of validation issue but mostly, we’ll just accept we’re doing something dumb and fix this as a bug or BAU (business as usual) change. The solution is completely straightforward and implementable with little or no discussion required.
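As a sketch of how straightforward that fix is, the validation could normalise the input before checking it, rather than rejecting a postcode just because the space is missing. This is a hypothetical illustration, not any retailer’s actual code; it relies on the fact that the inward code of a full UK postcode is always the final three characters.

```typescript
// Hypothetical sketch: normalise a UK postcode before validating it,
// so "SE136DH" is accepted and treated the same as "SE13 6DH".
function normalisePostcode(raw: string): string {
  // Strip all whitespace and upper-case the input.
  const compact = raw.toUpperCase().replace(/\s+/g, "");
  // A full UK postcode is 5-7 characters without the space, and the
  // inward code is always the final three characters.
  if (compact.length < 5 || compact.length > 7) {
    // Leave anything else for the real validator to reject.
    return raw.trim().toUpperCase();
  }
  return `${compact.slice(0, -3)} ${compact.slice(-3)}`;
}
```

Normalising first means any existing space-expecting validation keeps working, and users are no longer punished for a formatting detail.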

In the second case, my experience tells me that they’re not seeing the button to perform the search for a postcode, because there are two call to action buttons. You have one that says ‘Find postcode’ and a second one saying ‘Continue to next step’.

Two potential solutions — I could either emphasise the search postcode button or I could remove the continue button — only showing it once they picked their address. I have a high degree of confidence in one of these working, probably the latter, but I’m not 100% sure. I need to see evidence that this is working or I need to run a test. I’m quietly confident but I can’t be sure.
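The second option, showing only one call to action at a time, can be sketched as a tiny state model. The names here are my own assumptions for illustration, not the site’s actual code:

```typescript
// Hypothetical sketch: expose only one primary action at a time, so the
// continue button cannot be mistaken for the postcode search.
type PrimaryAction = "find-postcode" | "pick-address" | "continue";

function primaryAction(postcodeSearched: boolean, addressSelected: boolean): PrimaryAction {
  if (!postcodeSearched) return "find-postcode"; // only the search button is shown
  if (!addressSelected) return "pick-address";   // then the address list
  return "continue";                             // continue appears only now
}
```

The UI would render only the control matching the returned action, which removes the competing call to action entirely. It’s still a hypothesis that needs testing, not a guaranteed win.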

With the billing address problem, this involves working across multiple pages to solve. Should we simply ask if the delivery and billing address are the same? Where should we ask them? How does that impact further screens people see?

This is a complex multi-step problem and not only do I not have a solution — I know that whatever design hypothesis I come up with WILL need validation.

Experience Does Count

This example illustrates where experience actually counts. It helps me to:

  1. Spot soluble and straightforward defects and know what is and ISN’T in this category.
  2. Find stuff that I’m very confident I have a pattern, solution or approach to take.
  3. Understand which things I need to iterate, test, improve and keep optimising.
  4. Know what I don’t have a bloody clue how to solve.

It does NOT let me know what solution will work — only the tools, the method, the journey I need to go on to get closer to whatever that solution may be. It’s not about the AB testing — it’s the journey that counts.

Heuristics, iteration, and expert mechanics

If I were an expert car mechanic, my skills would allow me to discover and query the symptoms or the car’s setup to isolate, understand and diagnose the problem. I may see something like a flat battery and know immediately that this is probably the entire issue. Or I may have something really complicated, where I need to inspect the engine, trace the electrical signals or plug in a diagnostic system.

[Image: mechanics and optimisers]

And I’ll experience a complete range of problems, from the bonehead obvious all the way to the head-scratching end of the scale. That experience gained working on cars helps you work out the approach to iterating your way to the solution, finding the fastest path and picking the right tools at the right point in that process.

And hey, replacing the battery is a nice pattern to spot. But sadly, it was the alternator which was dead. The battery was flat from the failure of another component. You won’t find any good mechanics saying they can fix any problem or improve any engine from just looking — they’ll more likely be able to tell you how your problem and their skills line up.

You won’t find any mechanic saying “Yup — found it — it’s the battery. We’re done here.” when they haven’t checked and validated that solution.

Opinions, meet Humble Pie

“The skills an expert builds up are the means by which they correctly diagnose, fix and validate solutions or change — not the means by which they can predict the freaking future.”

And so we can take UX experts and CRO experts who are good at spotting problems and say one thing confidently: for a wide range of changes, they don’t always know what the hell will happen once those changes are implemented.

The range of confidence varies depending on your experience and the approach but a large amount of the ‘solutions’ that any CRO or UX person presents will be an ‘informed guess’ or ‘hypothesis’. It’s not a guarantee, a solution, a dead-cert or an easy win — it’s a potential solution waiting to be tested and validated.

So there’s a kind of arrogance here that I developed — that because I knew the problem domains intimately, I had to know the solutions too (and I’m biased, lol). A bad sign in any UX or CRO expert is unshakeable confidence in a solution, pattern or wireframe — especially when that confidence may be unfounded or based on limited evidence or data. I was that idiot a few years ago, but plenty of testing, humble pie and seeing my predictions shattered have cured me of this disease.

[Image: humble pie]

Knowing the extent and boundary of your knowledge about a problem (or system), and therefore the confidence you can have about potential changes, is one of the best things I ever learned from testing, observing and measuring people using my designs. The more inspection methods I used (diagnostic tools for my work), the better and faster my ability to achieve behavioural shifts and move clients forward — in the same way that good tools help a mechanic solve problems with your engine.

So what the hell does this have to do with Best Practice?

Well, UX researchers and CRO people will often cite these as examples of what to test or what to do. You’ll be presented with an example or a screenshot of a site and in some cases, an exhortation to try or test them.

And they can be useful, if taken with a pinch of salt. They’re example patterns of something that might have been used elsewhere, but to be honest, the UX expert doesn’t KNOW if that nice checkout payment pattern that Stripe implements actually works or not. Stripe might be running a test and it might totally suck.

The CRO researcher doesn’t know either. They might be suggesting you do a test like some other company, but they’re up against another barrier. There are many factors that differ between their situation and yours (context, customers, background, data, traffic):

1. Context

When you look at AB test results, they often don’t tell you about the customers, the flow, the intent, the traffic source, the target audience, the company brand or credibility, the barriers, frustrations and worries — you’re looking at one page in a stream and so see the smaller, not bigger picture. You have a tiny window onto the site and data.

2. Customers

I laugh when people say ‘Oh, this worked on X site and this would be great to try on yours’ — firstly, when it’s not a ‘confident pattern’ in my book and secondly, when the site it was tested on is NOTHING LIKE your business.

An ‘Adult Shop’ may hand out lubricated prophylactics to customers as they enter the store. This might work wonderfully for their business and increase sales and customer happiness but would it work in your store? If you read about this test in a newspaper article, would you decide to start handing these out in your shoe store without looking for evidence it might work?

So if you copy things (your competitor, an AB test pattern), it’s very easy to do so without any understanding of what you’re copying, which bits worked, or their relevance (in any meaningful way) to YOUR customer base. You don’t know why it worked (the tester might, but may not tell you), so without that context it’s of limited value.

3. Background

You have no idea if people saw an awful landing page, were paid leads at great expense or organic traffic. You don’t know if the page was responsive, how many mobile/tablet/desktop visitors saw it, what browsers they used and anything else that might potentially make copying this test almost useless. Even if you’re running an identical business, would your traffic (and response to any test) be likely to be the same? Unlikely unless it’s just a dumbass fix you know how to make anyway.

4. Data

Most AB test results pages are kinda like stats porn. You may like what you see, but trying this at home might not work out for you. They also bias our heads — making us think that if we could only ‘do stuff like this’ then our form would convert better.

Data on the test result helps, but you often have no traffic composition, cost data, sources or idea of upstream traffic — most published results show almost nothing of the journey. Some AB tests don’t even show sample sizes, confidence levels or error bars — the sample sizes and error bars being criminal omissions.
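To show why those omissions matter, here is a minimal sketch of the kind of error bar a published result should carry: a normal-approximation (Wald) confidence interval around a conversion rate. It is illustrative only, and assumes a sample large enough for the approximation to hold:

```typescript
// Hypothetical sketch: a 95% normal-approximation confidence interval
// for a conversion rate, i.e. the "error bar" many AB test write-ups omit.
function conversionInterval(
  conversions: number,
  visitors: number,
  z: number = 1.96 // 95% two-sided critical value
): [number, number] {
  const p = conversions / visitors;                 // observed conversion rate
  const se = Math.sqrt((p * (1 - p)) / visitors);   // standard error of a proportion
  return [p - z * se, p + z * se];
}
```

With 100 conversions from 1,000 visitors the rate is 10%, give or take almost two percentage points; without the sample size you cannot compute this at all, which is exactly why omitting it is criminal.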

5. Traffic

The traffic on my customers’ sites varies all the time. Unless I can be sure that the sample I’m testing is somehow similar to and representative of the sample THEY used in their AB test, how could I expect the response to be the same? Sure — if it’s something bonehead that the new creative solved, something that works at the fundamental level of core usability, clarity or persuasion. But maybe not.

So even if you knew everything, had seen every test and had access to all this data that’s missing from AB test examples, you STILL couldn’t guarantee a similar response. Knowledge of this is both ego crushing (a good thing) and also your salvation.

The Downfalls of Overconfidence in Optimisation

Do I cook like Gordon Ramsay at home? In my head, of course but not in real life. I might not even follow his method or his recipe — he’s informed me about the right way to cook a dish and serve it — but my target audience is very different. He’s provided me with a suggestion or a recipe I might try — but that response is the precious thing. No matter how nice it looks on telly, my wife and daughter may not like it. Probably my cooking but also their taste.

Best Practice and Experts who tell you confidently that they know what to do are just a confusing smokescreen for the real truth. It’s about the site, the product, the service, the customers, their context, their fears, worries, barriers, motivations, emotions and responses — with you, and not someone else’s product. Start loving and understanding them, rather than chasing the illusory value of slavish or dogmatic copying, whatever the seeming promise of the source.

A disclaimer — I’m not denigrating the great UX and CRO people that I know. Just the bullshitters out there who fall victim to the conceit, the arrogance, of thinking that they actually know the solution. The best experts are those who shrug and say “I don’t know, but we sure can find out. Here’s how.” They have confidence not in beliefs about things they don’t know, but in their ability to get that information and really know.

Great UX and CRO people will be using constant and iterative feedback and testing to shape and improve products towards user task & goal outcomes and some business goals too. They won’t profess to know the answers but can explain the likely cause of problems and a range of potential solutions or tools to move away from where you are and towards where you need to be.

Hire The Humble One

“There is no such thing as best practice for me. There are only users, the boundary layer between their minds and my product — and the tools that I can use to understand what’s happening there.”

My experience in observing and fixing things does make me a better diagnostician, but the patterns I’ve learned don’t function as truths — they guide and inform my work, but they don’t provide guarantees.

So next time you hire a UX or CRO practitioner, go for the one who shows humility — who may not know from looking at your company’s engine with one glance what’s wrong — but who will roll up their sleeves and find out, using every tool at their disposal to find truth, solutions and the desired outcome.

And lastly, please stop copying slavishly. If you copy without knowing why it worked, you may not get the same result at all and it may not teach you anything useful about your customers. It’s a race to the bottom in business terms too, because by the time you’ve seen any test or pattern — the company who did this will have moved on.

So do you want to go back 18 months in time, to copy something someone tried that long ago and is only being presented to you today? Most of your competitors would love it if you were wasting time trying their old stuff, or something random you saw — better for their business, all day long.


Comments

  1. Well said. Optimization is by far the most difficult aspect of our business. Summary – trust nothing and no one, test everything! :-)

  2. Very true. People sometimes test colors, or other easy stuff, instead of the stuff that really matters … did you convince your visitor they want what you have? That takes time and skill to test.

    1. Agreed – it’s a stage quite a few companies go through. There is:

      Not testing
      Starting testing
      Testing lots of stupid things
      Making mistakes
      Rethinking testing
      Doing it better
      Doing it properly

      The timeless aspect you mention is knowing the visitor. If you know their heads inside out, your product and web experience inside out – then you have all you need to make it work better.

      Thanks for reading!

  3. Wow Craig! You’re the man!!

    I love the perspective! And you helped me justify why I always don’t know what to say when people want me to throw ideas at them… haha

    1. Thanks Daniel,

      Always be learning – as I’ve seen you put into practice yourself!

      It’s OK not to know – I guess the change from being a know-all to knowing how to get answers to stuff you don’t know is what comes from experience. It’s not about what you know – it’s about how you inspect, discover, find out about and examine a problem domain, in order to find things that you can test about those problems.

      Thanks for reading!

  4. High five Craig! There is too much slavish thinking and not enough “creative rigour” in all forms of digital optimisation.

    Well done for calling it out.


  5. Good points, listening to the users is the only way.

  6. I agree with the premise that you can’t copy from your competitors and must always test things out yourself to see if they work. But there is a but. What makes a CRO great is his/her ability to take what a competitor or “guru” is doing successfully and make it work for their business.

    1. Eyal,

      That’s true only if you’re looking at a competitor, figuring out why their practice *works* and then applying it within the context of your users and service. If you don’t have their data, their test results, the identical customer segmentation – how can you copy anything? It would be a guess!

      Looking at competitors is fine – deciding you *know* what to test using a competitor’s site is stupid, because you’re missing all the information that constitutes knowledge – this thing on their site? Is it working? Does it make things better? Or worse? What customer groups do they have? What marketing? What promotions?

      Unless you worked for your competitor, you’ll only ever see the surface layer of the actions of people within the company changing the site. I can walk down the high street and see a shop selling stuff cheaper than me, or trying a new line. I can be informed about these things but they don’t guarantee they’ll work in MY store.

      So whilst I agree with your point, I often see people spending too much time looking at competitor sites and not enough time on their own, with their customers.

    2. With regards to competitors. There are increasingly useful tools that allow you to understand what your competitors are doing, how they’re doing it, and why. You don’t have to be a former employee to make certain assumptions about your competitors’ tactics.

      Data is only a part of the decision making process. It’s there to support or disprove hypotheses or assumptions that are derived from your own data but also from professional experience, competition analysis or even, dare I say hunches.

    3. Of course – and I agree. I spend a lot of time looking at competitors, using SEO tools (and others) and generally finding useful information.

      What I’m talking about is the common practice of ‘seeing something’ which you have no information about, or very little, apart from the visual layer. If you don’t know the analytics data, the abandonment, the customer types or personas, the device experience – what can you infer from simply ‘looking’ at a page or process on your competitor’s site?

      I agree entirely with the premise that you can be informed by this examination of competitor activity and site experience – you just cannot use that observation to conclude you can repeat it on your site.

      I had this conversation with a large retailer and he was convinced he could simply take test results and get the same impact on his site. I suggested he take a look at some Adult Shops (ahem) here in the UK – who give out free samples at the door.

      I suggested he ask them for some free samples and try handing them out in his shop. He looked aghast “That would never work in a million years – they’re not the same customers”.


