#6: Using Qualitative On-Site Surveys

Last lesson was about learning from your customers. But most people on your site will not buy anything. How can we get more people to buy? One thing that helps us figure that out is website surveys.

There are 2 ways to survey your web traffic:

  1. Exit surveys: hit them with a popup when they’re about to leave your site.
  2. On-page surveys: ask them to fill out a survey while they’re on a specific page.

Both are useful.

There are many tools for this. I usually use Hotjar or Qualaroo, but there’s also WebEngage, Usabilla, and many others. The tool itself doesn’t matter as long as it lets you do the following:

  • Configure which page(s) the survey appears on
  • Set your own questions (no pre-written template bullshit)
  • Determine the criteria for when to show the survey

If these 3 criteria are met, you’re solid.
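Conceptually, that configuration boils down to three fields. Here’s a tool-agnostic sketch (all field names are hypothetical, not any vendor’s actual API):

```python
# Minimal tool-agnostic sketch of an on-site survey config.
# All field names are hypothetical -- real tools (Hotjar, Qualaroo, etc.)
# expose the same three ideas through their own UIs.
survey_config = {
    "pages": ["/product/*", "/checkout"],  # which page(s) show the survey
    "question": "What's keeping you from buying this right now?",  # your own wording
    "trigger": {                           # criteria for when to show it
        "min_seconds_on_page": 30,
        "min_pages_this_visit": 3,
    },
}

def should_target(url: str, patterns: list) -> bool:
    """Check whether a URL matches any configured page pattern."""
    return any(
        url.startswith(p[:-1]) if p.endswith("*") else url == p
        for p in patterns
    )
```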

Aren’t surveys annoying to people? Sure, they might annoy some – but the data you get out of them is well worth it. And you typically only run the surveys for a limited period of time.


What should you ask?

Remember the key: actionable data. We need information to act on.

Since our goal is to get more people to take action, start by learning about friction. What are the FUDs (fears, uncertainties, doubts) they are experiencing – while on a specific page?

Every page on your site has one job – and your survey question should be about that one page, one job.

For an ecommerce product page, it’s to get people to click ‘add to cart’. For a checkout page, it’s to get people to take out their wallet and enter their credit card info. And so on.

  • Step 1: Determine the most wanted action for the page.
  • Step 2: Come up with a question that asks about friction.

So, for instance, if it’s an ecommerce product page, the goal would be cart adds. The question could be something like “What’s holding you back from adding this product to the cart right now?” or “What’s keeping you from buying this right now?”.

You don’t always know which question is the best one to ask – there is no single best question. Some questions will get far better response rates, but you won’t know in advance which ones.

So try to come up with multiple different wordings of the question.

Another way to ask about friction could be “Do you have any questions that you can’t find answers to?” – give them a Y/N option, and if they choose Yes, have them type in their question.

This is actually my pro tip: ask every question in Y/N form first. It’s easy to just choose Yes or No. If you hit them with a complicated question right away, fewer people will take the time to write. But if you start with Y/N, and only pop the open-ended question once they choose ‘Yes’, they’re much more likely to respond.

I see 2% – 4% response rates all the time.

So instead of “What’s holding you back from …?” you would ask “Is there anything holding you back from …?” Y/N. If they answer Yes, ask them to elaborate.
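The two-step flow can be modeled as simple branching logic. A sketch (function and field names are mine, not any survey tool’s API) – the open-text box appears only when the visitor indicates there *is* something holding them back:

```python
def survey_flow(has_friction: bool, detail: str = "") -> dict:
    """Sketch of the two-step Y/N-first survey flow: the low-effort
    Y/N question comes first, and the open-text follow-up is shown
    only when the visitor indicates something IS holding them back."""
    response = {
        "question": "Is there anything holding you back from buying right now?",
        "answer": "Yes" if has_friction else "No",
    }
    if has_friction:
        response["detail"] = detail  # the follow-up: "What is it?"
    return response
```

A visitor who picks No is done in one click; a visitor who picks Yes gets the open-ended follow-up, which is where the friction data comes from.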

And remember – use a different question for each page (e.g. pricing page, category page, etc.) – that’s the only way to learn about the specific friction they’re experiencing on that very page.

When to pop the question?

In most cases, not right away. You want to qualify the visitor first – a chunk of your traffic has no intention to buy at all, and there’s nothing you can do to get them to buy. Lots of people want to drive a Tesla; few can afford the $100k price tag.

So you only want to survey people who have demonstrated a level of engagement. Maybe there’s a micro-conversion they need to complete first (e.g. joining the newsletter). Or look up your average time on site and average pages per visit – and trigger the question only once they’ve spent an above-average amount of time on the site and clicked through an above-average number of pages.
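That engagement trigger is simple to express in code. A sketch (the averages below are made-up placeholders – pull the real numbers from your analytics):

```python
# Placeholder averages -- look these up in your own analytics.
AVG_TIME_ON_SITE_SECONDS = 95
AVG_PAGES_PER_VISIT = 2.4

def should_show_survey(seconds_on_site: float, pages_viewed: int) -> bool:
    """Only survey visitors with above-average engagement on BOTH
    time on site and pages viewed this visit."""
    return (seconds_on_site > AVG_TIME_ON_SITE_SECONDS
            and pages_viewed > AVG_PAGES_PER_VISIT)
```

Whether you require both conditions or just one is itself something to experiment with.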

You’ll need to experiment with this – there’s no universal rule.

How many responses do you need?

One answer is better than none, but you don’t want to put too much weight on a single response. It might be an outlier, an edge case.

So I typically try to get at least 100 responses before even reading any. 200 is better. 500 responses add only slightly more value but take far more work – diminishing returns.

How long it takes to get 100 responses depends on your traffic – if you have a low-traffic site, you might have to make do with fewer responses.
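The diminishing returns show up in a back-of-the-envelope margin-of-error calculation for a proportion (worst case p = 0.5, 95% confidence):

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion observed
    in n survey responses (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Roughly: 100 responses -> ~±10%, 200 -> ~±7%, 500 -> ~±4%.
for n in (100, 200, 500):
    print(f"{n} responses: ±{margin_of_error(n) * 100:.1f}%")
```

Going from 100 to 200 responses shaves about three points off the error; going from 200 to 500 buys less precision per additional response.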

What to look for

As always, see if you spot any trends in the responses. See if your customer survey responses about friction are similar to web traffic survey results. You’re mining for insights! See if you can validate or invalidate some of your observations from the heuristic analysis.
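A crude but useful first pass at spotting trends is a keyword tally over the raw responses. A sketch (the stopword list is minimal and hypothetical – in practice you’d also tag responses by theme):

```python
from collections import Counter

# A minimal, hypothetical stopword list -- extend as needed.
STOPWORDS = {"the", "a", "is", "it", "to", "i", "and", "of", "too", "my"}

def top_themes(responses, k=3):
    """Count non-stopword words across all responses to surface
    recurring friction points."""
    words = (
        word.strip(".,!?\"'").lower()
        for response in responses
        for word in response.split()
    )
    counts = Counter(w for w in words if w and w not in STOPWORDS)
    return counts.most_common(k)

answers = [
    "Shipping is too expensive",
    "Not sure about the shipping time",
    "Shipping costs, mostly",
]
print(top_themes(answers))  # 'shipping' dominates -> a clear trend
```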

A bunch of people will tell you that pricing is an issue (“too expensive!”) – that’s to be expected. If that’s a dominant response, you’re either driving too much unqualified traffic to the site, or you’re not doing a good enough job communicating the value of your product(s).
