How To Use On-Site Surveys to Increase Conversions

Show a landing page to a panel of experts and ask them what’s wrong with it – everyone will have an answer. Oh yes, everyone will have an answer.

But how cohesive are these answers? How accurate? How actionable?

Turns out, even if the panel consists of experts, opinions still don't carry the weight of solid research.

And on-page surveys can be crucial to deriving insights for conversion optimization.

On-Site Surveys Defined

I wrote an article a while back about customer surveys, and while both types of surveys fall under the broad category of ‘qualitative research,’ on-page surveys are different in their goals and execution.

While customer surveys ask questions of people who bought something from your site (your current or past customers), on-page surveys ask questions of people while they’re on your site (which could be a variety of different segments).

In conversion research, the big goal is the same for both types: you’re trying to identify sources of friction. On-page surveys provide an interesting look at this because, instead of asking about past experiences, you’re getting visitors’ feedback as they’re experiencing your site.

Web & exit surveys are small pop-up boxes that appear to the visitor based on certain rules – like time spent on site, number of pages visited, or activity (e.g. moving the mouse cursor toward the browser window’s close button). This is what they look like:

[Image: example on-site survey pop-up]
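
To make those trigger rules concrete, here's a minimal sketch of the kind of logic a survey tool evaluates in the browser. The thresholds and the showSurvey() helper are assumptions for illustration; in practice, tools like Qualaroo or Hotjar expose these rules as point-and-click targeting settings rather than code.

```typescript
// Sketch of hypothetical display rules: qualify the visitor by engagement,
// then show the survey on exit intent. Thresholds are assumptions.
const MIN_SECONDS_ON_PAGE = 30; // assumed engagement threshold
const MIN_PAGES_VIEWED = 3;     // assumed qualification threshold

const visitStart = Date.now();
const pagesViewed = Number(sessionStorage.getItem("pagesViewed") ?? "0") + 1;
sessionStorage.setItem("pagesViewed", String(pagesViewed));

function qualifies(): boolean {
  const secondsOnPage = (Date.now() - visitStart) / 1000;
  return secondsOnPage >= MIN_SECONDS_ON_PAGE && pagesViewed >= MIN_PAGES_VIEWED;
}

// Exit intent: the cursor leaves through the top of the viewport, roughly
// where the browser's close button and tab bar sit.
document.addEventListener("mouseout", (event: MouseEvent) => {
  if (!event.relatedTarget && event.clientY <= 0 && qualifies()) {
    showSurvey("Is there anything holding you back from completing a purchase?");
  }
});

// Hypothetical helper; a real tool renders its own widget here.
function showSurvey(question: string): void {
  console.log(`Survey shown: ${question}`);
}
```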

What Can You Learn With On-Site Surveys?

We can learn a lot with on-site surveys. For instance, we can:

  • Uncover UX issues
  • Locate process bottlenecks
  • Understand root causes of abandonment
  • Distinguish visitor segments whose motivations differ even when their on-site activity looks the same in analytics
  • Identify demand for new products or improvements to existing products
  • Figure out who the customer is, feeding into accurate customer personas.
  • Decipher what their intent is. What are they trying to achieve? How can we help them do that?
  • Find out how they shop (comparison to competitors, which benefits they seek, what words they use, etc)

Most of all, however, we’re seeking to learn where the friction occurs in the purchasing process. What fears do they have about handing over their credit card number? What doubts do they still have about your product? What’s stopping them from buying – emotionally, functionally or otherwise?

Note: there are other things you can learn from on-site surveys, of course, like NPS. There are tons of good articles on those other purposes, so we’ll just focus on the conversion research side of things.

Using On-Site Surveys To Remove Friction, Increase Conversions

On-site surveys are critical for conversion research. Here’s how Dustin Drees, optimization consultant, put it:


“On-page surveys are great for in-the-moment feedback, which means they’re well suited for pages in your conversion path that underperform. What are the key pages on your site with a high exit percentage? What questions do you need to ask to understand why visitors are dropping off on these pages? You can use these insights to inform your test hypotheses later.”

Take, for example, this case study that Optimizely wrote up on Teespring. Teespring collected qualitative feedback in a variety of ways, including customer surveys, live chat and on-site surveys. Through their research, they discovered that credibility was an issue. Especially because Teespring has an unconventional commerce model (they only ship the shirts once the minimum order size is hit), they needed to bake in extra trust.


When conducting user surveys and collecting feedback, they heard anecdotes like: “Not sure if I should give my credit card information,” and, “Not sure if I’ll get my shirt.”

With this in mind, the team set up a test between two CTAs. The original:

[Image: original CTA copy]

And the variation:

[Image: variation CTA copy]

The subtle change in the variation microcopy ended up increasing conversions by an impressive 12.7%.

Voice of Customer Research Using On-Site Surveys

Another inspiring example comes from our own blog – a case study by Jen Havice and Dustin Drees. In doing conversion research for LearnVisualStudio.NET, they discovered that a large share of respondents (almost two-thirds) considered themselves beginners, and 69.74% said they were “most interested in finding their first developer job.”

They took these insights and formed a basic copy experiment. Here’s the original:

[Image: original homepage copy]

The variation:

[Image: variation homepage copy]

Simply by telling visitors who the lessons are designed for (and where the lessons will get them), they increased conversions on the Courses (+9.2%), Plans and Pricing (+24%), and Curriculum (+23.9%) pages.

They then dug deeper into the survey responses and made a few more changes (below-the-fold copy, CTA, headline, etc.), and their variation ended up outperforming the original by 66.3% on the main call-to-action button above the fold. All of it was primarily fueled by VOC insights from on-site surveys.

Before Anything Else, Define Your Objectives

The effectiveness of your on-site survey strategy hinges on clearly defined business objectives. Otherwise, you’re just wasting your time.

So begin with the end in mind. What goals do you want to achieve? Be specific. As KISSMetrics put it:

“An open-ended statement like “to find out what my customers want” isn’t a concrete goal, because the answers could be all over the map. Questions without a clear objective also make it impossible to create a prioritized list for your team or developers to focus on.”

Dustin Drees considers a lack of clear focus the biggest mistake you can make with on-page surveys. Starting without a clear goal compounds every other inefficiency:


“The biggest mistake I see people making with on-page surveys is not having a clear focus on what knowledge they’re seeking with their research. This is a problem because it leads to the more common mistakes: asking questions that are too broad in scope to produce actionable insights, presenting questions at the wrong time, over-surveying by asking the same question on every page, or asking the wrong questions completely.

Have a clear focus on the reasons for running your survey, so you can identify the right visitors to ask, in the right spots and at the right time.”

When to Pop the Question?

Since you can target when and to whom you’re showing the on-site survey, keep in mind two things when deciding:

  1. Qualifying the visitor (is this a random visitor or someone actually considering purchasing?)
  2. Asking the right question at the right time (e.g. if you ask someone why they didn’t buy right when they land on the site, there will be lots of friction and confusion, and zero insight gained).

Look at your average time on site and page views per visitor: for qualification purposes, ask questions of people who have above-average engagement. One heuristic is to target people just above the average. That way, you’re reaching users who are at least considering a purchase.

Ask the right question on the right page. Don’t ask anything about buying on the home page. Rather, ask that during the checkout funnel. Don’t ask, “why are you here today?” during the checkout. You get the picture.
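
To make the page-by-page idea concrete, here's a minimal sketch of page-aware targeting. The URL patterns, questions, and showSurvey() helper are hypothetical; most survey tools let you set up the same rules through their targeting settings instead of code.

```typescript
// Hypothetical page-aware targeting: ask the question that fits the page
// the visitor is currently on. Patterns and questions are examples only.
const pageQuestions: Array<{ pattern: RegExp; question: string }> = [
  { pattern: /^\/$/, question: "What's the purpose of your visit today?" },
  { pattern: /^\/products\//, question: "Were you able to find the information you were looking for?" },
  { pattern: /^\/checkout/, question: "Is there anything holding you back from completing a purchase?" },
];

// Stub for whatever widget your survey tool renders.
function showSurvey(question: string): void {
  console.log(`Survey shown: ${question}`);
}

const match = pageQuestions.find(({ pattern }) => pattern.test(window.location.pathname));
if (match) {
  showSurvey(match.question);
}
```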

Which Questions to Ask?

Avinash Kaushik once wrote about the “three greatest survey questions ever.” The story goes that, when asked which analytics tool he’d recommend to a VP on a short time frame, he answered that she shouldn’t install an analytics tool at all. Instead, she should install an on-site survey and ask these three questions:

  • What is the purpose of your visit to our website today?
  • Were you able to complete your task today?
  • If you were not able to complete your task today, why not?

He expanded on web survey strategy in another article, explaining that the greatest wisdom is to be gained from open-ended questions:


“Any good survey consists of most questions that respondents rate on a scale and sometimes a question or two that is open ended. This leads to a proportional amount of attention to be paid during analysis on computing Averages and Medians and Totals. The greatest nuggets of insights are in open ended questions because it is Voice of the Customer speaking directly to you (not cookies and shopper_ids but customers).

Questions such as: What task were you not able to complete today on our website? If you came to purchase but did not, why not?

Use the quantitative analysis to find pockets of “customer discontent”, but read the open ended responses to add color to the numbers. Remember your Director’s and VP’s can argue with numbers and brush them aside, but few can ignore the actual words of our customers. Deploy this weapon.”

That said, depending on your strategy, there are many more questions that can bring insight than those three. Think about trying to answer two categories of questions, in general:

  1. Why did they come to the site? Does our site match their needs? If not, are we attracting the wrong traffic? Or is there an opportunity here we aren’t capitalizing upon?
  2. What are the sources of friction? This is more specific than “why they didn’t buy” (understanding that is our main objective, but it takes several smaller questions to see the big picture).

Examples of Questions to Ask

Here are some example questions you could ask (feel free to change the wording as necessary):

  • What’s the purpose of your visit today? (establishes user intent)
  • Why are you here today? (also establishes user intent)
  • Were you able to find the information you were looking for? (can identify missing information on the site – best asked on product pages)
  • What made you not complete the purchase today? (identifies friction – only ask this as an exit survey on checkout pages, and beware that some people are still considering the purchase)
  • Is there anything holding you back from completing a purchase? Y/N (and then ask for an explanation – again, this identifies sources of friction)
  • Do you have any questions you haven’t been able to find answers to? Y/N (identifies sources of friction and missing information on the site)
  • Were you able to complete your tasks on this website today? Y/N (when No is selected, ask “Why not?” – identifies friction and missing info)

There are so many different questions you could ask and get actionable insights; it all depends on your strategic goals.

How to Get More People to Respond

There’s not a magic question that resonates with every audience, so you’ll have to experiment a bit.

Typically, though, there are two ways you can go about setting up your on-site survey:

  1. Ask a single open-ended question.
  2. Ask a simple yes/no question, and ask for an explanation once they’ve answered it.

The second almost always works best for us. As for why, it’s probably because of a psychological principle known as “commitment and consistency” (remember Cialdini?). Once the user starts down the path by answering the easy Y/N question, they’re compelled to continue by following up with an explanation.
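
As an illustration of that two-step pattern, here's a minimal sketch where the open-ended field only appears after the low-effort Yes/No click. The markup and renderSurvey() helper are hypothetical; survey tools implement this with their own widgets.

```typescript
// Hypothetical two-step survey: a Yes/No question first, then an open-ended
// follow-up shown only once the visitor has committed with a click.
interface TwoStepSurvey {
  closedQuestion: string;
  followUpOnYes: string;
}

const survey: TwoStepSurvey = {
  closedQuestion: "Do you have any questions you haven't been able to find answers to?",
  followUpOnYes: "What are they?",
};

function renderSurvey({ closedQuestion, followUpOnYes }: TwoStepSurvey): void {
  const box = document.createElement("div");
  box.innerHTML = `
    <p>${closedQuestion}</p>
    <button id="survey-yes">Yes</button>
    <button id="survey-no">No</button>
    <textarea id="survey-followup" hidden></textarea>
  `;
  document.body.appendChild(box);

  // The easy click comes first; the open-ended field appears afterwards
  // (commitment and consistency in action).
  box.querySelector("#survey-yes")!.addEventListener("click", () => {
    const followUp = box.querySelector<HTMLTextAreaElement>("#survey-followup")!;
    followUp.placeholder = followUpOnYes;
    followUp.hidden = false;
  });
}

renderSurvey(survey);
```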

Here’s an example: we asked two questions that got almost the same number of views, but one got significantly more responses:

[Image: response counts for the two questions]

What was the difference?

  • Winning question (517 responses): Do you have any questions you haven’t been able to find answers to? Y/N
  • “Losing” question (182 responses): Is there anything holding you back from making a booking? Y/N

You can’t really extrapolate any universal lessons from the wording here other than this: You need to experiment with different wordings. Some work better than others with specific audiences.

Here’s another example. We asked the following three questions:

  1. Why didn’t you complete a purchase today?
  2. Is there anything holding you back from making a purchase today?
  3. Do you have any questions you haven’t been able to find answers to?

Can you guess which question yielded which result? The results are below (pay attention to the views-to-responses ratio):

[Image: views and response counts for the three questions]

The first question in the screenshot performed overwhelmingly better than the others. Which questions lined up with which results?

  • The first question (the winner) was “Is there anything holding you back from making a purchase today?”
  • The second was “Do you have any questions you haven’t been able to find answers to?” (the worst performer).
  • The last was “Why didn’t you complete a purchase today?”

So the takeaway? There are no universal questions that will resonate with all audiences. Our previous winning question performed worst on this audience. Keep experimenting.

Sometimes small changes in phrasing can make a big difference in response rate. For instance, Groove changed their question from “why did you cancel?” to “what made you cancel?” and got nearly double the responses.

No one can know for sure why the increase occurred, but they posit it was because the former was accusatory and put the customer on the defensive. Either way, small changes can sometimes produce big gains.

Mind Your Cognitive Biases

Just as customer surveys are susceptible to cognitive biases, so are on-page surveys. In fact, cognitive bias can creep into both parts of the on-page survey process: creating the survey and analyzing the results.

For example, confirmation bias causes you to focus too heavily on responses that back up what you already think (and ignore all those that say otherwise). Dustin Drees points out how to avoid some of the common blind spots:

“Confirmation bias is always a present risk, as early as deciding what survey you want to run, as well as how and what you ask your site visitors. It is hard to avoid preconceptions of what you expect to be wrong with a page, which can skew your questions and lead you to miss signals in the responses. Before running your survey, have someone not invested in the outcome review your questions.

When analyzing the results, remember that the visitors who participated also inevitably had to choose to participate, which means they don’t represent your entire audience, but only a segment of your audience. This is called voluntary response bias. It is nearly impossible to avoid this bias, but you can minimize the effect of it by being sure you’re asking questions of visitors in only certain moments; asking questions of repeat visitors for instance can be more reliable than asking questions to every site visitor, or presenting your survey at key bottlenecks in your conversion path.”

Using On-Page Surveys to Fuel a Radical Redesign: A Case Study

If you’re going to do a radical redesign, at least gather some data first.

That’s what Crazy Egg did a few years ago. They had an outdated website with inadequate product information, and they wanted to design a better experience for users.

Before any redesign took place, they used customer surveys and frequent testing to gather insight. First, they used email surveys to ask current users their thoughts on the product. Then they implemented on-page surveys.

The on-page surveys appeared on multiple pages of their site and asked questions tailored to the page on which they appeared.

For instance, if a visitor was about to leave the pricing page, a survey would ask what caused them to ultimately not purchase the service. The hiring page asked applicants which other tools they had tried, which helped Crazy Egg benchmark the competition.

All of the insights, along with those from tests and heat maps, factored into their redesign. The changes increased conversions by 21.6% and decreased their bounce rate by 13%.

For reference, here’s their original pricing page:

[Image: original Crazy Egg pricing page]

The updated version:

[Image: updated Crazy Egg pricing page]

MarketingSherpa gave the surveys credit for the following insights:

  • Objections people had before signing up for the product
  • How the most valuable customers described the product
  • Features customers were interested in
  • Whether or not customers knew about certain product features
  • Net promoter scores to find the most satisfied customers
  • Differences between satisfied and unsatisfied customers

As we’ve written about before, if you’re going to do a radical redesign, conduct full conversion research beforehand. On-site surveys are an important part of that.

On-Site Survey Tools

The tool should be a secondary consideration to your strategy. Still, there are plenty of options when it comes to tools (or read our huge conversion optimization tools guide, all reviewed by experts).

Conclusion

On-site surveys are a critical piece in the conversion research puzzle. They help you identify and remove friction from the purchasing process, and increase conversions as a result.

Because you’re polling live traffic, and because of the advanced targeting and segmenting capabilities of survey tools, you can do a lot of different things with on-site surveys. In this article, we focused on conversion research, primarily on gathering insights to inform experiments. But you can also use on-site surveys for other things, like NPS.

But, make no mistake: just throwing up a HotJar exit survey with a random question does you no good at all. You must have a very clear goal in mind with your qualitative research, a goal that will inform your execution of the surveys. In addition, keep these ideas in mind:

  • Ask the right question at the right time (important!)
  • Experiment to maximize response rate
  • Mitigate cognitive biases
  • Make sure your on-site surveys fit into and inform the rest of your conversion research efforts.
