The Effects of Highlighting a "Recommended" Pricing Plan [Original Research]

Previously, CXL Institute published research on the order of pricing plans. This study on the effects of highlighting a particular pricing plan continues that work. It uses the same experimental design, except that here we explicitly test a new variable: highlighting a plan with a different background color.

Similar to the first study, we manipulated the pricing page of a survey tool, SurveyGizmo, to see whether user perception and preference (choice of plan) follow different patterns across layout designs (price plan order) when one particular plan is highlighted.

Results summary

  • People viewed the page in an "r"-shaped pattern, reading features and prices across the top, no matter what order the plans were in or which plan was highlighted.
  • When plans were ordered cheap-to-expensive, participants fixated on the highlighted plan more quickly and for longer.
  • Participants chose the PRO plan more often in the expensive-first plan order, and when it was highlighted.

How do I apply this research?

While not surprising, the results of this study indicate that highlighting does draw attention to a plan, particularly when that plan sits off to the right (as an expensive plan does when plans are ordered cheap-to-expensive). Overall, the results of this two-part series suggest testing an expensive-to-cheap plan order and highlighting your 'recommended' plan.

Background

The background for this study is covered in Part 1 of this series: Pricing Page Study (Part 1) Effects of Plan Price Order.

In Part 2, we add the concept of recommending a particular plan to the user via a simple color difference, and focus solely on two pricing plan orders: cheap-to-expensive (low-to-high) and expensive-to-cheap (high-to-low).

Study Report

Data Collection Methods

We used the pricing page of a survey tool (SurveyGizmo). Since many people never need a survey tool like SurveyGizmo, participants were unlikely to be biased toward the site (they'd probably never heard of it). At the same time, the idea of surveying many people, and therefore the scenario we'd present to participants, is easily understood, which helped in setting a motivating scenario for participants.

Here is a screenshot of the original pricing page (note: if you check out the current pricing table on SurveyGizmo, you'll notice it's already been changed just a month after our test!):

Screenshot of the original SurveyGizmo pricing page, plans ordered cheapest to most expensive from left to right (cheapest-first).

Here are the two variations tested for this experiment, with the 'Pro' package highlighted in both the cheap-to-expensive variation and the expensive-to-cheap variation:

Screenshot of the modified pricing table for SurveyGizmo, ordered cheap-to-expensive from left to right, with the 'Pro' package highlighted with a green background color.
Screenshot of the modified pricing table for SurveyGizmo, ordered expensive-to-cheap from left to right, with the 'Pro' package highlighted with a green background color.

Our methods included a task that participants were prompted to complete, eye-tracking equipment to analyze viewing patterns, and a post-task survey question.

Note: For Part 2 of this study, we improved the validity of our post-task survey question by adding more participants (without the eye-tracking component). Here are our sample size numbers:

Table of participants according to plan variation

For the task, we wanted participants to examine each plan’s features closely. To ensure a thorough analysis of plans, we asked them to find features that only 2 of the 4 plans offered (chat support and analysis tools).

We asked:

“SCENARIO: Imagine you own a medium sized business and are in need of an ONLINE SURVEY TOOL that offers CHAT SUPPORT and ANALYSIS TOOLS.

The next screen will show pricing options for a survey tool.

Browse and compare options to choose a package that fits your needs.”

After viewing the web page, we asked which plan they would choose.

Using the task and the post-task survey question, we intended to reveal whether the layout of plans affected how users consume information in the table and, if so, whether this difference influences the plan they would ultimately choose.

Disclaimer – Oftentimes, what people say they'd do and what they would actually do are quite different. However, there's value in understanding participant feedback, since it can inform decisions on sites that don't have the traffic required for A/B testing, and, for sites that do, the specifics of what to test.
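As a rough illustration of what "the traffic required for A/B testing" means in practice, here's a minimal sketch of a pre-test sample-size estimate using Python's statsmodels. The baseline conversion rate and target lift are hypothetical placeholders, not figures from this study:

# A minimal sketch of estimating per-variant traffic for an A/B test.
# The baseline rate and lift below are hypothetical, not from this study.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline = 0.05  # hypothetical baseline conversion rate (5%)
target = 0.06    # hypothetical rate we want to detect (a 20% relative lift)
effect = proportion_effectsize(baseline, target)

n_per_variant = NormalIndPower().solve_power(
    effect_size=effect,
    alpha=0.05,  # significance level
    power=0.80,  # chance of detecting the lift if it's real
    ratio=1.0,   # equal traffic split between control and variant
)
print(f'Visitors needed per variant: {n_per_variant:,.0f}')

At a 5% baseline, detecting a 20% relative lift at 80% power takes several thousand visitors per variant, which is why lower-traffic sites often lean on qualitative feedback like this study's.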

Findings

Eye-Tracking Results

Here’s a view of the two variations showing the average order in which people viewed each area of interest (the pricing plans and features offered).

SurveyGizmo pricing plan page variations with indication of the order of first viewing for each plan. The left variation is the 'cheap-to-expensive' order, and the right variation is the 'expensive-to-cheap' order.

As in Part 1 of this study, the aggregate transparency map tells us that people view pricing tables in the same overall pattern. Keep in mind, however, that the plans are in different orders, so while the viewing patterns are the same, people view the individual plans in different sequences.

Here are our summary stats for the PRO pricing plan, the plan we're focused on, including the survey responses on which plan participants would choose:

Summary stats for eye-tracking and survey results on highlighted vs. non-highlighted PRO plans

Time to first fixation did not differ significantly between the highlighted and non-highlighted variants of the expensive-first plans, but it did for the cheap-first plans (p-value = 0.0171); see the stats below:

Differences in mean time to first fixation for highlighted vs. non-highlighted PRO plan.
Takeaway – When plans were ordered cheap-to-expensive, participants fixated on the highlighted plan more quickly and for longer.
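The write-up doesn't name the test behind the p = 0.0171 figure; a two-sample Welch's t-test on time-to-first-fixation is a common choice for this kind of comparison, so here's a minimal sketch under that assumption. The arrays are invented placeholders, not the study's raw eye-tracking data:

# A hedged sketch of comparing time-to-first-fixation between groups.
# These fixation times (in seconds) are invented placeholders.
from scipy import stats

highlighted = [1.9, 2.4, 1.7, 2.1, 2.6, 1.8]      # hypothetical: highlighted PRO plan
non_highlighted = [3.1, 2.8, 3.5, 2.9, 3.3, 3.0]  # hypothetical: non-highlighted PRO plan

# Welch's variant doesn't assume equal variances between the two groups
t_stat, p_value = stats.ttest_ind(highlighted, non_highlighted, equal_var=False)
print(f't = {t_stat:.2f}, p = {p_value:.4f}')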

Time spent fixating did not differ significantly among any of the groups. However, there were differences in the survey results among the plans.

Overall, the expensive-first plan order resulted in more people choosing the PRO plan. Here are the comparisons of the highlighted vs. non-highlighted variations:

Chi-square goodness-of-fit test results for plan selection by participants among the highlighted vs. non-highlighted variations.
Takeaway – Participants chose the PRO plan more often in the expensive-first plan order and when it was highlighted.
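For readers who want to run the same kind of check on their own survey counts, here's a minimal sketch of a chi-square goodness-of-fit test, the test named in the figure above. The counts are hypothetical placeholders, not this study's data; the null hypothesis is that all four plans are chosen equally often:

# A hedged sketch of a chi-square goodness-of-fit test on plan choices.
# Observed counts are invented placeholders, not the study's survey data.
from scipy import stats

observed = [34, 18, 12, 11]         # hypothetical choice counts across the four plans
expected = [sum(observed) / 4] * 4  # null hypothesis: every plan equally likely

chi2, p_value = stats.chisquare(f_obs=observed, f_exp=expected)
print(f'chi2 = {chi2:.2f}, p = {p_value:.4f}')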

Limitations

Limitations here are the same as in the Part 1 study:

  • Our results were obtained from study participants who didn't actually need to purchase a survey tool. This issue is the Achilles heel of user testing: hypothetical situations don't always translate into real-life situations. It's possible that people who actually were shopping for a survey tool would view the pricing table differently. However, we tried to control for this with a proper 'scenario' capturing accurate motivation. With a proper sample size, we saw differences in responses among groups, which provides supporting evidence as to what works and what doesn't for pricing table order/layout. As always, TEST.
  • While providing participants with a pre-defined task scenario provides uniform motivation, it may also limit our interpretation of the results, since the preference for a certain plan could be biased by our particular scenario prompt. It's even possible that the differences in plan preference we observed would be more significant (or totally different) if we gave participants a different task (e.g., asking them to look for survey features available with every plan).
  • Also, note that our results are limited to one type of product in a particular price range. We're curious to see how these results apply across different product types (e.g., subscription vs. SaaS, informational/digital products vs. physical products) and at different price points (e.g., cheap products vs. high-end expensive products).

Conclusion

While people often do differently than they say, this research shows quite clearly that highlighting a plan draws more attention to it. The results of this study, combined with Part 1, where we studied pricing plan order, suggest testing an expensive-to-cheap plan order and highlighting your 'recommended' plan.
