Pricing Page Optimization: How to Order Pricing Plans [Original Research]


How do you order your pricing page: Cheap-to-expensive? Expensive-to-cheap? Randomly?

This study, conducted through CXL Institute, is the first in a multi-part series on pricing pages, providing data on how people consume pricing plans depending on the layout of the plans.

For this first study, we manipulated the pricing page for a survey tool, SurveyGizmo, to see if there are different patterns of user perception and preference (choice of plan) for various layout designs.

Results summary

  • Generally, users processed all pricing tables the same way regardless of plan layout: They spent the most time looking at the first two pricing plans listed in a left-right order.
  • People looked at the expensive plans sooner and read about them for longer when those plans were placed on the left side of the table, listed first or second.
  • Participants chose expensive packages most often when they were listed first.

How do I apply this research?

Apply the findings from this study to your own pricing table: list your most expensive plan first in a left-to-right order, then test it. Run an A/B test, and let us know how it goes.
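
Suppose, for illustration, that you split traffic between your original cheap-first order and an expensive-first variant and count how many visitors in each group pick your most expensive plan. A common way to judge the result is a two-proportion z-test. Here's a minimal sketch in Python using SciPy; all counts below are hypothetical placeholders, not data from this study.

```python
from math import sqrt
from scipy.stats import norm

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                        # pooled proportion under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))  # standard error under H0
    z = (p1 - p2) / se
    return z, 2 * norm.sf(abs(z))                         # z statistic, two-sided p-value

# Hypothetical counts: visitors who picked the expensive plan in each variant
z, p = two_proportion_ztest(x1=48, n1=400, x2=31, n2=400)
print(f"z = {z:.2f}, p = {p:.4f}")
```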

Keep in mind: User testing doesn’t always perfectly translate into reality. What people say they’d do and what they actually do often don’t match. Also, we studied user behavior for a survey platform; we cannot guarantee that our results will apply to other types of products and price ranges. Stay tuned for A/B testing case studies and research on these variations of product and price.

 

Background

The pricing page is a crucial part of the sales funnel for a business. It’s where the customer sees the details about what they get and what they have to pay. We’ve written about it frequently, including its core principles, subscription plan pricing, and even a post reviewing other websites’ pricing tables.

There is a very common way to present pricing plans: cheap-to-expensive, left-to-right. According to a website review by Process.st, 81% of the SaaS companies on the ‘Montclare 250’ that list prices order them from cheapest to most expensive. It’s a logical way to present things…but is it data-driven, and does it lead to more conversions? In this study, we attempt to answer that question.

We tested different pricing plan layouts in this study to get data on how different layouts affect user perception. To gather a wide range of data, we used task scenarios, eye-tracking, and post-task survey tools to understand how users generally consume information on a pricing table, and how the design of the table may influence which plans they end up choosing.

Study Report

Data Collection Methods:

We used the pricing page for a survey tool (SurveyGizmo) as the research subject. Since many people (outside the optimization world) probably haven’t heard of SurveyGizmo, they would be unbiased toward the site. However, the idea of surveying many people, and therefore the scenario we’d present to participants, is easily understood.

Here is a screenshot of the original pricing page (Note: If you check out the current pricing table on SurveyGizmo, you’ll notice it’s already been changed just a month after our test!):

Screenshot of the original SurveyGizmo pricing page, plans ordered cheapest to most expensive from left to right (cheapest-first).

Here are the two variations: the primary alternative (expensive-to-cheap) and, for comparison, a variant that mixes the plans:

Screenshot of the modified pricing table for SurveyGizmo, ordered from most expensive to cheapest, left to right (expensive-first).
Screenshot of the modified pricing table for SurveyGizmo, with plans ordered randomly (mixed).

Our methods included a task that participants were prompted to complete, eye-tracking equipment to analyze viewing patterns, and a post-task survey question.

For the task, we wanted participants to examine each plan’s features closely. We asked them to find features that only 2 of the 4 plans offered (chat support and analysis tools).

We asked:

“SCENARIO: Imagine you own a medium sized business and are in need of an ONLINE SURVEY TOOL that offers CHAT SUPPORT and ANALYSIS TOOLS.

The next screen will show pricing options for a survey tool.

Browse and compare options to choose a package that fits your needs.”

After viewing the web page, we asked which plan they would choose.

Using the task and the post-task survey question, we wanted to find out whether the layout of plans affects how users consume information on the table. If so, does the difference influence the plan they would ultimately choose?

Findings

Eye-Tracking Results

Here is an eye-tracking map animation for each plan layout:

Pricing plan eye-tracking gif across pricing plan variations. Left: cheap-to-expensive plan order; Middle: mixed plan order; Right: expensive-to-cheap plan order.

Notice that the general pattern is the same for all variants: People tend to start viewing in the middle of the page and then gaze slightly left.

Here’s another view of the variations showing the average order in which people viewed each area of interest (the pricing plans and features offered).

SurveyGizmo pricing plan page variations with an indication of the order of first viewing for each plan.

Again, this tells us that the same overall pattern of viewing the pricing table exists. Keep in mind, though, that the plans are in different orders, so people are seeing the plans themselves in a different sequence.

Takeaways – Participants processed the table the same way regardless of how plans were ordered: They noticed the two plans listed first (in left-right order) and spent the most time on them, regardless of which plans they were or what they cost. The first two plan slots got the most attention.

Let’s look at some numbers:

Because we have 5 areas of interest (each plan and the list of features) across 3 variations, there are lots of possible comparison combinations. However, we’re going to focus on viewing patterns of the 2 most expensive plans (Pro & Enterprise) as this gives a ‘price-anchor’ benchmark among the variations (see the Price Anchoring / Contrast principle section of the Pricing & Pricing Pages Lesson on the CXL optimization course).

Here are our summary stats for the PRO and ENTERPRISE pricing plans:

Summary eye-tracking stats for Pro and Enterprise plan areas of interest

There was a significant difference among layouts in mean time to first fixation [F(2,136) = 12.6881, p < 0.001] and total time fixating [F(2,136) = 7.419, p < 0.001].
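
For reference, F-statistics like these come from a one-way ANOVA comparing a measure (e.g., time to first fixation) across the three layout groups. Here's a minimal sketch in Python with SciPy; the samples are simulated placeholders, not our raw measurements, with group sizes chosen only so the degrees of freedom match the reported F(2,136).

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(42)

# Simulated time-to-first-fixation samples in seconds, one array per layout.
# Group sizes 46 + 46 + 47 = 139 observations give df = (3-1, 139-3) = (2, 136).
cheap_first     = rng.normal(loc=3.5, scale=1.0, size=46)
mixed           = rng.normal(loc=2.8, scale=1.0, size=46)
expensive_first = rng.normal(loc=2.2, scale=1.0, size=47)

f_stat, p_value = f_oneway(cheap_first, mixed, expensive_first)
print(f"F(2,136) = {f_stat:.3f}, p = {p_value:.5f}")
```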

Takeaway – Participants fixated on the expensive plans sooner and spent more time reading about them when those plans were placed on the left side of the table, listed first.

Survey Results

Remember, we asked:

“Considering your needs, which package would you choose?”

Percentages of participants who chose each plan for each of the three page variations. Note: the mixed variation had the Pro plan listed first.

We see that the mixed and expensive-to-cheap variations had more people choosing the PRO package. Note that the mixed variation had the Pro plan in the first slot (furthest left) and the expensive-to-cheap variation had the Pro plan in the second slot.
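
To check whether plan choice depends on layout overall, a chi-square test of independence on the choice counts is the standard tool. A sketch, again with hypothetical counts rather than our actual tallies:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical choice counts (rows: layout variants; columns: the four plans
# from cheapest to most expensive).
choices = np.array([
    [20, 15,  8, 3],   # cheap-to-expensive
    [12, 14, 16, 4],   # mixed (Pro listed first)
    [10, 13, 18, 5],   # expensive-to-cheap
])

chi2, p, dof, expected = chi2_contingency(choices)
print(f"chi-square({dof}) = {chi2:.2f}, p = {p:.4f}")
```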

Takeaway – Participants chose more expensive packages more often when those packages were listed first, or furthest left in left-right order.

Limitations

Our results were obtained from study participants who didn’t actually need to purchase a survey tool. This issue is the Achilles heel of user testing: Hypothetical situations don’t always translate into real-life situations.

There is a possibility that individuals who were actually shopping for a survey tool would view the pricing table differently. However, we tried to control for this with a realistic ‘scenario’ meant to capture genuine motivation. With an adequate sample size, we saw differences in responses among groups, which provides supporting evidence as to what works and what doesn’t for pricing table order and layout.

While providing participants with a pre-defined task scenario provides uniform motivation, it may also limit our interpretation of the results, since preference for a certain plan could be biased by our particular scenario prompt. It’s even possible that the differences in plan preference we observed would be larger, or entirely different, if we gave participants a different task (e.g., if we asked them to look for survey features that were available with every plan).

Also, note that results are limited to one type of product with a particular price range. We’re curious to see how these results apply across different product types (e.g., Subscription vs. SaaS, informational/digital products vs. physical products) and at different price points (i.e., cheap products vs. high-end expensive products).

Conclusion

While most SaaS pricing pages list plans from cheapest to most expensive, we found that users were more likely to prefer more expensive plans when they were laid out the opposite way: most expensive to cheapest.

No matter what layout we presented, users spent the most time looking at the first two pricing plans listed in a left-right order. So, if the cheapest plan was presented first, that’s what they spent the most time looking at. The same goes for the most expensive plan.

While user testing does not always translate to behavioral reality (what people say isn’t always what they do), these results suggest that you should order your pricing plans from high to low. In any case, it’s worth an A/B test.


Join the Conversation

  1. Argh! Another study that’s missing the most vital ingredient: how this translates to an increase in revenue.

    Additionally, we’ve known for quite some time that users read a web page in an F shape. Your research is another confirmation that this is the case.

    1. Ben Labay

      Hey Chris, we’re going to be sharing some CRO case studies soon, but this series of studies is meant to be on the UX and perception side of the story, getting data on things helps to improve testing hypotheses and illustrates testing ideas and techniques that people can run on their own sites. Don’t be frustrated! Getting data on user perception can’t hurt your testing theories. Also, F-patterns generally apply within text blocks but not always among them…see our study about that here – https://conversionxl.com/how-people-view-search-results/ Thanks for the comment! Feel free to suggest some studies you’d like to see. Cheers, Ben

  2. Like the research, only it is missing the conversion element.
    The most interesting question about pricing plans is: what functionality should each plan contain to make people choose the high-revenue option?

    I do recall that I was present at a presentation of such a test at the Conversion Camp 2014 by Pritt or Peep?

    Furthermore, you have the ‘ugly brother’ option, which I did some tests on for a major credit card company. The hypothesis there is: adding a ‘bad’ deal to a set of three on purpose will make people shift to the premium option.

    1. Ben Labay

      Hi Michiel, yep, no conversion element here…but hopefully someone out there could share a case study on this?! I like the ‘ugly-brother’ idea…sort of an anchoring tactic. Thanks for the comment, cheers, Ben

  3. Hey Ben! Our biggest challenge with laying out a pricing page is the fact that our plans are based on subscriber count (we’re an email marketing service). So it’s not really “optional” as to which plan you choose but it’s entirely based on how many subscribers you have at any given time. In this scenario, is a pricing page better treated as a features page when all features are the same no matter the price point? Would love to dig into examples like that more (I realize that wasn’t the intention of this post but I clicked through as we’re in the throes of redesigning our pricing page now).

    1. Ben Labay

      Hey Val, yeah, in your case you don’t have multiple plans…just one that changes price & features per usage. I think there might be some takeaways for you here in terms of how people process a grid of feature data, but you’re right it’s not a good fit. We’re getting good feedback on this study in the sense that it seems like people have a lot of questions and interest in the subject, so we’ll look to possibly follow this up with different formats. Thanks for the comment! Cheers, Ben

  4. Appreciate the post Ben,
    … and the level of detail you included for readers regarding your approach.

    Indeed, without the conversion (real-paying-customers) element, the primary take-away for readers is to prioritize an ongoing testing effort … and this is just one test worth prioritizing among others.

    For confidentiality reasons I am limited in what I can share, but I assure your readers that after numerous tests like this across many different markets, all lifecycle stages, and multiple price/value ranges… it quite simply depends.

    I have seen positive outcomes and winning treatments using (I) High > Low, (II) Low > High and (III) Hybrid approaches.

    Reminder:
    “It’s not the price they don’t like, but what they understand they are (or are not) getting for that price.”

    Factors to proactively consider are the click/view paths and the value messaging / framing you present along the way.

    “What matters most .. is not personal preference .. but rather, what performs best.”

    If your team is not disciplined enough to maintain a regular testing cadence, I highly recommend investing in a service like ConversionXL … even if just to establish the competency within your org, and then you take it from there.

    Respectfully,

    Chris Hopf
    @pricing

    1. Ben Labay

      Hey Chris, thanks so much for the thoughtful comment, good stuff. What people don’t always appreciate is that experimentation and testing, and science generally, are full of ‘it depends’. Experimentation is ultimately suggestive, not definitive. Only after building a body of testing can we have robust theories. So that said, we need more small studies like this, and we need to illustrate how testing like this can be easily implemented. We’re trying to push these experiments, and sometimes they’ll hit and sometimes they’ll miss…but this type of work needs to get out there! If you have a client that is OK to share a case study, let me know! And thanks again for your perspective, Cheers, Ben
