Just how bad is a multi-column form layout? This short study, conducted through ConversionXL Institute, compares form completion time on a single-column form vs. a multi-column form.
Will the same questions with a different layout (one column versus multiple columns) result in different completion times?
When people weigh choices, the Presenter’s Paradox says they do so by averaging (not adding) the value of each item in a package.
This means that adding more items to a list or more products to a bundle can reduce the overall perception of value (if the added items are deemed less valuable).
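A quick numeric sketch makes the averaging-vs-adding distinction concrete. The dollar values below are hypothetical, chosen only to illustrate the effect:

```python
# Hypothetical perceived values (in dollars) for a two-item bundle.
premium_item = 100  # high-value headline product
cheap_addon = 10    # low-value extra tossed in "for free"

# Additive model: the bundle is worth the sum of its parts.
added = premium_item + cheap_addon  # 110

# Presenter's Paradox: people average instead, so the cheap
# add-on drags down the perceived value of the whole bundle.
averaged = (premium_item + cheap_addon) / 2  # 55.0

print(f"Added: ${added}, Averaged: ${averaged}")
```

Under the additive model the add-on can only help; under the averaging model, anything worth less than the current average actively hurts the package.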
Research on this phenomenon is fairly scarce, though, so we decided to conduct a study through ConversionXL Institute.
We provide 3 perspectives:
- We outline what products and lists two academic studies have tested,
- We duplicate a product and list test with a larger sample size to try to replicate the findings, and
- We then apply the test to six new products: three experiential products (travel package, hotel night, massage) and three physical products (camera, printer, kitchen mixer).
When internet users share private information, they want to feel safe doing so.
One of the most popular ways to convey security on a website is by using trust badges (also referred to as “trust logos” or “site seals”).
Previously, CXL Institute published research we did on the order of pricing plans. This study on the effects of highlighting particular pricing plans is a continuation of that study. It has the same experimental design, except here we explicitly test a new variable – highlighting a plan with a different background color.
Similar to the first study, we manipulated the pricing page for a survey tool, SurveyGizmo, to see if there are different patterns of user perception and preference (choice of plan) for various layout designs (price plan order) when one particular plan is highlighted.
How do you order your pricing page: Cheap-to-expensive? Expensive-to-cheap? Randomly?
This study, conducted through CXL Institute, is the first of a multi-part pricing page study providing data on how people consume pricing plans depending on the plan’s layout design.
For this first study, we manipulated the pricing page for a survey tool, SurveyGizmo, to see if there are different patterns of user perception and preference (choice of plan) for various layout designs.
Many people in the marketing space are trying to figure out how to best present their value proposition. Which copy works best? Which design?
These are important questions because your value proposition is such a high-impact area of your site – some would say the most important part.
So even though many people are working on and researching value proposition presentation, we thought there was still room to investigate, so we conducted a study through CXL Institute.
This study manipulates the value proposition of a financial-services SaaS website, and uses eye-tracking and survey tools to test the relative effectiveness of the value proposition variations.
When designing the landing page for CXL Institute, we conducted an experiment regarding our explainer video.
We wanted to find out how “trustworthy” and “attractive” different voices were perceived to be. In this CXL Institute study, we tested four different voices, which differed by gender and by whether the speakers were professional voice actors.
The question is, did it make a difference in how people perceived our video content? Yes, and the results were somewhat surprising.
When shopping online, you can’t hold the product, test it out, or talk to a salesperson about how different brands compare to one another. For these scenarios, social proof is frequently used to guide shoppers towards the best product choice.
Which brings us to the real question: Which social proof techniques are most effective? Are some of them totally ineffective?
This study from CXL Institute explores how different forms of social proof are perceived (with eye-tracking), and then how they are recalled (with post-task survey questions).
Designers and conversion optimizers use visual cues to guide users in a particular direction on a web page. Maybe you want a user to keep scrolling, or to look at a value proposition, so you add a visual cue to subtly guide them there.
However, when you consider the vast number of different visual cues available, things become complicated.
You could use arrows, lines, photos of people, borders, pointing fingers, bright banners, exclamation points, check marks… The list goes on.
Which brings us to the real question: Are some visual cues more effective than others? This CXL Institute study explores that question.