Case Study: How We Improved Landing Page Conversions by 79.3%

It took us 6 rounds of testing to land on a variation that performed 79.3% better than the version our client had before coming to us.


TruckersReport is a network of professional truck drivers, connected by a trucking industry forum. It’s a great community for drivers to share information and assist in each other’s careers. I guess it’s what you would call a niche site—but that niche is bringing TruckersReport over 1,000,000 visits each month (close to 5 million pageviews).

One of the services they provide to their community is helping truck drivers find better job opportunities. Truck drivers fill out a one-time online resume and then choose between offers from pre-screened trucking companies.

This was the landing page they had, our starting point:


This landing page was converting at 12.1% (email opt-ins).

What followed after the initial landing page was a 4-step online resume building flow. The primary task at hand was to increase landing page conversions—to widen the funnel at the top—but also to increase overall funnel conversions.


In addition to heuristic analysis, we analyzed the current page using Google Analytics, set up mouse tracking data collection (click data, scroll maps, attention heat maps), and started recording user session videos via SessionCam.

Next, we wanted to understand the audience better. We ran an online survey using Google Docs with the goal of getting inside the heads of the truck drivers. Why were they looking for a new job? What mattered most about truck driving jobs, and what were the main motivators? What were their main hesitations and questions when considering a new job offer?

These were the top factors we focused on:

  • Mobile visits (smartphones + tablets) formed about 50% of the total traffic. Truck drivers were using the site while on the road! –> Need responsive design
  • Weak headline, no benefit –> Need a better headline that includes a benefit, addresses main pain-points or wants
  • Cheesy stock photo, the good old handshake –> Need a better photo that people would relate to
  • Simple but boring design that might just look too basic and amateurish –> Improve the design to create a better first impression
  • Lack of proof, credibility –> Add some
  • Drivers wanted 3 things the most: better pay, more benefits and more home time. Other things in the list were better working hours, well-maintained equipment, respect from the employer. Many were jaded by empty promises and had negative associations with recruiters.

Armed with these insights, we proceeded.

New design

This was the new, fully responsive design we created.


We didn’t want to change the layout dramatically, so we could better isolate user issues; heat maps and user session replay videos showed that the previous layout worked well in terms of usability.

Why this layout:

  • Prominent headline that would be #1 in visual hierarchy
  • Explanatory paragraph right underneath to explain what the page is about
  • Large background images tend to work well as attention-grabbers
  • Warm, smiling people that look you in the eye also help with attention
  • Left side of the screen gets more attention, so we kept copy on the left
  • As per the Gutenberg diagram, the bottom right is the terminal area, which explains the form and call-to-action placement.

In the process we also re-designed the funnel steps (also fully responsive).


Test #1

Hypothesis: Fewer form fields = less friction and hassle to fill out the form, resulting in more conversions.


Results: The control beat the variation by 13.56%.

Insights: Although short forms usually outperform long forms, this did not apply here, and more testing is needed to figure out why. One hypothesis: the added fields give the form more credibility and relevancy, while a plain email field can look “spammy.”

Test #2

Hypothesis: Copy that addresses the most common problems truck drivers are facing, using the wording they actually use (taken from the customer survey), will resonate better with the audience.

We crafted a new headline plus bullet points full of benefits, and addressed other issues that came up in the survey.

Results: While there was no difference in landing page opt-ins, there was a 21.7% difference in the bottom of the funnel conversions—original won. People lured in by big promises were less motivated to go through the whole 5-step funnel.

Insights: Short, straight-to-the-point language can work. Too many promises can look like hype, or attract the wrong kind of people.

Test #3

In the first 2 tests, the average conversion rates had been similar to the original landing page. But since traffic is ever-changing, we decided to test the original landing page vs. the new landing page to make sure the new design was actually improving results.

In addition, we wanted to test the absence of a “job match” page. By default, people who completed the opt-in landed on this page, which showed an animation meant to create a sense of progress happening in the background:


The idea behind having this page was to help boost bottom of the funnel conversions, but Google Analytics showed a 10.8% drop-off rate on this page. So we wanted to test whether the supposed boost was worth losing those people.

Results: Variation #1 (the new landing page) resulted in 21.7% more opt-ins than the control at a 99.7% confidence level, and 24% more signups from the whole funnel. The “job match” page did not help improve bottom of the funnel conversions, so we decided to remove it.
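
For context on the confidence levels reported throughout these tests: testing tools typically compare two conversion rates with a two-proportion z-test. A minimal sketch of that math (the visitor and conversion counts below are made up for illustration, not the actual test data):

```python
from math import sqrt, erf

def ab_confidence(conv_a, n_a, conv_b, n_b):
    """One-sided confidence that variation B beats control A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 0.5 * (1 + erf(z / sqrt(2)))       # standard normal CDF of z

# Illustrative counts only, not the real test data:
print(ab_confidence(conv_a=280, n_a=2400, conv_b=350, n_b=2400))  # ~0.999
```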

Test #4

We wanted to test more headlines.


  • Original: “Get a truck driving job with better pay”. Straightforward.
  • Question: “Looking for a truck driving job with better pay?” The idea here is that people automatically answer a question in their mind when they read one.
  • 3 main benefits: “Better Pay. More Benefits. Respect for drivers.” These benefits came from the survey as the 3 most important priorities to the audience.
  • But you are free: “You can get a driving job with better pay. But, of course, you are free to choose.” This plays on the psychological phenomenon of “autonomy,” which research has widely shown to increase persuasiveness.

Results: The control outperformed all the variations. The original headline beat the second-best variation—“You are free to choose”—by 16.2%.

Insight: A simple, straightforward approach works best for this audience. So the question is—how can we use this insight to make the page even simpler?

Test #5

Building on the “simple” insight from the previous test, we created a shorter, simpler version of the page:


Results: Variation #1 with a shorter page layout and less copy outperformed the control and resulted in 21.5% more opt-ins at a 99.6% confidence level.

Insight: The learnings from previous tests held up—a shorter layout and less copy resulted in more opt-ins. How could we now make it even simpler?

Test #6

We had many different hypotheses on how to simplify the page even more.

  • A new design built from the get-go for a more compact layout. Better content presentation typically helps.
  • Remove all fields but the email field (the only mandatory field). Fewer fields typically help.
  • Get rid of the name field and make the email field the last one. The idea here is for people to start with the easy fields (dropdowns), since it’s easier to get going; by the time they reach the hard field—email—they’re thinking “oh well, I already started” (a known psychological phenomenon Cialdini calls “commitment and consistency”), so we’d be riding on momentum.


Results: Variation #3, with no name field and email as the last field, resulted in 44.7% more opt-ins at a 99.9% confidence level.


We achieved a 21.7% conversion rate (the margin of error was 1.48%, with no overlap with the ranges of the other variations), which is 79.3% better than the initial landing page we started with.
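
The 79.3% figure is simply the relative lift of the final 21.7% opt-in rate over the original 12.1% baseline. A quick back-of-envelope check (the visitor count is an assumption for illustration; the post doesn’t state it):

```python
from math import sqrt

baseline, final = 0.121, 0.217       # opt-in rates from the case study
lift = (final - baseline) / baseline
print(f"relative lift: {lift:.1%}")  # 79.3%

# 95% margin of error for a conversion rate p observed over n visitors:
# 1.96 * sqrt(p * (1 - p) / n). The reported 1.48% margin implies roughly
# n ≈ 3,000 visitors in the winning variation (illustrative, not stated).
n = 3000
moe = 1.96 * sqrt(final * (1 - final) / n)
print(f"margin of error: {moe:.2%}")  # ~1.48%
```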

Conclusion: Testing is an iterative process

When you start testing a page, don’t test just once and move on to testing other parts of the site. Don’t think of the process as one-off tests, but as testing campaigns.

Learn from each test, make sure you send test data to Google Analytics and segment the results (I won’t go into detail on that here), and keep iterating. Use insights from previous tests to drive upcoming tests. You won’t know what matters until you test it. Have a lot of patience.
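
As a side note on sending test data to Google Analytics: in the Universal Analytics era this could be done by firing an event per variation, e.g. via the Measurement Protocol. A minimal sketch (the property ID, category, and label below are placeholders, not values from this case study):

```python
import uuid
import requests  # third-party: pip install requests

def report_variation(variation_name):
    """Send the variation a visitor saw to GA as an event, for segmentation."""
    payload = {
        "v": "1",                   # Measurement Protocol version
        "tid": "UA-XXXXXXX-1",      # placeholder property ID
        "cid": str(uuid.uuid4()),   # anonymous client ID
        "t": "event",               # hit type
        "ec": "ab-test",            # event category (placeholder)
        "ea": "landing-page-test",  # event action (placeholder)
        "el": variation_name,       # event label: which variation was shown
    }
    requests.post("https://www.google-analytics.com/collect", data=payload)

report_variation("variation-3-no-name-field")
```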

If we had only tested the control vs. the new landing page we wouldn’t have reached 79.3%—and we’re just getting started.

Join the Conversation

  1. Which A/B testing service did you use?

    1. Peep Laja

      First round was done with Visual Website Optimizer, all the following tests with Optimizely (works better with long funnels).

    2. Can you please expand on why you believe Optimizely works better with long funnels? I tried both and for me, overall, VWO is much more user friendly.

    3. Peep Laja

      I like VWO user interface better too (faster as well), but it reported inaccurate data for funnels.

  2. What do you mean by ‘bottom of the funnel’ conversions? Does that mean sales? If yes, how do you record it?

    Secondly, what are the steps in the funnel? How do you set up the funnels?

    1. Peep Laja

      In this case the funnel consisted of 5 steps: landing page, resume building steps, and finally a signature confirming that everything is correct. So bottom of the funnel conversions would mean people who went through the whole funnel. Top of the funnel is people who opted in from the landing page.

      We used Optimizely to track funnel performance, but of course Google Analytics too.

      Here’s a guide to setting up funnels:

  3. What sort of sample size were you testing these changes with? We’re considering testing similar changes soon & interested in how long it took to see statistical significance.

    1. Peep Laja

      Our criteria are:

      – Test duration of minimum 7 days (or 14 or 21 if statistical confidence is not achieved; it’s important to test full business cycles at a time, e.g. a week)
      – 250 conversions per variation (150 as an absolute minimum if the test would otherwise take a ridiculously long time)
      – Aim for 95% confidence or more to declare a winner (if this isn’t achieved by 250 conversions per variation, you have a failed test, meaning no significant difference between the variations)
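
      Expressed as a simple check, those rules look like this (a sketch; the thresholds come from the criteria above, the function and its inputs are illustrative):

      ```python
      def can_declare_winner(days_run, conversions_per_variation, confidence):
          # Test whole business cycles: at least a week, in week multiples
          full_cycle = days_run >= 7 and days_run % 7 == 0
          enough_data = min(conversions_per_variation) >= 250
          significant = confidence >= 0.95
          return full_cycle and enough_data and significant

      # No winner by 250 conversions per variation = a failed test
      # (no significant difference between the variations).
      print(can_declare_winner(14, [310, 295], confidence=0.997))  # True
      ```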

  4. Working with a client that has 1MM page views/month gives you lots of visitors to study. It’s another matter if you have a business whose site only gets about 10k visitors/month. One can just run tests for longer periods of time, but this in and of itself can introduce other variability (seasonality of buying and other factors that affect people’s interest in a product). Thoughts on this? How might one do testing differently?

    1. Thanks for sharing that low traffic post. I’ve been looking for something like this for a while. Just signed up to Inspectlet, it will help me a lot.

  5. Hi Peep, good post as usual.

    There is a book I read a while ago, “100 Secrets of Ad Agency”; it’s a good read, you can find some nice tricks there.
    Keep rocking our world; your conversion posts make me wealthy.

  6. Excellent way to explain a job well done. It’s always good to find great articles that explain the improvement of the conversion rate.


  7. I think it’s a great article. Personally, I am still wondering whether to put the form on my landing pages on the right or the left. P.S. Over 21% conversion rate is an amazing achievement! I can’t get more than 10%, ehh.

  8. 79.3% Opt in Rate is IMPRESSIVE.

    One thing that has really helped me out is keeping it simple. People like simplicity and don’t want to spend all day filling out a form and entering information. Right now one of my landing pages is converting at about 60%, which is still VERY good. There is always something new to learn. Good information once again.

    Matthew J Trujillo

  9. Great post, Peep… when I first started learning about conversion rate optimization I was under the impression that A/B testing meant randomly changing elements on the page to see which guess wins… it’s always good to get a reminder that creating a hypothesis and learning about your audience is the main reason for testing, not just the wins… that point is illustrated very nicely here.

  10. Hi Peep,

    Helpful case study, thanks for making it so detailed.

    Question about Test #6 though: it’s not clear what exactly the winning variation (#3) consisted of. Would you mind describing it in more detail? It sounds like it was multi-page? Even better, could you share screenshots of each of the different pages?

    BTW, the screenshot for Test #6 is small so it was really hard to see the details of the different variations.

    Thanks in advance!

  11. Hi Peep,

    Great post and great results.
    Q: What was the time frame of the above cycle? I mean, how long did it take from planning the 1st test till you got the results of the 6th one?


  12. Hey Peep,

    Great post! My company is currently working on improving the landing page conversion rate for my B2C website, and I had a question for you.

    What benchmark data do you use for landing page conversion rates? I am trying to gather as much information as possible, so I am curious as to what you would consider a bad / good / great conversion rate.

