What Are Heat Maps Good For (Besides Looking Cool)?


Mouse tracking heat maps are a popular conversion optimization tool, but what good are they really?

It’s easy to say that they help you to see what users are doing on your site. Sure, of course – but lots of other methods do that too, and perhaps with greater accuracy.

So how are heat maps useful in the pursuit of higher conversion rates?

What is a Heat Map?

Heat maps are visual representations of data. They were developed by Cormac Kinney in the mid-1990s as a tool to help traders beat financial markets. Essentially, heat map tools record what people do with their mouse or trackpad, quantify it, and then display it in a visually appealing way.

Heat maps is a broader category that can include:

  • Hover maps (mouse movement tracking)
  • Click maps
  • Scroll maps

For any of the above heat map types, you need a sufficient sample size per page/screen before you can draw accurate inferences from the data and act on results. A good rule of thumb is 2,000-3,000 pageviews per design screen, per device (i.e. look at mobile and desktop separately). If the heat map is based on something like 50 users, don't trust any of it.
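To make that rule of thumb concrete, here's a minimal sketch (in Python, with hypothetical numbers) of the 95% margin of error around an observed click rate at different sample sizes:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an observed proportion p at sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# An element clicked by 10% of visitors:
for n in (50, 500, 2500):
    print(f"n={n:>5}: 10% +/- {margin_of_error(0.10, n):.1%}")
```

At 50 users the estimate is 10% give or take 8.3%, which is close to useless; at 2,500 it tightens to roughly 1.2%.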

Since there are a few different types of “heat maps,” let’s go over each and decide what value it offers.

Hover Maps (Mouse Movement Tracking)

When people say "heat map," they often mean a hover map, which shows the areas where people have hovered with their mouse cursor. The idea is that people look where they hover, so a hover map shows how users read a web page.


Hover maps are modeled on a classic usability testing technique: eye tracking. While eye tracking is useful for understanding how a user navigates a site, mouse tracking tends to fall short because of some stretched inferences.

The accuracy of mouse cursor tracking is always questionable. People may be looking at things they never hover over, and they may hover over things that get very little attention; either way, the heat map would be inaccurate. Maybe it's accurate, maybe it's not. How do you know? You don't.

In 2010, Dr Anne Aula, Senior User Experience Researcher at Google, gave a presentation sharing some disappointing findings about mouse tracking:

  • Only 6% of people showed some vertical correlation between mouse movement and eye tracking
  • 19% of people showed some horizontal correlation between mouse movement and eye tracking
  • 10% hovered over a link and then continued to read around the page looking at other things.

We typically ignore these types of heat maps. Even if you do look at one to see whether it supports your beliefs or suspicions, don't put too much stock in it. Guy Redwood at Simple Usability has a similar belief about mouse tracking:

“We’ve been running eye tracking studies for over 5 years now and can honestly say, from a user experience research perspective, there is no useful correlation between eye movements and mouse movements – apart from the obvious looking at where you are about to click.

If there was a correlation, we could immediately stop spending money on eye tracking equipment and just use our mouse tracking data from websites and usability sessions.”

That's why Peep calls these maps "a poor man's eye tracking tool."

Because what these maps show often overlaps little with what users are actually doing, it's tough to draw real insights from them. You'll end up telling stories to explain the images rather than uncovering truths. Though this blog post is about soccer heat maps, it puts it well:

“What do heat maps do? They give a vague impression of where a player went during the match. Well, I can get a vague impression of where a player went during a match by watching the game over the top of a newspaper.”

While some studies have indicated higher correlations between actual gaze position and cursor position, you have to ask yourself if the possible insights are worth the risk of misleading data and the increasing possibility of confirmation bias in analysis.

What About Algorithm-Generated Heat Maps?

Similarly, there are heat map tools that use an algorithm to analyze your user interface and generate a visual from it. They take into account a variety of attributes: colors, contrast, visual hierarchy, size, etc. Are they trustworthy? Maybe. Here's how an article on Aura.org put it:

“Visual Attention algorithms, where computer software ‘calculates’ the visibility of the different elements within the image, are often sold as a cheaper alternative. But the same study by PRS, showed that the algorithms are not sensitive enough to detect differences between designs, and are particularly poor at predicting the visibility levels of on-pack claims and messaging.”

(Quick note: PRS, the author of the study cited above, sells eye tracking research services.)

While you shouldn't place full trust in algorithmically generated maps, they're no less trustworthy than hover maps.

And especially if you have lower traffic, algorithmic tools can give you some sort of visual data for usability research. They also give you instant results, which is cool.

Keep in mind, just because it’s instant doesn’t mean it’s magic. It’s a picture based on an algorithm and not based on your actual users’ behavior.

Click Maps

Click maps show you a heat map composed of aggregated click data. Blue areas get fewer clicks; warmer reds get more; the brightest white and yellow spots get the most.


This is pretty cool to look at, and to be clear, there's a lot of communicative value in these maps. They help explain across teams the importance of optimization and what is and isn't working. Got a big photo that takes up much of the page and gets lots of clicks, but isn't a link? Maybe make it one.
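Under the hood, a click map is just raw click coordinates aggregated into spatial bins. Here's a minimal sketch in Python with made-up coordinates (real tools also normalize for viewport size and responsive layouts):

```python
from collections import Counter

def click_grid(clicks, cell=50):
    """Bin raw (x, y) click coordinates into cell x cell pixel squares."""
    grid = Counter()
    for x, y in clicks:
        grid[(x // cell, y // cell)] += 1
    return grid

# Five recorded clicks: two near the top-left, three on one element
clicks = [(12, 40), (18, 44), (410, 90), (415, 88), (420, 95)]
grid = click_grid(clicks)
hottest, count = max(grid.items(), key=lambda kv: kv[1])
# hottest == (8, 1), count == 3: the "warm" spot around x = 410-420
```

The rendered map is then just these bin counts mapped onto a blue-to-yellow color scale and overlaid on a screenshot of the page.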

Thing is, you can also see where people click with Google Analytics, which is generally preferable. If you have enhanced link attribution turned on and set up, the Google Analytics overlay is great (though some people prefer to see the data as a click map visual).

And if you go to Behavior → Site Content → All Pages and click a URL, you can open up the Navigation Summary for it: where people came from, and where they went after. Highly useful stuff.

With click maps, as I mentioned before, there’s one useful bit – you can see when people click on things that aren’t links.

If you discover something (an image, sentence, or whatever) that people want to click on, but isn’t a link, then either:

  1. Make it into a link
  2. Don’t make it look like a link.

It's also easy to quickly take in aggregate click data and see broad trends. Be careful, though, not to succumb to convenient storytelling here.

Attention Maps

An attention map is a heat map that shows which areas of the page are viewed the most within users' browsers, taking into account horizontal and vertical scrolling activity and how long users spend on the page.

Peep considers this to be far more useful than the other mouse movement or click heatmaps. Why? Because you can see if key pieces of information – both text and visuals – are in the area that is visible to almost all users. This makes it easier to design pages with the user in mind. Here’s how Peep put it:

Peep Laja:

“What makes this useful is that it takes into account different screen sizes and resolutions, and shows which part of the page has been viewed the most within the user’s browser. Understanding attention can help you assess the effectiveness of the page design, especially the above-the-fold area.”

Scroll Maps

Scroll maps are heat maps that show you how far people scroll down on a page. They’re interesting in that they show you where users tend to drop off, and can be very useful.


While we can use scroll maps on pages of any length, they're especially pertinent when designing long-form sales pages and longer landing pages.

Generally, the longer the page, the fewer people make it all the way down. This is normal, and it helps you prioritize content: what's a must-have and what's just nice-to-have? Put what you want people to pay attention to higher up.
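That prioritization decision is easier if you look at the underlying reach curve: the share of sessions that scrolled at least to each depth. A minimal sketch in Python, with made-up depths:

```python
def reach_curve(max_depths, step=10):
    """Share of sessions that scrolled at least to each depth (% of page height)."""
    n = len(max_depths)
    return {d: sum(1 for m in max_depths if m >= d) / n
            for d in range(0, 101, step)}

# Max scroll depth per recorded session, as % of page height (hypothetical):
depths = [100, 95, 80, 75, 60, 55, 40, 35, 30, 20]
curve = reach_curve(depths)
# curve[0] == 1.0, curve[50] == 0.6, curve[100] == 0.1
```

A sharp drop between consecutive depths marks a point where content stops pulling readers along; a scroll map shows the same thing as an abrupt color change.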

Scroll maps can also help you tweak your design. A sharp color change marks a point where many visitors stop scrolling, often because they assume whatever follows is no longer connected to what came before (a "logical end"). These drop-off points are hard to see with Google Analytics alone.

On longer landing pages, this might mean adding navigation cues and better visual cues where the scrolling activity stops.

User Session Replays

This isn't really a "heat map" per se, but it's the most valuable feature in most tools that offer heat maps.

User session replays let you record video sessions of people going through your site. It's kind of like user testing, but with no script and no audio. The people in the recordings are risking their actual money, though, so replays can be even more insightful.

This is more qualitative data: you're trying to detect bottlenecks and usability issues. Where are people unable to complete actions? Where do they give up?

One of the best use cases for session replays is watching how people fill out forms. Though you could configure event tracking in Google Analytics, it wouldn't provide the same level of insight as session replays. Likewise, if a page is performing badly and you don't know why, you can watch session replays to identify possible problems. You can also see how fast people read, how they scroll down the page, and so on.

Analyzing replays is, of course, time-consuming. We typically spend half a day watching videos for a new client site.

What’s Wrong With Heat Maps

Heat maps can be problematic for the same reason as in that old metaphor about the drunkard and the lamp post: people use them for support instead of illumination.

In other words, even setting aside the data inaccuracies discussed above, certain maps open you up to a world of potential bias, especially if they're a primary piece of your conversion research. Andrew Anderson, Head of Optimization at Malwarebytes, put it very well:

Andrew Anderson:

“Nothing shows a lack of understanding of rate and value more than people getting overly caught up with where people click.

Is more or less people clicking on something good or bad? Is the most clicked thing the most important? The most influential? What will happen if twice as many people click on this one thing? Does something have to be clicked on a lot to have influence? Does it have to be clicked on at all? Heat maps in the end provide a thousand more questions without the ability to answer a single one in a meaningful way.

What we know is that most people will use their bias to determine the value of items and use that to filter all the incoming information. They will confuse the most active for the most valuable. They will default to a linear rate model, which is the least representative type of model. They will try to get more people to a step or an item on the page without any real insight into the relative value or efficiency of that change. Even worse they will use a heat map or any click based metric as a way to continue their story telling and to continue to confuse what they hope will happen with what is the best for the site or page.”

Heat maps can be helpful at a high level and as a way to communicate problem areas to people less analytically savvy in the organization.

They can also be a good starting point for conversion research and analysis.

But almost all of the insight they bring can be gleaned from different analytics tools – and Google Analytics tends to offer less wiggle room for interpretation, storytelling and bias.

In other words, heat maps are great tools in the optimizer’s arsenal, but should not be the end-all be-all for determining project and test planning.


Conclusion

Heat maps are pretty cool looking. Not only that, they offer substantial value (if used right):

  • Algorithmic heat maps can give low-traffic sites an idea of how people use their site.
  • Click maps can give high level visuals on where people are clicking and where they aren’t.
  • Attention maps help you see which parts of a page are most visible to users across browsers and devices. They help you decide where to put your value proposition and other important elements.
  • Scroll maps can help you design longer landing pages and keep people moving down the page (helps with prioritizing content as well).
  • User session replays are irreplaceable tools in your arsenal.

But you should never rely solely on heat maps for conversion research. The results can be limiting at best and misleading at worst, compounding bias and producing illusory insights.

Comments

  1. Great article. Glad that someone has finally come out and said it.

    Perfect example: we were using heat maps (and scroll maps) to see how far people scrolled on our webpage and where they paid attention. We focused on prioritizing content to where people scrolled and hovered over, and also increasing our scroll depth. Then we realized that the people that convert only read the first section. So we actually should have been preventing scroll (or deleting extraneous content).

  2. To Ryan’s comment: many enterprise users of customer experience solutions compare segmented heatmaps that show how converters vs. non-converters behaved differently on the site.

    Sometimes it jumps out from that comparison that those who scrolled deep and immersed themselves in the technical details of a product page end up not converting. There are many possible explanations. For example, as retailers know some “nice to have” products are more likely to be purchased when buyers remain in an emotional decision making mode as opposed to engaging in a rational buying process. Too much info switches buyers from emotional to rational buying.

    But as you point out that is just one possible explanation. Since there is no way to know for sure, the best way to leverage the insight is to run a test. For example, formulate a hypothesis: What if the product pages don’t show technical details upon scroll down but put them behind tabs so the details are less in your face and keep buyers in an emotional buying mode.

    Let the test data decide.

    For companies selling discretionary products this and similar adjustments have led to 3x increases in Sales as we’ve seen in our work with customers at ClickTale.

    In general, I want to disagree with some of the tone in this article. Just because you cannot deduce causation from correlation and cannot be 100% sure that the eye is where the mouse is doesn’t mean that the data aren’t useful. You just have to know how to use the data.

    This reminds me about the debate ten years ago about whether web analytics data is accurate. As you know, the conclusion for web analytics has been: It’s not and get over it!

    But that doesn’t mean that the analytics aren’t tremendously useful for increasing revenues and customer experiences.

  3. yawn

    Another catchy headline with Peep’s hubristic style.

    Truth is any data is “crap” if not analyzed from every perspective. And sample size does matter. 3k, really? If you really want to critique bad assumptions based on statistical inference, 3k is hardly reliable. In a perfect world, 100k samples at least.

    Correct, the perfect world is rarely possible with the constraints of most small and medium sized businesses (traffic)… but if you’re going to make a statement about “crap”, how about some solid math behind it. For those unfamiliar with statistical mathematics, the rate of error (standard deviation) is near zero with large data samples. 3k on that scale is far too small to determine any hypothesis/conclusion with any certainty.

    1. Peep Laja

      Thanks John.

      While the article is not written by me, I appreciate your feedback. I don’t see anything being called “crap” in the article, but I do agree that in addition to offering a ballpark number (as this is what people remember far better than sample size calculations), it should also state a proper statistical way of determining it. Magic numbers for sample sizes don’t exist.

  4. Interesting piece, thanks for sharing.

    Full disclosure: I work for Decibel Insight, an analytics tool that includes a suite of heatmaps.

    In the past I’ve seen some research touted around by other heatmap providers that suggests a correlation between cursor position and eye gaze. This seems tenuous to me personally, from my own anecdotal experience, and we tend to say in our own demos that focus heatmaps provide a proxy of eye gaze at best.

    I think it’s fair to say, though, that there is value in seeing where cursor position commonly falls among your users, because, unlike pure click tracking, it shows content that is “tempting” too. Again, though, this is an inexact science. You need to combine this kind of evidence with the session replays, metric-focused analytics, and VOC too.

    We’ve developed heatmaps that do not track clicks on the page with dots (which is fraught with inexactitude and confirmation bias) but instead attribute values to clickable elements precisely, tracking them across responsive views and removing the need to approximate, based on colour and ‘size of blob’ alone, how many clicks occurred in a given area.

    These heatmaps are called Attribution heatmaps, and as well as tracking clicks they also track the elements that contribute to eventual goal achievement, which is extremely useful for CRO. I’d urge you to check them out. https://www.decibelinsight.com/heatmaps/attribution-heatmaps/

    As Arin says, heatmaps are as useful as the tests they give rise to. They’re one part of the arsenal of tools available to web analysts and optimizers, and I think that when they’re used in a wider mix they can be extremely helpful.

  5. Great article, as always.

    Advice – don’t bother contacting Eye Quant, there is no response after 10 days. The expectations were as high as the prices on their page.

