The What, How, Why Approach to Conversion Rate Optimization

This is a simple way to model your thinking about how to approach a website and breathe life into your analytics numbers. It’s how I approach things, and I don’t think it’s terribly original, but I wanted to clarify my own thinking, so here we are.

Let’s say we’re looking at the homepage of a site.

The What: Analytics and Clickstream Data

The ‘what’ consists of the clickstream data around the homepage: visits, uniques, pages per visit, time on page, time on site for visitors who landed there, bounce rate, conversion rate. There are more but those are the ones we usually look at.
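To make those metrics concrete, here’s a minimal sketch that computes a few of them from made-up session records (any real implementation would pull these from your analytics tool instead):

```python
# Made-up session records: (landing_page, pages_viewed, converted).
# A "bounce" is a session that viewed exactly one page.
sessions = [
    ("/", 1, False),        # bounced on the homepage
    ("/", 4, True),
    ("/", 1, False),        # another homepage bounce
    ("/pricing", 2, False),
    ("/", 3, False),
]

home = [s for s in sessions if s[0] == "/"]
bounce_rate = sum(1 for s in home if s[1] == 1) / len(home)
pages_per_visit = sum(s[1] for s in sessions) / len(sessions)
conversion_rate = sum(1 for s in sessions if s[2]) / len(sessions)

print(f"homepage bounce rate: {bounce_rate:.0%}")      # 2 of 4 -> 50%
print(f"pages per visit:      {pages_per_visit:.1f}")  # 11 / 5 -> 2.2
print(f"conversion rate:      {conversion_rate:.0%}")  # 1 of 5 -> 20%
```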

We have this data for all the pages on our site, but it doesn’t usually come into play until there’s a problem. The problem usually arrives in the form of someone asking “how come our sales aren’t higher?”, so you look and maybe see that the bounce rate for the homepage is really high. Hmm, that’s not good, is it?

You dig deeper and see that 80% of traffic is entering the site through the homepage and that the bounce rate for the homepage is 70%. Ouch. Track down the designer and publicly shame them. Kidding. A better idea would be to go ask the designer for the research they conducted during the design process. If you get a blank stare, then great, you’re going to earn your money.
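To see why that combination hurts so much, just multiply the two numbers:

```python
# Quick arithmetic on the numbers above (illustrative, not from a real site):
entrance_share = 0.80  # 80% of visits enter via the homepage
home_bounce = 0.70     # 70% of those leave without viewing a second page

lost = entrance_share * home_bounce
print(f"{lost:.0%} of ALL visits bounce off the homepage")  # 56%
```

More than half of all traffic never sees a second page, which is why this page is worth the attention.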

OK, back to the what. So the analytics data you have will tell you what is going on. People are entering, seeing the homepage, and then leaving. Side note: if you’re using Google Tag Manager, then I recommend setting up an event timer listener so you know if they’re actually just bouncing right away or if they’re sticking around for a bit.

Now, at this point, you can go back to your client and tell them that their bounce rate is high, which will be of zero value to them since you don’t have a solution to address the problem. Besides, if they paid thousands of dollars for the design, they have a vested interest in believing that was the right decision, so you’ll be fighting an uphill battle psychologically speaking; it’s tough enough to change a homepage as it is. (Or you’ll be fighting the IKEA effect if they built it themselves, which made me laugh when I discovered that there’s a cognitive bias named after IKEA.)

I digress. My point was that you have to provide solutions, not just tell people that things suck.

The How

The How is where we figure out how the homepage is sucking. In what way does it suck, now that we know that it does indeed suck? There are so many ways that a page can suck, maybe infinite ways. Here, we want to make an educated guess about how people are leaving.

Are they scrolling down the page, not finding what they want, and then leaving? Are they leaving immediately, like in less than 3 seconds? Where is their mouse going? Are they clicking on stuff that isn’t a link and then leaving?

These questions are tough to answer, but heatmaps and other tools that allow you to actually see a video of the user’s visit can provide some assistance. And rarely will all visitor behavior fit into one nice clean bucket, so you have to look for patterns and make an educated guess.

The Why

This is where the money is made. Up to this point, we’ve been picking up clues but haven’t really hit on any big insights. The why is the insight-rich territory that can provide you answers and solid clues as to what is going on.

There are two basic approaches (that I can think of) for generating the why: listening and science. Listening is powerful as hell but only science will tell you if you were right.

Listening

By listening I mean listening to people that actually use the site. Just ask them and they’ll tell you the most amazing things like “I can’t find the pricing!” and “THIS SITE SUCKS WTF” and “where’s the schedule” and “tried to sign up but can’t find it” and “where do you ship” and “DO YOU TAKE VISA”.

Now, if you ask me, there’s more gold in those statements than in the analytics, and if I had to choose between the two, I would go with listening. For instance, how people say things can tell you a lot about the visitor’s socioeconomic status and education level.

And if you listen long enough, you’ll start to see clear patterns emerge that will inform your own hypotheses and guide you to the most significant obstacles to conversion on the site.

Listening can be done in many ways at different price points:

  • Do in-person user testing in a “usability lab” (no need for an actual lab, a normal room with a camera will do).
  • Use a service like usertesting.com.
  • Use an on-site survey program.
  • Have your mom use the site and talk about it as she does it.
  • Have the people at your company use the site.
  • Talk to the sales team, if there is one, and ask them what kind of things people say on the phone about the site or about their needs in general (their needs on the phone will be similar to their needs on the site).
  • Look back at the research that was originally done when the site was designed (one can hope).
  • Look at available research about your target market.

Science

By science, I mean the scientific method. Formulate a hypothesis about why the bounce rate is high, then design an experiment to test that hypothesis. If you were able to use any of the methods in the Listening section, then you may already have some great hypotheses.

If not, then you can generate hypotheses on your own.

Look at the homepage. Does it do a good job of establishing the value proposition of the site? Does it clearly communicate what your site is about and why the visitor should be there? These and many other questions can help you generate hypotheses, and I recommend reading Chris Goward’s book You Should Test That for more ideas.
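Once a hypothesis makes it into an A/B test, you eventually have to decide whether the variant actually won or whether the difference is just noise. Here’s a minimal sketch, using made-up conversion counts, of a two-proportion z-test, which is a common way to check the significance of an A/B result:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from A's? Returns (z, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf: P(Z <= x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Made-up numbers: 5,000 visitors per variant.
z, p = two_proportion_z(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05, so B's lift is unlikely to be chance
```

In practice you’d decide the sample size and significance threshold before running the test, not after peeking at the results.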

Wrapping it up

Now you have something to talk to the client about. “We see that your sales are low and we think that it’s because your homepage isn’t engaging enough and visitors are leaving immediately. But don’t worry, because we did some research and we think that x, y, and z are the reasons that visitors are bouncing. We’re going to set up an A/B test to see if that improves the bounce rate. If not, those hypotheses will be ruled out and we can continue testing until we fix the problem.”

 

Robert

 
