How Metrics Help Us Measure Help Center Effectiveness

The term 'data-driven' is thrown into just about every best-practice process one hears about these days. Thankfully it's more than just meaningless buzz – modern data collection and analysis can help us truly transform what we do. So what does this mean when applied to a self-service portal such as a help center website? How does one decide which self-service analytics to look at, and which conclusions to draw to influence change for the better?

We decided to tackle these questions for our documentation website, help.k15t.com. Read on to learn what we did and why.

Ask the Right Questions

Before diving into specific website metrics like page traffic and on-page actions, we need to have a clear picture of what it is specifically that we want to know. This will give our data context and provide a rational basis with which to interpret it.

Of course no one is to blame for wanting a quick and easy answer to "Which Google Analytics metrics should I monitor to make my help website better?" But unless you're okay with data that doesn't actually help you reach your goals, it's just not quite that easy. What we're looking for is something actionable – a tangible way we can evolve and improve.

During my research into the ways we at K15t could best analyze our own self-service help center website, I read plenty of suggestions as to which individual metrics to measure – page views, average time on page, bounce rate, etc. But simply knowing which things to measure won't necessarily help you decide how to act. You might end up spending time on reporting numbers instead of better serving your customers and growing your business.

What you really need to do before collecting data is to ask the right questions.


We asked ourselves this: "What do we want to know about our help center, and why?"

Imagine you can talk to your help site users in person. What would you ask them? The resulting questions will lead you toward the data you should collect. Then, ask yourself why those questions are important. That will give the answers you uncover context and meaning. This latter part is what allows you to interpret your data and end up with actionable improvements.

If you don't define these initial questions before diving in, it's quite easy to drown in the sea of data that's recorded in your Google Analytics account without ever reaching the stage of interpreting any of it. You won't be able to see the forest for the trees.

Instead of asking "what's the average time someone spends on my help pages?" ask "are people reading the information we provide?"

We set out to define these questions for our own help center, in order to gain insight into which metrics we needed to monitor to help make the site better. So rather than continuing on a theoretical level, I'll share the actual K15t help center analysis. Hopefully, this will give you ideas on how you can approach your own help pages.

Here's what we decided to ask:

  • Is our self-service portal easy to navigate?
  • Are visitors finding the information they are looking for?
  • Are visitors actually solving their problems?
  • Which problems are readers trying to solve?

Read on to learn why we chose these four questions to influence how we can evolve and improve our help center, and to find out what we measured to get us there.

Is Our Self Service Portal Easy to Navigate?

Our motivation for asking this question is a desire to provide a great user experience. If people can quickly find the answer to their query in our self-service help site, they'll be less frustrated and more satisfied with their experience of our brand as a whole. What's more, ease of site use can reduce the number of support desk tickets created due to user confusion, which helps our teams run more efficiently.

Here's what we measure to help indicate whether our help center is easy to navigate:

  • Page views in a session, combined with page duration
    If site sessions include a low number of page views combined with long page durations, we take this as an indicator that readers are quickly finding what they are looking for. These people have not needed to view many pages before landing on one where they spend time consuming all of the content. Help site users are generally motivated to find a solution and are willing to put some effort into their search (i.e. multiple page views). Conversely, a high number of page views per session combined with short page durations indicates that users are having trouble finding what they are looking for. A rough classification sketch follows below.
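
To make this concrete, here's a minimal sketch (in TypeScript) of how sessions could be bucketed along these two dimensions. The data shape, thresholds, and function names are illustrative assumptions, not part of our actual setup – in practice the numbers come from Google Analytics reports or exports.

    // Hypothetical shape of session data exported from an analytics tool.
    interface SessionStats {
      sessionId: string;
      pageViews: number;          // pages viewed in the session
      avgPageDurationSec: number; // average time spent per page, in seconds
    }

    // Thresholds are placeholders – tune them to your own content and audience.
    const FEW_PAGE_VIEWS = 3;
    const LONG_READ_SECONDS = 60;

    type NavigationSignal = 'likely found it quickly' | 'likely struggling to find it' | 'unclear';

    function classifyNavigation(s: SessionStats): NavigationSignal {
      if (s.pageViews <= FEW_PAGE_VIEWS && s.avgPageDurationSec >= LONG_READ_SECONDS) {
        return 'likely found it quickly';      // few pages, long reads
      }
      if (s.pageViews > FEW_PAGE_VIEWS && s.avgPageDurationSec < LONG_READ_SECONDS) {
        return 'likely struggling to find it'; // many pages, short visits
      }
      return 'unclear';
    }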


Are Visitors Finding the Information They Are Looking For?

Whether your users are finding the answer to their problem is one of the most important questions you can ask about your help pages. A positive answer lets you know that the breadth and depth of your documentation topics and articles are in good shape. A negative answer doesn't have to be a disaster. It lets you know that you need to take action – something you wouldn't have known if you hadn't asked. But which metrics address this question?

We decided that a couple of different metrics could give us an answer:

  • Pages visited per session
    A low number of page visits per session suggests that people quickly found the content they were looking for, which, as mentioned above, can also indicate ease of site use. Remember that people using your help pages are often looking to solve a very specific problem. When you see that they're visiting a lot of different pages, it probably isn't because they're consuming the content for fun – they're likely just not finding the answers they need.

  • Scroll tracking, and page duration
    To further home in on whether users can find the info they need, it makes sense to aim for high scrolling rates. By tracking how far users scroll down a page, you'll be able to tell whether they are actually reading your content. But what if they scroll because they are searching for something instead of actually reading? To distinguish the two, we also look at the page duration numbers, which, if high enough, tell us visitors are actually reading as they scroll. (A sketch of how scroll tracking can be wired up follows the summary below.)

In summary, a low number of page visits per session and high scrolling rates combined with long page durations indicate that people are finding the information they are looking for. If we see high page visits per session and high scrolling rates combined with short page durations, we'll know we're missing important content.
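
Scroll depth generally has to be sent to Google Analytics as an event (GA4's built-in enhanced measurement, for example, only records the 90% mark). Below is a rough sketch of what that could look like with gtag.js, assuming the standard Google Analytics snippet is already on the page; the event name 'scroll_depth' and its parameters are our own choice here, not built-in fields, and the same thing can be configured in Google Tag Manager without code.

    // Minimal custom scroll-depth tracking with gtag.js. Assumes the standard
    // Google Analytics gtag snippet is already installed on the page.
    declare function gtag(...args: unknown[]): void;

    const thresholds = [25, 50, 75, 100];
    const reported = new Set<number>();

    window.addEventListener(
      'scroll',
      () => {
        const scrollable = document.documentElement.scrollHeight - window.innerHeight;
        if (scrollable <= 0) return; // page shorter than the viewport
        const percent = Math.round((window.scrollY / scrollable) * 100);
        for (const t of thresholds) {
          if (percent >= t && !reported.has(t)) {
            reported.add(t); // report each threshold only once per page view
            gtag('event', 'scroll_depth', {
              percent_scrolled: t,
              page_path: window.location.pathname,
            });
          }
        }
      },
      { passive: true }
    );

Combined with page duration in your reports, events like these help separate readers from searchers.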

Are Visitors Actually Solving Their Problems?

This question gets to the heart of all online help pages. It's a bit tricky to measure directly, and our methods here require some inference. But the whole point of a help center website is, after all, to help people solve their problems. Even if they're finding what they're looking for quickly, that still does not necessarily mean the content leads them to a real solution. Here is what we measure to gauge the quality of our help content.

  • Self Service Score
    Comparing the number of unique visitors to our help center website with the overall number of support tickets submitted in our Zendesk support portal gives us a ratio called the Self Service Score (a term coined by Zendesk). We break this ratio down by product to help us gauge how well the content on our self-service site is serving the needs of our users. When this number goes up, meaning more people help themselves for every ticket that still gets submitted, we can infer that the help content is serving user needs well. And when the opposite is true, we can identify where we need to improve. (A small calculation sketch appears at the end of this section.)

  • Page interaction via comments
    This type of feedback is about as direct as it gets, and that's what we love about it. At the bottom of every content page in our help center we have an integrated Disqus comment widget which allows readers to ask questions or leave feedback directly within the context of the articles themselves. This gives us an instantaneous way to know when something is missing or unclear, and gives an unfiltered indication of what needs to be changed to help a user solve a problem.

  • Did you find the information useful?
    Something we're moving toward is integrating a 'like or dislike' widget on help pages to get a higher volume of direct feedback. The question we'll ask is simple: did you find the information on this page useful? Once we have this, we'll be able to identify the worst-performing pages and put resources directly into improving them. Unfortunately, likes aren't a default metric you can access through Google Analytics. Capturing them requires a third-party service or a custom script in Google Tag Manager, which is on the roadmap for our help center site – a rough sketch of the kind of event such a widget could send follows this list.
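
Since that widget doesn't exist yet, the following is only a sketch of what it might send, assuming gtag.js is installed. The event name 'page_feedback', its parameters, and the button ids are hypothetical placeholders.

    // Hypothetical feedback event for a 'Did you find this useful?' widget.
    declare function gtag(...args: unknown[]): void;

    function reportPageFeedback(useful: boolean): void {
      gtag('event', 'page_feedback', {
        page_path: window.location.pathname,
        useful: useful ? 'yes' : 'no',
      });
    }

    // Wire the event to the widget's buttons (ids are placeholders).
    document.getElementById('feedback-yes')?.addEventListener('click', () => reportPageFeedback(true));
    document.getElementById('feedback-no')?.addEventListener('click', () => reportPageFeedback(false));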

Keep in mind that the number of support tickets submitted is influenced by how prominently you feature the ticket submission button. Handle this decision carefully – you want to encourage your users to use your self-service help pages, but not frustrate them by hiding the ability to submit a support ticket.
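
And here is the promised sketch of the Self Service Score calculation, broken down by product. The data shape and the numbers are made up for illustration; in practice the visitor counts come from Google Analytics and the ticket counts from Zendesk for the same reporting period.

    // Self Service Score = unique help center visitors / support tickets,
    // per product and per reporting period. Higher means more self-service.
    interface ProductUsage {
      product: string;
      uniqueHelpCenterVisitors: number; // from web analytics
      supportTickets: number;           // from the support desk
    }

    function selfServiceScore(p: ProductUsage): number {
      // If no tickets were submitted at all, every visitor self-served.
      return p.supportTickets === 0 ? Infinity : p.uniqueHelpCenterVisitors / p.supportTickets;
    }

    // Illustrative numbers only.
    const byProduct: ProductUsage[] = [
      { product: 'Product A', uniqueHelpCenterVisitors: 4200, supportTickets: 120 },
      { product: 'Product B', uniqueHelpCenterVisitors: 1800, supportTickets: 90 },
    ];

    for (const p of byProduct) {
      console.log(`${p.product}: Self Service Score ≈ ${selfServiceScore(p).toFixed(1)}`);
    }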

Which Problems Are Readers Trying to Solve?

Answering this question will let us identify improvement opportunities in the self-service portal, or more importantly, within a product itself. 

  • High page views
    Looking at the pages that have the highest number of page views – while excluding navigational pages like our help homepage – will give us a sense of the problems people are having most. Then we can concentrate on making these pages really great, so that as many of our users as possible will be satisfied. Through this we can also derive ideas on how to improve the product itself. If one of the most visited help pages talks about setting up a specific function within a product, then this tells us we should have a look at ways to improve the in-product user experience for this function. The caveat here is that pages with high page views don't give you insight into which problems you haven't yet addressed through your content.

  • Internal site search terms
    We provide our users with a search bar on our help center site. If you do too, then you can use Google Analytics' internal site search function to see which keywords are searched for the most. This gives direct insight into the problems your visitors have come to solve – valuable info once again for both the documentation team and the user experience side of the product team.
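
If your help center search doesn't put the query into the URL (which is what Google Analytics' site search reporting normally relies on), the search term can also be sent explicitly. The sketch below uses GA4's standard 'view_search_results' event with its 'search_term' parameter; the form selector is a hypothetical placeholder.

    // Send GA4's 'view_search_results' event when the help center search is used.
    // Assumes gtag.js is installed; '#help-search-form' is a placeholder selector.
    declare function gtag(...args: unknown[]): void;

    const searchForm = document.querySelector<HTMLFormElement>('#help-search-form');
    if (searchForm) {
      searchForm.addEventListener('submit', () => {
        const input = searchForm.querySelector<HTMLInputElement>('input[type="search"]');
        const term = input?.value.trim();
        if (term) {
          gtag('event', 'view_search_results', { search_term: term });
        }
      });
    }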

Interpretation

Interpretation is what gives data meaning. Remember that data by itself doesn't really indicate anything; it's just a bunch of numbers. What makes the difference is how you use those numbers to draw inferences and make changes. If our interpretation of the data we collect is flawed, then the actions we decide to take will be just as flawed.

By thinking first about what you want to know about your users, and then considering why you want to know these things, you've already taken steps to make your interpretation more accurate. But you might still lead yourself down the wrong path. This comes down to the need for real-world validation. The only way to find out whether your interpretation is correct is through trial and error: taking action and seeing what changes it brings.

Look at your data as specifically as possible. Avoid global statistics – go product by product, or even function by function, if you can. This will give you much more accurate information about how well your documentation serves your users. Only use aggregated data for non-specific hubs such as top-level landing pages (e.g. a help center homepage or other general navigation pages).
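
One way to do that slicing, assuming your help page URLs start with a product slug (a common convention, though this and the data shape below are illustrative assumptions), is to group page-level stats by the first path segment:

    // Group page-level stats by product, using the first URL path segment as
    // the product slug. Shape and slug convention are illustrative assumptions.
    interface PageStats {
      pagePath: string; // e.g. '/product-a/setting-up'
      pageViews: number;
      avgTimeOnPageSec: number;
    }

    function groupByProduct(pages: PageStats[]): Map<string, PageStats[]> {
      const groups = new Map<string, PageStats[]>();
      for (const page of pages) {
        const product = page.pagePath.split('/').filter(Boolean)[0] ?? 'general';
        const bucket = groups.get(product) ?? [];
        bucket.push(page);
        groups.set(product, bucket);
      }
      return groups;
    }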

Try, and Try Again

There are many different metrics you can measure to help determine whether your help pages are doing their job to enable your users to solve their problems. But analyzing and interpreting aren't worth much if you don't take action. Only by changing things will you see improvement or know if you interpreted the data correctly.

So don't be afraid to make changes; it's the only way analyzing data will help you improve performance and contribute to reaching your company's goals. Measuring, interpreting, making changes, and measuring again is our mantra – we hope this process proves helpful for you too.
