Content audit: Taking stock of our learning resources

8 min read · David Renwick

Summary: In this post, David goes through the process of running an audit of Optimal Workshop’s content – and why you should probably think about doing your own.

When was the last time you ran a website content audit? If the answer’s either ‘I don’t know’ or ‘Never’, then it’s probably high time you did one. There are few activities that can give you the same level of insight into how your content is performing as a deep dive into every blog post, case study and video on your website.

What is a content audit?

At a very high level, a website content audit is a qualitative analysis of all blogs, landing pages, support articles and guides on your website. It’s essentially stock-taking – an inventory of everything you’ve published. In real terms, a content audit will often be a spreadsheet with fields for things like content type, URL, title, view count and category – the fields differ depending on your own needs and the types of content you’re auditing.

Why conduct a content audit?

There’s really no better way to understand how all of your content is performing than a comprehensive audit. You’re able to see which articles and pages are driving the most traffic and which ones aren’t really contributing anything.

You can also see if there are any major gaps in your content strategy to date. For example, is there a particular area of your business that you’re not supporting with guides or blog articles?

A holistic understanding of your website’s content allows you to create more effective content strategies and better serve your audience.

Auditing Optimal Workshop

Content had grown organically at Optimal Workshop. In the 10 years since we started, countless people have had a hand in creating blog articles, landing pages, videos and other types of content – much of it created without a clear content strategy. That’s often fine for a small startup, but not a sustainable approach for a rapidly growing business.

When I started to scope the task of auditing everything content-related, I first took note of where all of our content currently sat. The ‘learn hub’ section of our website was a fairly convoluted landing page pointing off to different sub-landing pages, while the blog was simply a reverse-chronological display of every post, with far too many categories. There was clearly room for significant improvement, but taking stock of everything was a critical first step.

A screenshot of a section of Optimal Workshop’s learn hub webpage, showing the ‘Videos’, ‘eBooks’ and ‘Case studies’ section links.
The learn hub pre-overhaul

With a rough idea of where all of our content was located – including the many live pages that weren’t accessible through the sitemap – I could begin the process of collating everything. I’d decided on a spreadsheet as it allowed me to achieve quite a high information density and arrange the data in a few different ways.

I came up with several fields based on the type of content I was auditing. For the blog, I wanted:

  • Article title
  • Categories/tags
  • Author
  • View count
  • Average time on page
  • Average bounce rate

At an individual level, these categories gave me a good idea as to whether or not a piece of content was performing well. When looking at all of the blog posts in my finished audit, I could also quickly identify any factors that the best-performing pieces of content had in common.
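To make ‘performing well’ concrete, here’s a minimal sketch of how fields like these could feed a simple threshold check. The field names and thresholds are hypothetical – illustrative stand-ins, not the actual values behind our audit.

```python
# Hypothetical thresholds for flagging a post's performance; the keys
# mirror the audit columns above, and the numbers are invented.
def performing_well(post, min_views=500, min_time_s=120, max_bounce=0.75):
    """Return True if a post clears some simple engagement thresholds."""
    return (post["view_count"] >= min_views
            and post["avg_time_on_page_s"] >= min_time_s
            and post["avg_bounce_rate"] <= max_bounce)

# An example row from the audit spreadsheet (figures invented).
post = {
    "title": "A guide to conducting a heuristic evaluation",
    "author": "David Renwick",
    "view_count": 2400,
    "avg_time_on_page_s": 260,
    "avg_bounce_rate": 0.55,
}
```

Run over a full spreadsheet export, a check like this makes it easy to sort the finished audit into strong and weak performers.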

One of the most interesting, although not entirely surprising, learnings from this audit was that our more practical and pragmatic content (regardless of channel) always performed better than the lighter or fluffier content we occasionally produced. The headline was almost certainly the deciding factor here. For example, articles like ‘A guide to conducting a heuristic evaluation’ and ‘How to create use cases’ attracted view counts and read times well above articles like ‘From A to UX’ and ‘Researching the researchers and designing for designers’. Interestingly, content written to support the use of our tools also often attracted high view counts and read times.

Intuitively, this makes sense. We’re a software company writing for a community of researchers, designers and professionals, many of whom will have come to our blog as a result of some interaction with our tools. It makes sense they’d see more value in content that can help them accomplish a specific task – even better if it supports their use of the tools.

A section of the content audit in Google Sheets. There are 6 columns for title, view count, author, tags, average time on page and bounce rate.
A snippet of the blog content audit

Auditing the learn hub

Following my audit of the blog, I moved on to the other areas of the learn hub. I created an entirely new spreadsheet containing everything that wasn’t a blog post, with a set of different fields:

  • Page name
  • Content type (landing page, case study, video or guide)
  • Description
  • Owner (which product/marketing team)
  • Page views
  • Average time on page
  • Bounce rate

I knew before even starting the audit that our series of 101 guides received a significant share of our learn hub page traffic, but I wasn’t quite prepared for just how much they attracted. Each guide received far and away more traffic than the other learning resources. It’s results like these that really highlight the value of frequent content audits. Few other exercises can provide such informative insights into content strategy.

At some point in the past, we’d also run a short video series called ‘UX Picnic’, where we’d asked different guest user researchers to share interesting stories. Similarly, we had two case studies live on the website, with a third one delisted but still available (as long as you knew the URL!). We hadn’t seen spectacular traffic with any of these pieces of content, and all were good candidates for further investigation. Seeing as we had big plans for future case studies, analyzing what worked and what didn’t with these earlier versions would prove a useful exercise.

A screenshot of the UX Picnic webpage on the Optimal Workshop website. There are 6 links to different videos, with pictures of the people hosting the videos.
We’ve had lots of guests chat with us about everything UX

A product demo page, an information architecture guide page and a ‘how to pick the right tool’ page made up the final pieces of our audit puzzle, and I popped these on a third ‘other pages’ spreadsheet. Interestingly, both the information architecture guide page and the ‘how to pick the right tool’ page had received decent traffic.

Identifying gaps in our content ‘tree’

An important function of a content audit is to identify ways to improve the content strategy moving forward. As I made my way through the blog articles, guides and case studies, I found that while we’d seen great results with a number of different topics, we’d often move on to another topic instead of producing follow-up content.

Keyword research revealed other content gaps – basically areas where there was an opportunity for us to produce more relevant content for our audience.

Categorizing our content audit

Once I’d finished the initial content pull from the website, we (the Community Education team) realized that we wanted to add another layer of categorization.

With a focus specifically on the blog (due to the sheer quantity of content), we came up with another tagging system that could help us when it came time to move to a new blogging platform. I went back through the spreadsheet containing every blog post, and tagged posts with the following system:

  • Green: Valuable – The post could be moved across with no changes.
  • Red: Delete – The post contains information that’s wildly out of date or doesn’t fit in with our current tone and style.
  • Yellow: Outdated – The post is outdated, but worth updating and moving across. It needs significant work.
  • Purple: Unfinished series – The post is part of an unfinished series of blog posts.
  • Orange: Minor change – The post is worth moving across and only needs a minor change.
  • Blue: Feature article – The article is about a feature or product release.

This system meant we had a much better idea of how we’d approach moving our blog content to a new platform. Specifically, what we could bring across and the content we’d need to improve.

The document that keeps on giving

Auditing everything ‘content’ at Optimal Workshop proved to be a pretty useful exercise, allowing me to see what content was performing well (and why) and the major gaps in our content strategy. It also set us up for the next stage of our blog project (more coming soon), which was to look at how we’d recategorize and re-tag content to make it easier to find.

How to do a content audit

If you’ve jumped straight down here without reading the introduction at the top of the page, this section outlines how to run your own content audit. To recap, a content audit is a qualitative assessment of your website’s content. Running one will enable you to better understand the pros and cons of your current content strategy and help you map out your future content strategy.

To do a content audit, it’s best to start with a clear list of categories or metrics. Commonly, these are things like:

  • Page visits
  • Average time on page
  • Social shares
  • Publication date
  • Word count

The sky’s the limit here. Just note that the more categories you add, the more time you’ll have to spend gathering data for each piece of content. With your categories defined, open a new spreadsheet and begin the process of auditing each and every piece of content. Once you’ve finished your audit, socialize your insights with your team and any other relevant individuals.
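If a spreadsheet app isn’t your thing, the skeleton can be as simple as a CSV file with one row per piece of content. Here’s a minimal sketch using the example metrics above – the page data is invented for illustration.

```python
import csv

# Columns based on the example metrics listed above, plus title and URL.
FIELDS = ["title", "url", "page_visits", "avg_time_on_page",
          "social_shares", "publication_date", "word_count"]

# Invented example row – replace with your own analytics export.
pages = [
    {"title": "Example guide", "url": "/guides/example",
     "page_visits": 1200, "avg_time_on_page": "03:10",
     "social_shares": 14, "publication_date": "2019-04-02",
     "word_count": 1800},
]

# Write the audit skeleton; open it in any spreadsheet tool to keep working.
with open("content_audit.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(pages)
```

From here, sorting and filtering by any column makes it easy to spot the strong and weak performers once the data-gathering is done.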

Then, you can move on to actually putting your content audit into practice. Look for gaps in your content strategy – are there any clear areas that you haven’t written about yet? Are there any topics that could be revisited? Ideally, a content audit should be kept updated and used whenever the topic of “content strategy” comes up.