Testing FAQs with people who don’t use your site

10 min read · Ashlea McKay

“Questions are never indiscreet, answers sometimes are.”

Oscar Wilde

Frequently asked questions (FAQ) pages. Love them or hate them, I don't think they're going anywhere anytime soon. The debate has been going on for quite some time, with opinions falling equally on both sides of the FAQ fence: Nielsen Norman Group's Susan Farrell says FAQs can still add value to a website when done properly, while Gerry McGovern calls FAQs the dinosaurs of web navigation.

So, how do we really know for sure if they will or won’t add value to a design? Like anything in UX, you have to test it! I don’t know about you, but I’m a shake-it-and-see-what-falls-out kind of UXer, so naturally I decided to run a Treejack study. Scouring the web one fine day, I came across Sainsbury’s Active Kids. Its FAQ page was unlike any I had ever seen and I knew I’d found the one. I was also curious to see how it would test with people who don’t use the website — after all, anyone should be able to use it. Since Active Kids is an active lifestyle program for UK schools and sports clubs, I recruited my participants entirely from the US. Pull up a chair and get comfy because what I found out should serve as a lesson to us all.

Why Active Kids?

First of all, why did I choose this in the first place? The Active Kids FAQ page caught my attention for three main reasons:

  • structure
  • labels
  • content

The structure of this FAQs page is quite deep, complex and very different from the rest of the site — almost like another information architecture (IA) had been built within the main structure. Imagine you have a large warehouse with hundreds of shelves, and then somewhere in the middle of it, someone builds a house — that’s how it felt to me.

There are two ways to get to it: through the “Help” label on the top navigation bar and the “FAQ” label in the footer. The section uses a combination of drop-down filters that the user needs to apply, alongside automatic filter options and confusing labels that can send you down a path you don’t necessarily want to take.

I also found it very interesting that most of the information contained within the FAQs section cannot be located anywhere else on the website and most of this is essential to gaining a fundamental understanding of what Active Kids actually does. Adding to the house in the warehouse analogy, it’s like the house holds all the key information the warehouse needs to function, but no one knows which room it’s kept in.

The top level of the FAQs section

Setting up the study

Treejack was the perfect choice for testing the findability of information on the Active Kids FAQ page and I decided to test the IA of the website as a whole — this means both the warehouse and the house. I couldn’t just test the house in isolation because that’s not how a user would interact with it. The test needed the context of the whole site to gain an understanding of what’s going on. Creating a Treejack study is quick and easy and all you have to do is build the structure out in a basic Excel spreadsheet and then copy and paste it into Treejack.
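To make the spreadsheet step concrete, here is a minimal sketch of how such a tree can be laid out programmatically before pasting it into Treejack. The labels below are a hypothetical slice of the Active Kids structure, not the full tree I tested, and the column-per-level layout is the general convention for tree spreadsheets (each child sits one column to the right of its parent, on the rows below it):

```python
import csv
import io

# Hypothetical slice of the Active Kids tree. Each tuple is
# (depth, label); children appear on the rows after their parent,
# one column further to the right.
tree = [
    (0, "Home"),
    (0, "Schools & Groups"),
    (1, "Register"),
    (0, "Parents & Community"),
    (0, "Help"),
    (1, "FAQs"),
    (2, "Vouchers"),
    (2, "Eligibility"),
]

max_depth = max(depth for depth, _ in tree)
buf = io.StringIO()
writer = csv.writer(buf)
for depth, label in tree:
    # Empty leading cells push the label into the column for its level;
    # trailing empties keep every row the same width.
    writer.writerow([""] * depth + [label] + [""] * (max_depth - depth))

csv_text = buf.getvalue()
print(csv_text)
```

Generating the tree this way (rather than typing it by hand) makes it easy to regenerate the spreadsheet when the structure changes between study iterations.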

My next job was to determine the task-based scenarios that my participants would work through during the study. I chose nine, all derived from content located in the FAQs section and related to tasks a user might carry out when investigating or participating in the program. Once I had my tree and my tasks, all I had to do was set the correct answers based on where the information currently sits on the Active Kids website, and I was ready to launch.

Recruiting participants for the study

In my experience, recruiting participants for a Treejack study is quick and easy. All you have to do is determine the screener criteria for your participants and Optimal Workshop takes care of the rest. For this study I requested 30 participants and they all had to reside in the US. I ended up with 31 completed responses and it was all over in less than two hours.

Treejack results

So, what fell out of that tree when I tested a website aimed at parents and teachers of kids in the UK with 31 Americans? I’ll be honest with you: it wasn’t pretty. Here’s what I discovered in this study:

The overview tab for the Treejack results
  • 81 per cent were unable to find out if home educators were eligible to apply (number 1 on the graph)
  • 65 per cent were unable to find out what a Clubmark accreditation is (number 2 on the graph)
  • 68 per cent were unable to find out how to share their wishlist with friends and family (number 3 on the graph)
  • 64 per cent could not find the information that would explain the purpose of the £1 fee mentioned in the terms and conditions (number 4 on the graph)
  • 97 per cent could not locate the information that would tell them if they could use a voucher from 2014 in 2015 (number 5 on the graph)
  • No participant was able to determine if students from a middle school would be able to participate in Active Kids (number 8 on the graph)
  • 58 per cent of participants in this study were unable to find out what the program is even about (number 9 on the graph)

On the flip side, 68 per cent of participants in this study were able to locate a phone number to contact Active Kids directly (number 6 on the graph) and 97 per cent were successfully able to work out how to redeem vouchers (number 7). Overall, it wasn’t great.

In addition to some very useful quantitative data, Treejack also provides detailed information on the pathways followed by each participant. Understanding the journey they took is just as valuable as discovering how many found their way to the correct destination. This additional level of granularity will show you where and when your user is getting lost in your structure and where they went next. It’s also handy for spotting patterns (e.g., multiple participants navigating to the same incorrect response).

I always set my studies to collect responses anonymously. When this occurs, Treejack assigns each participant a numerical identifier to help keep track of their experience without them having to share any personal details. For Task 6, the paths chart below shows that participants 8 to 20 were able to navigate directly to the correct answer without deviating from the correct path I defined during setup.

Paths followed by participants in this study for Task 6

For Task 3 (below), the story told by the paths was quite different. Participant number five navigated back and forth several times through the structure in their attempt to locate information on how to share a wishlist. After all that effort, they were unable to find the information they needed to complete the task and opted to contact Active Kids directly. Not only is this a bad experience for the user, but it also puts unnecessary pressure on the call centre, because the information should be readily available on the website.

Paths followed by participants in this study for Task 3

Treejack also provides insights into where participants started their journey by recording first-click data. Just like Chalkmark, this functionality will tell you whether your users are starting out on the right foot from that all-important first click.

In this study I found it interesting that when looking for information regarding the eligibility of home educators in the Active Kids program, 42 per cent of participants clicked on “Schools & Groups” and 19 per cent clicked on “Parents & Community” for their first click. Only 6 per cent clicked on “Help”, which happens to be the only place this information can be found.

First click results for Task 1

I also found the first-click results for Task 9 very interesting. When looking for basic information on the program, more than half (52 per cent) of the participants in this study went straight to “Help”. This indicates that, for these participants, none of the other options looked likely to provide them with the information they needed.

First click results for Task 9

What can be learned from this study?

I mentioned earlier that there was a lesson in this for everyone, and rather than dwell on how poorly something tested, it’s time to move on to some lessons learned and constructive ideas for improvement. Based on the results of this Treejack study, here are my top three recommendations for improving the Active Kids website:

Rethink the content housed in the FAQs section

Most of the key information required to master the basics of what Active Kids is all about is housed entirely in the FAQs section. FAQs should not be the only place a user can find out basic information needed to understand the purpose of a product, program or service. I believe this website would benefit from some further thinking around what actually belongs in the FAQs section and what could be surfaced much higher. Another idea would be to follow the lead of the Government Digital Service and remove the FAQs section altogether — food for thought. Frequently asked questions would not be frequently asked questions if people could actually find the information on your site in the first place. Figure out where the answers to these questions really belong.

If you’re using Treejack, just look at the fails in your results and figure out where people went first. Is there a trend? Is this the right place? Maybe think about putting the answer the user is looking for there instead.

Restructure the FAQs section

If you must have an FAQs section (and believe me, I understand that they don’t just disappear overnight; just try to keep it as an interim solution), please consider streamlining the way the questions are presented to the user. Ditch the filtering and display the list on a single page. Users should not have to drill down through several layers of content and then navigate through each category. For further tips on getting your FAQs straight, this Kissmetrics article is well worth a read.

Review the intent of the website

Looking at the Active Kids website and the results from this study, I feel the intent of this website could use some refining. If we come back to my warehouse and house analogy, the main chunk of the website (the warehouse) seems to be one giant advertisement, while the house (the FAQs) is where the action-oriented content lies. The house seems to hold the key information that people need to use the program, and I think it could be surfaced far better. Don’t get me wrong: Active Kids does some incredibly good work for the community and should absolutely shout its achievements from the rooftops. However, a sense of balance is required here. I think it’s time for the house and the warehouse to join forces into a solution that offers both rooftop shouting and usable information that facilitates participation.

The value of fresh eyes

This study goes to show that regardless of where you are in your design process, whether that’s at the very beginning or a few years post-implementation, there is value to be gained from testing with a fresh set of eyes. I’m still undecided on which side of the FAQs debate I belong to — I’m going to sit on the fence and stand by the “if in doubt — test it” school of thought.

Further reading: