An improved results platform for tree testing with Treejack

We’ve improved the Treejack results platform as part of our project to revamp and refine the results for all three tools. I've written before that we didn't intend to change how the results of tree tests were calculated, but as we were moving Treejack over to the new platform, we couldn’t help ourselves. Now that it's live, we're here to fill you in on what's new.

If you're new to tree testing with Treejack, have a look at our intro video and check out the demo study.

Now you can see direct and indirect fails and skips

Previously on the Task Results tab, only the success score was split into direct and indirect. A direct success means that people went directly to the correct answer without going back up the tree. An indirect success means that people found a correct answer, but had to click back up the tree at least once before they got there. Now, fails and skips are split into direct and indirect as well.

The new individual Task Result looks like this:

treejack results

What makes a fail or a skip 'direct' or 'indirect'?

If someone skips the task before even clicking on the tree, they're marked as a direct skip. And if they click on the tree at least once before they skip the task, they're marked as an indirect skip. These skips are really interesting to analyze, because you can see whether people tried to find the answer, and the exact point at which they decided to skip.

Similarly, if someone goes directly to a wrong answer, without ever clicking back up the tree, they're marked as a direct fail. The distinction between direct and indirect fails is a useful one. When someone goes directly to the wrong information, it suggests (though doesn't prove) they have confidence they're on the right track. Imagine how this might play out on your live website — a person will arrive on a page that they're pretty sure has the information they want. But the information they want isn't there. Frustrating, certainly.

The indirect fail number is also important. In this case, people have gone back through the tree at least once before they've selected an answer — and it's incorrect. When a task receives high numbers of indirect fails, it gives you a clear message that the labelling and organization of that part of your tree is confusing people. Not to worry though — because now that you have data showing you the cause, you can quash the confusion and get your information architecture right.
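To make the distinction concrete, here's a minimal sketch of how a single response could be classified into the six outcomes described above. This is purely illustrative (the function names and the depth-based backtracking check are our own assumptions, not Treejack's actual implementation): a path counts as "direct" if every click goes exactly one level deeper than the last, and as "indirect" if the participant went back up the tree at any point.

```python
# Illustrative sketch only -- not Treejack's real code.
# A tree is represented as a child -> parent mapping; the root has no entry.

def depth(node, parent):
    """Number of steps from a node up to the root."""
    d = 0
    while node in parent:
        node = parent[node]
        d += 1
    return d

def classify(path, parent, correct_nodes, skipped):
    """Classify one participant's response for a task.

    path          -- ordered list of node IDs the participant clicked
    parent        -- child -> parent mapping describing the tree
    correct_nodes -- set of node IDs counted as correct answers
    skipped       -- True if the participant skipped instead of answering
    """
    depths = [depth(n, parent) for n in path]
    # Proxy for "never went back up the tree": each click is exactly
    # one level deeper than the previous one.
    direct = all(b == a + 1 for a, b in zip(depths, depths[1:]))
    kind = "direct" if direct else "indirect"
    if skipped:
        # Skipping before any click at all counts as a direct skip.
        return "direct skip" if not path else f"{kind} skip"
    if path and path[-1] in correct_nodes:
        return f"{kind} success"
    return f"{kind} fail"
```

For example, with a tree where "Products" sits under "Home" and "Pricing" sits under "Products", a participant who clicks straight down to the correct answer is a direct success, while one who clicks into the wrong branch, backs up, and then finds it is an indirect success.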

View participant paths in detail

The new Paths tab in the Treejack results enables you to see exactly which way people went before selecting an answer or skipping the task. You can filter this table based on the results you consider the most important (see the key at the top of the screenshot below).

You'll find this visualization useful in lots of ways. For example, you could filter the table so you only see the direct fail paths, and trace each path onto a printed version of your tree (or a spreadsheet) to pinpoint problems with language or structure. Or you could just look at the indirect results to see if there are patterns to participant confusion. Or you could look at the two or three longest paths. It's really up to you.

Here's an example of what you can see when you filter the Paths results to include just direct fails and indirect fails:

tree testing paths results

View first click data for every task

OK, this is also very, very cool. The new First Click tab shows you how many people clicked where for their first click. We think this data is particularly valuable because a 2009 study by Bob Bailey and Cari Wolfson showed a clear link between first click and task success. Their analysis of 12 scenario-based usability studies found that people who got their first click right were about twice as likely to complete the task successfully as those who got their first click wrong.

This study inspired us to build Chalkmark, a tool for gathering first click data on webpage screenshots and wireframes. And although first click data has always been available in the downloaded Treejack results spreadsheet, you can now view first click data in the results platform:
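If you download your results spreadsheet, the kind of comparison Bailey and Wolfson ran is easy to reproduce for your own study. Here's a hedged sketch (the function name and the tuple format are our own illustrative assumptions, not a Treejack export format): group responses by whether the first click landed on a correct branch, then compare success rates.

```python
# Illustrative sketch: success rate split by first-click correctness.
# `responses` is a list of (first_click, succeeded) tuples; the data
# shape is an assumption for this example, not Treejack's export format.

def success_rate_by_first_click(responses, correct_first_clicks):
    """Return {True: rate, False: rate} keyed by first-click correctness."""
    buckets = {True: [0, 0], False: [0, 0]}  # key -> [successes, total]
    for first_click, succeeded in responses:
        key = first_click in correct_first_clicks
        buckets[key][0] += int(succeeded)
        buckets[key][1] += 1
    return {k: s / t for k, (s, t) in buckets.items() if t}
```

If the rate for correct first clicks is much higher than for incorrect ones, that mirrors the pattern the 2009 study reported.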

first click data

We've updated the spreadsheet reports to reflect these changes

The downloadable results spreadsheet now includes data on direct and indirect fails and skips for each task, along with the same data it always contained on first clicks and paths. We'd initially planned to remove the spreadsheet entirely, but have since learnt that many researchers use the spreadsheet regularly to do their own analysis — so we've kept it and made it better instead.

Jump in and take a look around

You can see the new results in your account now, and if you don't have an account you can sign up and create your first tree test today. Alternatively, feel free to put your data analysis hat on and explore the results of our sample study on Bananacom's website.

Let us know what you think.

Published on Apr 08, 2015
  • Nahum
  • Nahum is CTO of Optimal Workshop.
