Jul 22, 2011
This week was the first time I've actually seen our usability labs used. We have a new usability research group led by someone with years of experience doing usability engineering for Microsoft. I watched in our observation room as he led a handful of users through 60-90 minute usability tests for an application I documented.
What is surprising about usability testing is how much valid information you can gather from so few subjects. With a handful of people unfamiliar with an app who are thoroughly trying to use it, you can find out most of the major problems with the app.
As the usability researcher asked participants to perform various tasks, the participants eventually forayed into the help. I was eagerly waiting for them to click the help icon, and when they did, I was a bit surprised by what happened.
Only a few of the users read the help. Most just watched the videos. Keep in mind that all of the users were brand new to the app.
One user didn't realize there were videos at all, because I had buried them under a side option called "screencasts." Screencasts sounded like something similar to screen sharing, he said, so he didn't click there until later. When he did, he felt that the videos were just what he was looking for in help material.
Another user watched one of the videos twice, and after watching the video, was somewhat successful in completing a task (or at least a lot more successful than before -- there were still challenges).
Given the popularity of the videos, I soon stuck all five of the videos right on the home page. Here's the introductory video that most people watched:
Despite the popularity of the videos, I found that videos appeal to different users in different contexts. More tech-savvy users looking for an answer to a specific question were less likely to watch a video and more likely to search or scan the text for answers. Even so, these intermediate users were interested and encouraged by the available videos and sometimes watched one. One user saw the YouTube logo on the videos and immediately said, "Cool."
I also noticed that when users did venture into the text, they tended to discover information they hadn't anticipated. I found this fascinating. I've written before that search doesn't allow you to discover what you aren't aware of. Most users seemed pretty confident that they already knew how most of the app worked. When they actually read the help, though, there was a lot they didn't know.
Almost invariably, users tried to figure the app out themselves from the interface before resorting to the help -- especially advanced users. Watching users play guessing games, proceed with a trial-and-error mentality, and generally click everywhere trying to figure things out, I realized it would have been easier if they had viewed several tutorials before trying to complete the tasks.
One of my colleagues explained that at one company, they did usability testing that involved a laptop, mouse, keyboard, and a 40-page quick reference guide. Of about a dozen users, only one read the entire guide before doing the tests. When that user started on the tests, he moved quickly through the tasks.
This is probably true of software usage in general: it's more efficient to read up on how to use an application before diving into it. Despite this, most of us resist help until the last resort and turn to it only when we get stuck. We'll struggle and struggle, and only after spending 30 minutes or more guessing will we spend 5 minutes reading the instructions to figure it out.
One reason, my colleague explained, is that we need a certain context before information in a help file or video becomes relevant. Without having seen the app and wondered, for example, about the use of tags, a video called "About Tags" doesn't become relevant.
Perhaps on-screen text that contains snippets of instruction, with links to more information in the help, would be a way to move people from the interface into the help.
Overall, if there's one thing I've learned from watching users, it's that they prefer video tutorials to text -- especially new users. Having come to this conclusion about video, I wonder whether I should be focusing more on video than on writing. Maybe I need to reevaluate the importance of written communication? My blog is called, after all, "I'd Rather Be Writing." Shouldn't I change it to something like "I'd Rather Be Creating Videos"?
One limitation of video is that it's harder to dive into complexity and sophistication. In writing, you can explain concepts, explore ramifications, analyze, assess, and synthesize all you want. Take this blog post, for example -- I'm doing all kinds of little mental explorations. In a scripted video, however, it's much harder to explore a topic in a structured, logical, interesting way. Perhaps videos force you to stick with the basics.
I hadn't tried doing more "thinking" in videos. As a test, I recorded the following short video last night:
As you can see, it's somewhat rambling and meandering. But would you rather watch the video or read this post? I tried to essentially cover the same material.
My biggest takeaway from usability testing is that it opened my eyes to the need for testing -- not just testing design and functionality, but testing content as well. At Confab, I had breakfast with one of the presenters, Angela Colter, who was presenting on testing content. We test interfaces with all kinds of users, but if people are really interested in content, if that's what they go to a site for, shouldn't we be testing content too? Shouldn't content be the primary thing we test, and then design?
I missed Angela's presentation, but as I recall, she said something to the effect that when her team tested content with a group of users, they found the content didn't answer many of the users' questions. I'd love to pull about 10 users into a room and have them review the content of a website or help system based on various goals and questions they have. I don't know why I haven't done this before. Now I realize that not testing content has the same effect as not testing interfaces -- you may think it's fine, but if you're the one who designed or wrote it, you become blind to its failures.
The interaction designers' pride pretty much crumbled as they watched users try to use the interface, and I imagine technical writers would feel the same way watching a usability test of their help. I have no doubt that users would experience as much or more frustration looking in the help file for various questions and answers. Perhaps this is why we tend to avoid usability testing for our help systems -- the reality of how often help fails would be too much of a burden to bear. The results would force us to reassess and reapproach how we do help. Based on my experience watching users explore help, videos and on-screen help will now be my top priority.