Usability Testing
As part of my job at itslearning I have become the in-house usability testing moderator. Initially, I was doing this only for the projects I was working on, but now I am brought in to test all of itslearning's high-priority projects.
Before each test I determine which workflows are the most important to test. Each step of a workflow gets a rating of green, yellow, or red based on how each test subject performs. At the end of every round of user testing, the tasks are organized by how well the subjects performed and understood each step (see below). The tasks with the most red and yellow lights are shown first and given the highest priority for fixing.
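The prioritization step can be sketched in code. This is a minimal illustration, not the actual tooling used at itslearning; the task names and ratings are hypothetical, and the sort key simply counts red lights first, then yellow.

```python
from collections import Counter

# Hypothetical ratings from one round of testing: for each task,
# one green/yellow/red rating per workflow step.
ratings = {
    "Create an assignment": ["green", "red", "red", "yellow"],
    "Submit homework":      ["green", "green", "yellow", "green"],
    "Grade a submission":   ["red", "yellow", "yellow", "green"],
}

def severity(task_ratings):
    """Sort key: tasks with the most red, then yellow, ratings rank highest."""
    counts = Counter(task_ratings)
    return (counts["red"], counts["yellow"])

# Highest-priority tasks (most red/yellow lights) come first.
prioritized = sorted(ratings, key=lambda t: severity(ratings[t]), reverse=True)
print(prioritized)
```

Sorting on a (red count, yellow count) tuple means a task with two red steps always outranks one with a single red step, no matter how many yellows either has.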
I have been in charge of recruiting, planning, moderating, and creating reports for each test. My team includes note takers and observers; relevant product managers, other designers, and occasionally engineers take on these roles.
In my time conducting user testing sessions and working with experts in the field at the Comparative Usability Evaluation meeting, I have come up with a list of best practices:
Pilot your study
Create tasks that fit within your timeline and allow your participants to explore. The richness of data can be lost if you're rushing from task to task.
Do not tell participants how many tasks there will be.
Print tasks and provide them to the participant one at a time.
Look for opportunities to ask questions based on participant reactions. "Tell me what you mean by that." or "Can you tell me about that?"
Position yourself so that the participant knows you are paying attention, but don't get in their way.
Let participants struggle; this can lead to a wealth of data.
Thoroughly prepare participants at the beginning of the study.
Never defend the design.
Clear your cookies before each test.
Use a setup the participant is used to and comfortable with (e.g. if they are a PC user, let them use a PC).