Before starting the test:
Explain that the objective is to test the software, not the participants.
Explain how the test materials and records will be used.
If a consent agreement is to be signed, explain all information on it.
If verbal protocols will be collected, let participants practice thinking aloud.
Ensure that all participants' questions are answered and that participants are comfortable with all procedures.
During the test:
Minimize the number of people who will interact with the participants.
If observers will be in the room, limit them to two or three.
Provide a checklist for recording:
Times to perform tasks.
Errors made in performing tasks.
Unexpected user actions.
System features used/not used.
System bugs or failures.
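The recording checklist above could be captured as a simple data structure so that every observer logs the same fields. A minimal Python sketch, in which all class and field names are illustrative rather than prescribed:

```python
from dataclasses import dataclass, field


@dataclass
class SessionRecord:
    """One record per participant per task (illustrative field names)."""
    participant_id: str
    task: str
    time_seconds: float = 0.0                              # time to perform the task
    errors: list = field(default_factory=list)             # errors made during the task
    unexpected_actions: list = field(default_factory=list) # unexpected user actions
    features_used: set = field(default_factory=set)        # system features used
    features_unused: set = field(default_factory=set)      # features expected but not used
    bugs: list = field(default_factory=list)               # system bugs or failures


# Example of filling in one record during a session:
record = SessionRecord(participant_id="P01", task="Create invoice")
record.time_seconds = 142.0
record.errors.append("Clicked Save before entering a date")
record.features_used.add("date picker")
```

Using one shared record format makes it straightforward to compile and compare data across participants after the test.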
Record techniques and search patterns that participants employ when attempting to work through a difficulty.
If participants are thinking aloud, record assumptions and inferences being made.
Record the session with a tape recorder or video camera.
Do not interrupt participants unless absolutely necessary.
If participants need help, respond rather than leaving them to struggle.
Provide encouragement or hints, not complete solutions.
Give general hints before specific hints.
Record the number of hints given.
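The hinting procedure (general hints before specific ones, with a running count) could be supported by preparing an ordered list of hints per task before the session. A small Python sketch under that assumption; the class name and example hints are hypothetical:

```python
class HintTracker:
    """Dispense pre-written hints from most general to most specific,
    counting how many have been given (a hypothetical helper)."""

    def __init__(self, hints):
        self._hints = list(hints)  # ordered: most general hint first
        self.given = 0             # number of hints given so far

    def next_hint(self):
        """Return the next hint, or None when all hints are exhausted."""
        if self.given >= len(self._hints):
            return None  # observer must decide whether to intervene further
        hint = self._hints[self.given]
        self.given += 1
        return hint


# Example: hints for one task, ordered general -> specific.
tracker = HintTracker([
    "Try looking through the menus.",        # general
    "Look under the File menu.",             # more specific
    "Use File > Export to save a copy.",     # most specific
])
first = tracker.next_hint()
```

Preparing the hint ladder in advance keeps interventions consistent across participants and makes the hint count directly comparable.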
Watch carefully for signs of stress in participants:
Sitting for long times doing nothing.
Blaming themselves for problems.
Flipping through documentation without really reading it.
Provide short breaks when needed.
Maintain a positive attitude, no matter what happens.
After the test:
Hold a final interview with participants; tell participants what has been learned in the test.
Provide a follow-up questionnaire that asks participants to evaluate the product or tasks performed.
If videotaping, use the tapes only in ways participants have agreed to.
Respect participants’ privacy.
Get written permission to use tapes.
Analyze, Modify, and Retest
Compile the data from all test participants.
List the problems the participants had.
Sort the problems by priority and frequency.
Develop solutions for the problems.
Modify the prototype as necessary.
Test the system again, repeating the modify-and-retest cycle as needed.
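Sorting the compiled problems by priority and frequency can be done mechanically once severity ratings are assigned. A minimal Python sketch, assuming each observation is a (description, severity) pair with 1 as the most severe rating; the data shown is invented for illustration:

```python
from collections import Counter

# Hypothetical observations compiled from all participants:
# each entry is (problem description, severity; 1 = most severe).
observations = [
    ("Could not find the Save button", 1),
    ("Could not find the Save button", 1),
    ("Error message unclear", 2),
    ("Could not find the Save button", 1),
    ("Scrollbar too narrow", 3),
    ("Error message unclear", 2),
]

severity = {desc: sev for desc, sev in observations}
frequency = Counter(desc for desc, _ in observations)

# Highest priority (lowest severity number) first; ties broken by frequency.
ranked = sorted(frequency, key=lambda d: (severity[d], -frequency[d]))
```

The top of the ranked list identifies which problems to solve and retest first.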
Evaluate the Working System
Collect information on actual system usage through:
Interviews and focus group discussions.
Online suggestion box or trouble reporting.
Online bulletin board.
User newsletters and conferences.
User performance data logging.
Respond to users who provide feedback.
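User performance data logging from the list above could be implemented as an append-only event log. A minimal Python sketch, assuming a JSON-lines file format; the function name, file path, and event names are illustrative:

```python
import json
import time


def log_event(path, user, event, detail=""):
    """Append one usage event as a JSON line (illustrative format)."""
    entry = {"ts": time.time(), "user": user, "event": event, "detail": detail}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")


# Example: record that a user completed a task in the working system.
log_event("usage.jsonl", "P01", "task_complete", "Create invoice")
```

Logs like this complement interviews and questionnaires: they show what users actually do, not what they report doing, and they accumulate continuously at low cost.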