
Wednesday, April 2, 2008

Usability testing process

Usability testing can make a real difference to the ultimate success of a company's products, and it needs to be well planned. If you run the test incorrectly, or get some of the parameters wrong, the results can hurt the company and feed misleading input to the key designers and engineers. The key to successful usability testing is careful planning and preparation, and making sure that the test is properly executed. The following steps can be used in this process:

1. Draw up a list of things to test.

* You can't test everything. Decide which tasks are most important for users to be able to perform with your product or site, and prioritize them so you know which ones to test first.
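One simple way to do this prioritization is to score each candidate task by how important it is and how often users perform it, then test the highest-scoring tasks first. Here is a rough sketch of that idea; the task names and scores are invented for illustration:

```python
# Hypothetical sketch: rank candidate test tasks by importance * frequency.
# The tasks and their 1-5 scores are made up for illustration only.
tasks = [
    {"name": "create an account", "importance": 5, "frequency": 2},
    {"name": "search the catalog", "importance": 4, "frequency": 4},
    {"name": "check out a cart",   "importance": 5, "frequency": 4},
    {"name": "edit profile photo", "importance": 2, "frequency": 1},
]

def prioritize(tasks, top_n=3):
    """Return the names of the top_n tasks to test first, highest score first."""
    ranked = sorted(tasks, key=lambda t: t["importance"] * t["frequency"],
                    reverse=True)
    return [t["name"] for t in ranked[:top_n]]

print(prioritize(tasks))
```

The exact scoring scheme matters less than forcing the team to agree, before testing starts, on which tasks are worth the limited session time.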

2. Draw up the scenarios that you want the users to test.

* The scenarios should be realistic, and should be based on the initial customer profiles the product was designed for.
* Keep the tests to a reasonable length. Remember that people need a certain amount of time to get comfortable with a task, so anything less than 10 minutes could be tricky. On the other hand, a test that runs too long will make the subjects weary.
* Test the scenarios on yourself before you use them on your test subjects.

3. Write a script for administering the test -- this is necessary for consistency. It is particularly important if more than one person will be conducting the test, or if you cannot run the test yourself at some point. Finalizing a script is also a good way of ensuring that key people have signed off on the tests.

* Include in the script an introduction to the test, an explanation of how the test will be conducted, the questions that will be asked, and suggestions for follow-up questions to encourage the subjects to discuss what they are doing and why.
* Allow for some free-format answers.
* Ask test subjects to think out loud -- a behavior that does not come naturally to everyone. You may need to remind the subjects periodically, but it is very important: it helps you find out what they are thinking.

4. Test the test. Try it on a few users to find out whether the scenarios are comprehensible; if there are problems, you may need to rework the tests.

5. Train the testers and recorders.

* First-time testers should each practice conducting a test with volunteers or on each other - this will help them get more comfortable and gain a bit of experience.
* First-time testers should then discuss the experience of conducting the test.
* Critique each other: appreciate what went well, but learn from mistakes and highlight the improvements that need to be made.
* Practice and critique the recording of subjects' comments.

6. Gather volunteers to be tested, using the various sources that you would have (forums, existing customers, etc).

* Expect to provide some form of compensation to your volunteers. It could be something as simple as a copy of the last released version of the software, or some sort of company-branded gift item.
* Volunteers should be selected so that they are representative of the general user population. Choose a mixture of ages, years in school, occupations, etc.
* Schedule the volunteers and send them several follow-up messages about the time and place of their test.
* Even with reminder messages, expect some people to drop out - there may be sudden personal issues or some other such reason.

7. Make sure you have a quiet place to do the testing. It's a good idea to conduct the practice tests in the actual room that you will use for the real testing, and to make sure the subjects will face few distractions during the sessions. It will not help if people keep dropping in and out of the place.

8. Conduct the tests.

* Schedule enough time between tests for the tester and the recorder to debrief before the next test begins.
* If you discover that a test question or a design element is not working, it is unwise to continue testing. Fix what is wrong before you continue: make the change and test the correction. The computer industry calls this "rapid prototyping".

9. Record the test results as soon as possible after the test is completed. If possible, stream the actual sessions so that interested people can watch easily as they happen.

10. Analyze the results and determine how to correct the design problems. Redesign based upon usability evaluation results.

Brief steps that need to be taken:
- Recruiting (one month in advance - selecting locations, then dates and participants)
- Travel Planning (one month in advance - in tandem with the recruiting)
- Prototype development and refinement (2-4 weeks in advance depending on the fidelity)
- Task development (2-4 weeks in tandem with the prototype)
- Approvals (2-4 weeks in advance depending on the formality of the study)
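The lead times above can be turned into a simple back-dated planning calendar by counting weeks back from the study date. This is a hypothetical sketch, using the one-month figures as four weeks and the midpoint (three weeks) of the 2-4 week ranges:

```python
from datetime import date, timedelta

# Hypothetical sketch: back-date planning milestones from a study date.
# Lead times come from the list above; the 2-4 week ranges use their
# midpoint of three weeks for illustration.
LEAD_TIMES_WEEKS = {
    "Recruiting": 4,
    "Travel planning": 4,
    "Prototype development and refinement": 3,
    "Task development": 3,
    "Approvals": 3,
}

def planning_calendar(study_date):
    """Return (activity, start date) pairs, ordered earliest first."""
    starts = {
        activity: study_date - timedelta(weeks=weeks)
        for activity, weeks in LEAD_TIMES_WEEKS.items()
    }
    return sorted(starts.items(), key=lambda item: item[1])

for activity, start in planning_calendar(date(2008, 6, 1)):
    print(f"{start.isoformat()}  start {activity}")
```

Adjust the week counts for the fidelity of the prototype and the formality of the study, as noted in the list.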
