TRANSCRIPT
1
Foxtrot, by Bill Amend 10/31/02
Usability Testing
Amy Thurston
UserWorks, Inc.
20 November 2002
3
What is usability?
An effective, efficient, satisfying relationship among:
  a system or product
  its users
  what the user wants to accomplish
Considering both:
  initial ease of use (intuitiveness, ease of learning)
  operational ease of use (productivity, ease of retention)
5
What is the goal of usability?
To improve aspects of the product with which users come into contact
To make user interfaces logical, intuitive and clear to people who use them
Includes the entire user experience:
  physical design (dimensions, layout, packaging)
  look and feel (displays, controls)
  procedures (the steps to complete a task)
  environment (physical, geographic)
  availability/usefulness of support (online help, training, documentation, help line support)
7
Types of usability testing
  Testing with users
  Heuristic / expert evaluation
  Online survey
  Click-path monitoring
  Concept evaluations
Considerations
  These techniques can be complementary
  Ideally, you would use several of these together as you go through an iterative design process
8
Categories of usability testing
Formative - Evaluations based on observing representative users completing important and frequent tasks with a prototype of the product. Errors, accuracy, and speed of performance are recorded, and user comments analyzed, to recommend design changes
Summative / Acceptance - Usability engineering specialists observe and record performance and user comments at the end of a design cycle
Benchmarking - Establishes a level of product usability for future testing and measurement
9
Layer      Design focus                         Methods                                                    Takes               Lasts
Surface    Visual Design, Look and Feel         User Testing, Heuristic Evaluation                         2 days - 2 weeks    2 weeks - 2 mos
Skeleton   Interface Design                     User Testing, Heuristic Evaluation, Participatory Design   2 days - 2 weeks    2 weeks - 2 mos
Structure  Interaction Design                   Card Sorting, Prototype Testing                            2 days - 2 weeks    1 mo - 6 mos
Scope      User Modeling, Project Objectives    User Interviews, Personas/Scenarios, Mental Models         2 weeks - 2 mos     3 mos - 1 year
Strategy   Deep User Research, Business Goals   Ethnography, Contextual Research                           6 weeks - 6 mos     1 year - 5 years
--Peter Merholz, peterme.com, based on a structure by Jesse James Garrett, jjg.net
10
Objectives of usability testing
Plan
Prepare
Collect data
Analyze
Report the data
11
What’s not on the list? Taking action on the findings
Many times, implementation is left to another group
Findings need to be balanced with business decisions
What’s best for the user might not be best for the product
Decision makers often have a broad array of reports which inform their actions; usability is only one of these
12
Planning
What are the goals of testing?
  Is the product in development and is there a need for guidance?
  Is the product about to be deployed and is there only time for "tweaking"?
  What if there are show-stoppers?
Note that the final results will only be as good as the procedures are appropriate
Select the best usability testing technique
13
Planning
Who are the users?
  Are there multiple user groups?
  How many users should be tested?
  Do we need to test users from all groups?
What type of test is this? (i.e., do we need quantitative, random-sampling-based data? Do we need to turn around the results in two days?)
What or who determines if the product is usable?
16
Planning
What should be tested?
  What are the users typically trying to do?
  Are there some basic functions that have to be right?
  Are there some functions that seem confusing or problematic?
  Do anticipated environmental situations affect usability?
17
Planning
Who will receive the report?
What happens to the information that will be generated?
Will changes be made to the product based on the data?
18
What if you don’t know the users?
19
Types of user needs and task analysis activities
  Interviews / focus groups
  Competitive / comparative analysis
  Contextual inquiry
  Walkthrough / talkthrough
  Link analysis
20
Interviews / focus groups
Interview users / conduct focus groups to probe for user needs on specific design features or preferences
Example:
  Interviewer: “This product has the capability to do a three-way conference call. Would that be sufficient?”
  User: “No; sometimes we have 4 to 5 locations that need to be involved.”
21
Competitive / comparative analysis
Examine other similar products
Determine typical user scenarios and evaluate products
List features, noting their design and, if possible, their effectiveness
22
Contextual inquiry
  Understand procedure
  Develop training criteria
  Determine information needs
23
Walkthrough / talkthrough
A walkthrough is only of use where there are well-established procedures and existing products
May pick up more dynamic problems than a static task analysis:
  Is there enough time to accomplish the task?
  Physical distance between needed information
24
Link analysis
Task analysis focusing on:
  Communication links
  Physical access
Emphasis on frequency of contact
Often used for:
  Workplace design
  Task allocation
  Determining instrumentation requirements
25
Preparation
Materials needed for testing:
  Screener to "qualify" participants
  Video consent form
  Tasks for participants (enough to fill the time, and a little more)
  Pre-test and post-test questionnaires
  Test plan
26
Preparation: Screener
Based on a description of user group(s)
Determine which characteristics are important for choosing participants
Some important characteristics are used to determine group membership (e.g., profession, experience, location)
Other characteristics should be balanced within the group (e.g., gender and age are frequently balanced)
27
Preparation: Screener
Use the screener to verify the participant is willing to be videotaped, if appropriate
After the potential participant successfully completes the screener, use the opportunity to schedule the participant
Participants who are "no-shows" are costly:
  Give incentives
  Make reminder calls
28
Preparation: Consent Form
Is it legally necessary to receive written permission from a participant to video or audio tape them?
  Note that testing can still go forward if they agree, but that data will be without back-up verification
You need the participant's agreement to use their image
Some clients need written non-disclosure statements from participants
29
Preparation: Tasks
Tasks should be written to cover:
  Basic parts of the product (i.e., the critical functions or paths)
  Parts of particular interest to the client
  Parts that are known to cause confusion and problems
  Parts that you anticipate will cause confusion and problems
30
Preparation: Tasks
Tasks should be pilot tested with someone unfamiliar with the product:
  To assure that the task wording is clear
  To determine how long it takes to accomplish the tasks
  To determine if the tasks are "do-able"
  To find if there are alternate "correct" paths
31
Preparation: Pre-test and Post-test Questionnaires
Questionnaires can be of many types:
  Specifically designed for the particular test
  General questionnaires designed by others (e.g., QUIS from U MD)
  Validated questionnaires such as SUS (by John Brooke at Redhatch for DEC, free), or SUMI and WAMMI (from Jurek K. at University College Cork, not free)
  Marketing questionnaires
  Benchmarking, baseline measures
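The SUS mentioned above has a well-known scoring scheme: ten items rated 1-5, alternating positively and negatively worded, combined into a single 0-100 score. A minimal sketch (the response values are illustrative, not data from any real test):

```python
def sus_score(responses):
    """Score a standard 10-item SUS questionnaire (each item rated 1-5)."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses):
        if i % 2 == 0:
            # Items 1, 3, 5, 7, 9 are positively worded: contribution is r - 1
            total += r - 1
        else:
            # Items 2, 4, 6, 8, 10 are negatively worded: contribution is 5 - r
            total += 5 - r
    return total * 2.5  # scale the 0-40 sum to 0-100

# Illustrative responses from one participant
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
```

Because the scoring inverts the negative items, a thoughtlessly uniform "all 3s" response pattern still lands at the scale midpoint of 50.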
32
Preparation: Post-task Questionnaires
The purpose is to determine, from the participant's point of view:
  Satisfaction
  Efficiency
  Ease of learning
  Ease of use
  Whether they feel in control
  Whether they like the system or not
33
Preparation: Test Plan
A script for the administrator:
  Gives a place for a checklist of what to say
  Can give precise wording if the study needs to adhere to more precision
  Gives the anticipated "correct" path for ease of the administrator to follow
The introduction should be delivered in person even if the test is a non-interference test
34
Testing
The testing team can be one or more persons
Participant fills out preliminary forms
Start the tape!
Use the test guide to direct the "production":
  Introduce the test
  Have the user do the task(s)
  Note usability issues and comments (by hand or with software)
  Have the user answer questions and questionnaire(s)
Debrief, thank, and compensate
35
Testing
Materials to have the participant complete before the test:
  Demographic and consent forms
Use the test guide during the test
Materials to have been completed by the end of the test:
  Questionnaires, forms for usability
  Data log and/or other notes
36
Analysis: the Basics
Usability heuristics:
  Constantine and Lockwood (see http://www.foruse.com/)
  Jakob Nielsen (see http://www.useit.com/papers/heuristic/heuristic_list.html and http://www.useit.com/alertbox/9710a.html)
Expert opinions (see references)
Industry best practices
Experience
37
Report
Usability tests must be documented:
  To have the appropriate effect
  To allow the powers-that-be to make appropriate decisions
  To serve in the historical context of a project
38
Report
It is helpful to know these answers before starting the report:
  Who is the report for?
  How will it be used?
  How much time after testing is allowed for analysis and reporting?
  What will be the most effective (or acceptable) format?
39
Report
Basic information about participants and the test
Test-based data
Analysis of the data--prioritize problem severity:
  High--cannot complete the task or causes data loss
  Medium--likely to cost the user time or difficulty
  Low--not substantial but worth considering
Recommendations (if appropriate for the type of test), including further testing if problems are severe or complicated
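The severity levels above translate naturally into a sort key when assembling the findings section of a report. A small sketch (the findings listed are invented examples, not results from this talk):

```python
# Order usability findings by the High/Medium/Low severity scheme above.
SEVERITY_ORDER = {"High": 0, "Medium": 1, "Low": 2}

# Invented example findings: (description, severity)
findings = [
    ("Help link is hard to notice", "Low"),
    ("Save fails silently, losing entered data", "High"),
    ("Menu labels confuse first-time users", "Medium"),
]

# Most severe problems first, so readers of the report see them first
for description, severity in sorted(findings, key=lambda f: SEVERITY_ORDER[f[1]]):
    print(f"[{severity}] {description}")
```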
40
Report Formats
Written reports:
  Top line report (1 p)
  Brief report (5-7 pp)
  Systematic scientific (academic style) report
  PowerPoint reports
  Common Industry Format for software analysis reports (but there is no place for recommendations)
Briefings:
  Formal with presentation
  Telephone based
Highlights tape:
  Collection of clips of participants using the product, with voice-over comments or introductory titles
41
Sample results from a usability test
Overall, 4 of 5 users liked the product.
3 of 5 users had some difficulty completing Task 3. The average rating for completing this task was 5.3 on a 9-point scale. The three users who had difficulty selected the blue button instead of the red button because the names are ambiguous and confusing. It is recommended that one or both of these names change to reflect their opposing effects.
5 of 5 users completed Task 6. Users commented that they liked this feature and found it easy to complete the task. The average rating was 8.7 on a 9-point scale.
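Figures like "5 of 5 users completed" and an average rating come from simple per-task tallies of the data log. A sketch of how such a summary line might be computed (the data below is made up, not the numbers from the sample above):

```python
def summarize(results):
    """Summarize one task: results is a list of (completed, rating) pairs,
    where completed is a bool and rating is on a 9-point scale."""
    n = len(results)
    completed = sum(1 for done, _ in results if done)
    mean_rating = sum(rating for _, rating in results) / n
    return f"{completed} of {n} users completed; mean rating {mean_rating:.1f}/9"

# Made-up data for one task with five participants
task = [(True, 9), (True, 9), (True, 8), (True, 9), (True, 8)]
print(summarize(task))  # 5 of 5 users completed; mean rating 8.6/9
```

With the small sample sizes typical of usability tests, such figures are best read as indicators of problem areas rather than statistically significant measures.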
42
Usability resources
Online:
  Usability.gov - National Cancer Institute
  Useit.com - Jakob Nielsen
  Usableweb.com - Keith Instone
  UPAssoc.org - Usability Professionals’ Association
  UIE.com - User Interface Engineering
43
Good Luck in Your Own Usability Testing
Amy Thurston
UserWorks, Inc.
1738 Elton Rd., Suite 138
Silver Spring, MD 20903
Phone 301-431-0500
FAX 301-431-4834