Deciding How to Measure Usability & How to Conduct a Successful User Requirements Activity
Deciding How to Measure Usability
- Understanding what you can measure
- Matching measures to your goals and concerns
- Matching measures to the product's stage of development
- Setting quantitative criteria for each measure and each task
Understanding what you can measure
In a usability test you collect both performance measures and subjective measures.
Performance measures: counts of behaviors or actions you can observe. These are quantitative, e.g. how many errors people make and how many times they repeat the same error.
Subjective measures: people's perceptions, opinions, and judgments. These can be either quantitative or qualitative, e.g. give people a 5- or 7-point scale and ask them to rate how difficult the product was to use.
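To illustrate the quantitative side of a subjective measure, here is a minimal Python sketch of summarizing such scale responses; the 7-point scale orientation and the ratings themselves are made-up assumptions, not data from the slides.

```python
# Minimal sketch: summarizing a 7-point difficulty scale
# (assumed here: 1 = very easy, 7 = very difficult).
# The ratings below are invented illustration data.
from statistics import mean, median

ratings = [2, 3, 5, 4, 3, 6, 2, 4]  # one rating per participant

print(f"n = {len(ratings)}")
print(f"mean difficulty   = {mean(ratings):.1f}")
print(f"median difficulty = {median(ratings)}")
```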
Matching Measures to your Goals and Concerns
The performance measures you choose should relate directly to the quantitative usability goals and to the concerns behind the usability test.
How to collect performance data?
Data Logging Software
Data-logging codes:
  S   Stop the test
  T   Stop the task
  M   Menu error
  S   Select-from-list error
  E   Other error
  B   Take a break
  A   Assist
  H   Help desk
  F   Frustration
  N   Observations

Sample log entry:
  Event: O 06:21:33
  Comment: Looking for online help for To: field
Points to consider in building a data-logging program (a sketch follows this list):
- Include preset codes for events that happen in every test
- Keep codes short
- Store a short description with each code
- Timestamp every event
- Allow easy backup and movement of data into a file
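As a concrete illustration of these points, here is a minimal Python sketch of such a logger. It is not any specific tool: the function names and CSV format are invented, and the slide's overlapping "S" codes are disambiguated with an assumed "L" key.

```python
# Minimal sketch of a data-logging program, following the points above.
# Assumption: the slide reuses "S" for both "stop the test" and
# "select-from-list error"; here the latter gets "L" so codes are unique.
import csv
import time

# Preset short codes for events that happen in every test,
# each stored with a short description.
CODES = {
    "S": "Stop the test",
    "T": "Stop the task",
    "M": "Menu error",
    "L": "Select-from-list error",
    "E": "Other error",
    "B": "Take a break",
    "A": "Assist",
    "H": "Help desk",
    "F": "Frustration",
    "N": "Observation",
}

events = []

def log_event(code, comment=""):
    """Timestamp every event and keep its code, description, and comment."""
    events.append({
        "time": time.strftime("%H:%M:%S"),
        "code": code,
        "description": CODES.get(code, "Unknown"),
        "comment": comment,
    })

def export(path):
    """Allow easy backup and movement of the data into a file (CSV)."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["time", "code", "description", "comment"])
        writer.writeheader()
        writer.writerows(events)

# Example, mirroring the sample log entry above:
log_event("N", "Looking for online help for To: field")
export("session01.csv")
```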
What if you don't have data-logging software?
Should you measure positive behavior?
Matching Measures to the Product's Stage of Development
It is very important to consider where the product is in the development cycle when planning performance measures.
Example: testing a prototype of a manual that does not yet have an index.
Choose the performance measures you want to count.
Selecting Performance Measures
General concerns:
- Ease of use for people who have never used email
- Ease of use for people who have used other email programs
- Will the online help be useful?
- Will new users be able to select items from screens quickly and easily?
Specific concerns:
- Will users be able to read a specific piece of mail and skip over mail they don't want to read?
- Will new users be able to find and select people's addresses to send them mail?
- Will users be able to find the right menu path to read, write, and send a message?
Setting Quantitative Criteria for Each Measure and Each Task
How do you select criteria for performance measures? Typical criteria levels are Excellent, Acceptable (OK), and Unacceptable. Base your criteria on users.
Example criteria:

  Measure                  Excellent   Acceptable   Unacceptable
  Task 1: Read a message
    Time for task          <3 min      3-5 min      >5 min
    Time in online help    <1 min      1-2 min      >2 min
  Task 2: Write and send a message
    Time for task          <10 min     10-15 min    >15 min
    Time in online help    <2 min      2-4 min      >4 min
    Other errors (code E)  0           1-2          more than 2
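As an illustration, the table's thresholds could be applied to measured times with a small classifier like the sketch below; the thresholds come from the table, while the task and measure labels are hypothetical names chosen for this example.

```python
# Small sketch: classifying measured values against the criteria table.
# Thresholds are from the table above; key names are illustrative.
CRITERIA = {
    # (task, measure): (excellent_below, acceptable_up_to), in minutes
    ("read message", "time for task"): (3, 5),
    ("read message", "time in online help"): (1, 2),
    ("write and send", "time for task"): (10, 15),
    ("write and send", "time in online help"): (2, 4),
}

def rate(task, measure, minutes):
    """Classify one measured value as Excellent / Acceptable / Unacceptable."""
    excellent_below, acceptable_up_to = CRITERIA[(task, measure)]
    if minutes < excellent_below:
        return "Excellent"
    if minutes <= acceptable_up_to:
        return "Acceptable"
    return "Unacceptable"

print(rate("read message", "time for task", 4))           # Acceptable
print(rate("write and send", "time in online help", 5))   # Unacceptable
```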
Are the measures the same for all tasks in a given test?
No
Are the performance criteria the same for all tasks?
No
Do you take the test situation into account in selecting criteria?
Yes
Should you count system response time in setting the criteria?
Yes
What we have discussed so far
The tasks that participants will do during tests, and how to measure participants' performance with the product.
How to Conduct a Successful User Requirements Activity
- Welcoming your participants
- Dealing with late and absent participants
- Warm-up exercises
- Inviting observers
- Introducing your think-aloud protocol
- Moderating your activity
- Recording and note taking
- Dealing with awkward situations
- Conclusion
Welcoming Your Participants
- Ask participants to arrive about 15 minutes early.
- Introduce yourself; put up welcome signs; play CDs while they wait.
- Don't leave participants alone.
Dealing with Late and Absent Participants
Despite your best efforts, some participants will be late. Situations to plan for:
- The late participant
- You can't wait any longer
- Including late participants
- The no-show
Warm-up Exercises
- Start with light conversation.
- Introduce yourself.
- Provide name tags.
- Don't spend much time on it.
Inviting Observers
- Developers who observe get to know user requirements better.
- Tell observers to come early and remain quiet.
- Don't allow managers to observe.
Introducing Your Think-Aloud Protocol
Ask participants to verbalize:
- their thought process while doing the task
- the steps in the task
- expectations and evaluation statements
Provide some examples to participants (e.g. thinking aloud while using a stapler). Monitor how loudly participants speak.
Moderating Your Activity
- Have personality
- Ask questions
- Stay focused
- You are not a participant
- Keep the activity moving
- No critiquing
- Everyone should participate
- No one should dominate
- Practice makes perfect
Take Notes
- You get the data immediately and can start analysis right away.
- Participants feel they are saying important things.
Problems:
- You can get wrapped up in being a stenographer.
- If the discussion pace is fast, not all information can be captured.
Video and Audio Recording
- Captures nuances and participants' body language that notes cannot.
- You can listen to the recording afterwards and take notes.
- Video is better than audio-only recording.
Problem: recording can make participants uncomfortable.
Dealing with Awkward Situations
Some uncomfortable situations, and ways to deal with them, fall into two groups: participant issues and product team/observer issues.
Participant Issues
- Participant is called away in the middle of the test
- Participant's cell phone rings continuously
- Wrong participant recruited
- Participant thinks he is on a job interview
- Participant refuses to be videotaped and wants to leave
- Participant is confrontational with other participants
- Participant dominates the group
- Participant is not truthful about his identity
- Participant refuses to sign the consent form
- Fire alarm sounds in the middle of the test and the participant still wants to continue
Product Team/Observer Issues
- Team changes the product mid-test
- An observer turns on the light in the control room
- Observers talk loudly during an activity