A deep dive into questions by @cjforms at UXLX
DESCRIPTION
How to ask better questions and how to assess UX using surveys. This workshop at UXLX 2014 in Lisbon was a deep dive into two important topics in survey design for user research. We used the four-step model of how people answer questions to work on better questions, then we focused on two special uses of questionnaires in user research: the post-test assessment of satisfaction, and how to gather information from users for redesign. Thanks to all the attendees for making this workshop a lot of fun. Caroline Jarrett @cjforms
TRANSCRIPT
A deep dive into questions
Workshop at UxLx 2014 led by Caroline Jarrett
How to ask better questions, and how to assess user experience using surveys
Introductions (I’m Caroline Jarrett - @cjforms)
Work with your neighbour:
• Your name and role
• A random thing about yourself
Agenda
Introductions
How to ask better questions
Break
How to assess user experience using surveys
Wrap up
A survey I saw recently
• How do we know it’s a survey?
And it continued…
• I’ll hand out an invitation I received recently by email
• Work in pairs
• Decide whether it is a survey or something else
Agenda
Introductions
What is a survey?
How to ask better questions
The steps to answer a question
Improve step 1: read and understand
Improve step 2: find the answer
Improve step 3: judge the answer
Improve step 4: place the answer
Understand why people answer
Break
How to assess user experience using surveys
Wrap up
There are four steps to answer a question
1. Read and understand
2. Find an answer
3. Judge the answer
4. Place the answer
Adapted from Tourangeau, R., Rips, L. J. and Rasinski, K. A. (2000)“The psychology of survey response”
There are four steps to answer a question, and a good question helps at each step:
1. Read and understand: is legible and makes sense
2. Find an answer: asks for answers that we know
3. Judge the answer: asks for answers we’re happy to reveal
4. Place the answer: offers appropriate spaces for the answers
Are you …?
Let’s review a question
• There is a question coming up on the next slide
• I will ask you to think about ONE of these four steps:
1. Read and understand
2. Find the answer
3. Judge the answer
4. Place the answer
• Please think about any problems in that particular step
Legibility is also important
Hermann grid illusion
In your last five days at work, what percentage of your work time do you estimate that you spend using publicly-available online services (not including email, instant messaging and search) to do your work using a work computer or other device?
The approximate curve of forgetting
I saw this question on an employee survey
• Let’s create a list
Test with users to make sure you offer the right answer options
Offer the right widget to collect the answer

Knowledge of what users want to tell you | How many answers? | Offer
We know all the answers that users are likely to give us | They only have one answer | Radio buttons
We know all the answers that users are likely to give us | They may have more than one | Check boxes
We’re not sure | – | Text boxes
Allen Miller, S. J. and Jarrett, C. (2001) “Should I use a drop-down?” http://www.formsthatwork.com/files/Articles/dropdown.pdf
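The decision table above can be sketched as a small Python helper. The function and parameter names are my own illustration for this transcript, not from the workshop or the paper:

```python
def choose_widget(know_all_answers: bool, single_answer: bool) -> str:
    """Suggest an answer widget, following the heuristic in the table
    from Miller & Jarrett (2001), "Should I use a drop-down?".

    know_all_answers: we know all the answers users are likely to give us.
    single_answer: users only ever have one answer (ignored when we
    don't know the likely answers).
    """
    if not know_all_answers:
        return "text box"  # we're not sure what users will say
    # We know the likely answers, so offer a closed list of options
    return "radio buttons" if single_answer else "check boxes"
```

For example, `choose_widget(True, False)` suggests check boxes, because users may pick more than one of a known set of answers.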
Grids are often full of problems at all four steps
Grids are a major cause of survey drop-out
Total incompletes across the ‘main’ section of the questionnaire (after the introduction stage):
• Subject matter: 35%
• Media downloads: 20%
• Survey length: 20%
• Large grids: 15%
• Open questions: 5%
• Other: 5%
Source: Database of 3 million+ web surveys conducted by Lightspeed Research/Kantar. Quoted in Coombe, R., Jarrett, C. and Johnson, A. (2010) “Usability testing of market research surveys” ESRA Lausanne
But it’s the topic that matters most
Response relies on effort, reward, and trust
Trust
Perceived effort
Perceived reward
Diagram from Jarrett, C. and Gaffney, G. (2008) “Forms that work: Designing web forms for usability”, inspired by Dillman, D. A. (2000) “Internet, Mail and Mixed Mode Surveys: The Tailored Design Method”
An interesting subject helps in all three areas
Shared interests inspire trust
Interesting topics take less effort
An interesting subject is intrinsically rewarding
What about response here?
Your answers to this survey are important for our work
But what’s in it for me? And I’m really ready for a coffee.
Agenda
Introductions
What is a survey?
How to ask better questions
Break
How to assess user experience using surveys
Wrap up
Let’s start with a standard option: SUS
• The System Usability Scale (SUS) was created in 1986
• It has been shown to be valid and reliable
• You get a score between 0 and 100
• You can compare your SUS score with other systems
Brooke, J. (1996). SUS: A "quick and dirty" usability scale. In Usability Evaluation in Industry. P. W. Jordan, B. Thomas, B. A. Weerdmeester and A. L. McClelland. London: Taylor and Francis.
Jeff Sauro (@measuringux) has done a lot of work with SUS
http://www.measuringusability.com/sus.php
• Jeff provides tools for scoring SUS
• He has adapted it to websites
• “SUS scores are not percentages”
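Brooke’s scoring rule is well documented: each odd-numbered (positively worded) item contributes its response minus 1, each even-numbered (negatively worded) item contributes 5 minus its response, and the total is multiplied by 2.5 to give a 0–100 score. A minimal sketch in Python (the function name is mine, not from any library):

```python
def sus_score(responses):
    """Score a standard ten-item SUS questionnaire (Brooke, 1996).

    responses: ten integers, each 1 (strongly disagree) to 5
    (strongly agree), in questionnaire order.
    Returns a score from 0 to 100 -- a score, not a percentage.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    total = 0
    for i, r in enumerate(responses, start=1):
        if i % 2 == 1:
            total += r - 1   # odd items are positively worded
        else:
            total += 5 - r   # even items are negatively worded
    return total * 2.5
```

For example, ten neutral answers (all 3s) score 50, which is why raw SUS numbers should not be read as percentages.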
There are other commercial products with wider concepts of UX
• SUPR-Q
– includes credibility and loyalty
– licensed product
– http://www.suprq.com
• WAMMI
– online service
– includes access to standard databases (extra for SUPR-Q)
– http://www.wammi.com
• Your task: find an explanation of the difference between the European Commission and the European Union
• Use this site: http://ec.europa.eu
• Decide whether you had a good or bad experience
• I will ask you to fill in SUS (original) or SUS (Sauro)
My journey into user experience started a long time ago, with usability
The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use
(ISO 9241-11:1998)
Working mostly in government, we were interested in effectiveness and efficiency
43
The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use
(ISO 9241-11:1998)
But what about user experience?
What about satisfaction?
The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use
(ISO 9241-11:1998)
Picture credit: Flickr jek in the box
Satisfaction is a complex matter

Compared experience to what? | Resulting thoughts
(nothing) | Indifference
Expectations | Better / worse / different
Needs | Met / not met / mixture
Excellence (the ideal product) | Good / poor quality (or ‘good enough’)
Fairness | Treated equitably / inequitably
Events that might have been | Vindication / regret
Adapted from Oliver, R. L. (1996) and (2009)“Satisfaction: A Behavioral Perspective on the Consumer”
Example: bronze medal winners tend to be happier than silver medal winners
Nathan Twaddle, Olympic Bronze Medal Winner in Beijing
Photo credit: peter.cipollone, Flickr
Matsumoto D, & Willingham B (2006). The thrill of victory and the agony of defeat: spontaneous expressions of medal winners of the 2004 Athens Olympic Games.
• The first question was about rating satisfaction
• What were they asking us to rate?
– Just a guess from what you recall
The challenge of UX and surveys: which bit to measure?
The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use
(ISO 9241-11:1998)
???
Some ideas about what we could measure

In the definition | GoDaddy customer support | GoDaddy as a provider of domain names
Product | This contact with help desk | Overall experience of moving a domain to GoDaddy
Users | What proportion of customers contact support | Demographics (example: type of job)
Goals | Reason for contacting help | Reason for looking at GoDaddy
Effectiveness | Whether support fixed the problem | Whether GoDaddy offers the right products
Efficiency | Whether it took a reasonable time | Whether the product is priced correctly
Satisfaction | Helpfulness of support person | Likely to purchase again / recommend
Context of use | Home/office; alone/helped | Business / personal
In the definition | Information to help the European Commission design a better website
Product |
Users |
Goals |
Effectiveness |
Efficiency |
Satisfaction |
Context of use |
• Write questions for each topic
• Then get another team to try your questionnaire
Tip: Find out about users’ goals
Tip: Ask about recent vivid experience
Image credit: Fraser Smith
Tip: Interview first
Tip: Test everything
More resources on http://www.slideshare.net/cjforms