Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)



DESCRIPTION

NUX Manchester – 3rd February 2014 Critical Incident Technique: collecting data from a user’s perspective (aka What users really think) Jonathan Willson, Principal Lecturer in Information and Communications at MMU. Jonathan works closely with Richard Eskins (known to many) teaching web development and our new UX unit, plus Digital Rights. Jonathan has applied CIT in a number of research projects as an inexpensive way to obtain rich data and deep insights.

TRANSCRIPT

Page 1: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

SUCCESSFUL SURVEYS

survey (verb): to investigate the opinions or experience of (a group of people) by asking them questions

100% of attendees surveyed were very satisfied with the NUX event*

Oxford Dictionaries http://oxforddictionaries.com/definition/english/survey

* I made that bit up

Page 2: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Usability tool?

“Surveys are not always thought of as usability tools since they are normally employed more for marketing-related tasks – asking users about their likes and dislikes.”

(Braun et al., Usability: The Site Speaks for Itself)

Page 3: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Advantages
• carefully constructed surveys can:
  • provide good usability data
  • validate user requirements
  • expand upon user requirements
  • be used to preview new features or design changes
  • generate meaningful answers
  • supplement expert review
  • elicit facts
  • collect demographics about your users

Page 4: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

When to survey?
• at the beginning of a project
  • learn about current users and their needs
  • identify stakeholders
• before a redesign
  • learn about current users and their needs
• after launching a new or revised site
  • assess whether needs are met / not met
  • identify areas for improvement
  • have features or content rated or ranked
  • seek ideas for further improvements

Page 5: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Questions concerning …
• if users are able to find the information they seek
• how satisfied users are with your site
• what experiences users have had with your site or similar sites
• what users like and dislike about your site
• what frustrations or issues users have with your site
• if users would recommend your site to others
• if users have any ideas or suggestions for improvements

usability.gov

Page 6: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Online surveys
• a structured questionnaire that the target audience completes over the internet
• data (responses) stored in a database
• survey tools provide some level of analysis (see the sketch below)
• broad reach
• low cost
• quick and easy to launch
• you need to have a clear idea about:
  • your purpose
  • how you will find participants
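
As a rough illustration of the “some level of analysis” point above, here is a minimal Python sketch of what a survey tool might do with stored responses. The field names and sample answers are invented for this example, not taken from the talk; a real tool would read responses from its own database export.

    from collections import Counter
    from statistics import mean

    # Hypothetical responses exported from an online survey tool.
    # Each response mixes closed questions (a 1-5 satisfaction rating, yes/no)
    # with an open-ended comment.
    responses = [
        {"satisfaction": 5, "found_info": "yes", "comment": "Search worked first time."},
        {"satisfaction": 2, "found_info": "no",  "comment": "Couldn't find the opening hours."},
        {"satisfaction": 4, "found_info": "yes", "comment": ""},
    ]

    # Closed questions summarise directly into numbers ...
    print("Responses collected:", len(responses))
    print("Mean satisfaction:", round(mean(r["satisfaction"] for r in responses), 2))
    print("Found what they sought:", Counter(r["found_info"] for r in responses))

    # ... while open-ended comments are set aside for qualitative reading.
    comments = [r["comment"] for r in responses if r["comment"]]
    print("Comments to review:", comments)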

Page 7: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Points to consider
• length of survey
  • keep it as brief as possible
  • indicate how much time will be required
  • online surveys typically indicate progress as % complete (see the sketch below)
• mix of questions
  • open-ended and closed
• follow-up survey or interview
  • an opportunity to ask more in-depth questions
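
To make the length and progress points concrete, a small sketch follows; the per-question timings are assumptions made for illustration, not figures from the talk.

    # Hypothetical survey: a short mix of closed and open-ended questions.
    questions = [
        ("How satisfied are you with the site?", "closed"),
        ("Did you find the information you were looking for?", "closed"),
        ("What, if anything, frustrated you?", "open"),
    ]

    # Assumed average answer times: open-ended questions take longer than closed ones.
    SECONDS_PER_TYPE = {"closed": 15, "open": 60}

    estimated = sum(SECONDS_PER_TYPE[qtype] for _, qtype in questions)
    print(f"This survey should take about {round(estimated / 60)} minute(s).")

    # Progress indicator shown as each question is answered.
    for answered, (question, _) in enumerate(questions, start=1):
        print(f"{question}  [{round(100 * answered / len(questions))}% complete]")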

Page 8: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

RISKS OF QUANTITATIVE STUDIES
Jakob Nielsen

Page 9: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

“Beware number fetishism”

“Number fetishism leads usability studies astray by focusing on statistical analyses that are often false, biased, misleading, or overly narrow. Better to emphasize insights and qualitative research”

“qualitative delivers the best results for the least money”

(Nielsen, 2004)

Page 10: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Quantitative methods
• quantitative research is sometimes considered more scientific or credible than insight-based studies
• may reduce a complex situation down to mere numbers
• beware bogus findings
• be aware of limitations
  • e.g. test subjects in student projects may be other students who are not representative of mainstream users
  • tests of websites that are scaled-back designs with fewer pages and more limited content
• good quantitative research may …
  • allow comparison and identification of trends over time
  • be expensive and difficult

Page 11: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Avoid taking the ‘wrong path’
• user interfaces and usability are highly contextual
• their effectiveness depends on a broad understanding of human behaviour
• experts tend to get better results than beginners from qualitative studies
• even beginners will get (some) usable results

Page 12: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

DON’T USE SURVEYS …
Smashing UX Design

Page 13: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

… to find out how something is used “What people say they do and what they actually do are often very different things.” “If you base your design decisions on what people tell you they do, and not what they actually do, your product will not be designed to support actual user behaviour.” However, there is another way …

Page 14: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

CRITICAL INCIDENT TECHNIQUE
A flexible set of principles for qualitative research

Page 15: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)
Page 16: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Origins
• developed during WW2 to identify effective and ineffective behaviours in a variety of military activities
• subsequently developed as a tool for the systematic study of human behaviour
• “think of some occasion during combat flying in which you personally experienced feelings of acute disorientation … describe what you saw, felt or heard that brought on the experience”

(Flanagan, 1954)

Page 17: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Purpose
• to gather certain important facts concerning behaviour in a defined situation
• no single rigid set of rules governing data collection
• rather a flexible set of principles that must be modified and adapted to meet the specific situation at hand
• a way of focusing the respondent’s mind on specific occurrence(s)

Page 18: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Key stages
• General aims
  • the aims or intended outcomes of the activity under investigation
• Plans and specifications
  • instructions given to the observers of the behaviour
• Collecting the data
  • interviews, group interviews, questionnaires and record forms
  • while the activity is ongoing or fairly recent
• Analysing the data
  • summarise and describe the data with a focus on thematic content (see the sketch below)
• Interpreting and reporting
  • present the findings in a usable form
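
A minimal sketch of one way the “analysing the data” stage could look in practice: incidents that have already been coded by the analyst are tallied by theme and by whether the reported behaviour was effective or ineffective. The themes, codes and summaries below are invented for illustration; the technique itself does not prescribe a particular tool.

    from collections import Counter

    # Hypothetical critical incidents, coded by the analyst with a theme and an
    # outcome, as they might appear on completed record forms.
    incidents = [
        {"theme": "search",     "outcome": "ineffective", "summary": "Gave up after three failed searches."},
        {"theme": "search",     "outcome": "effective",   "summary": "Filters narrowed the results quickly."},
        {"theme": "navigation", "outcome": "ineffective", "summary": "Could not get back to the start page."},
        {"theme": "checkout",   "outcome": "effective",   "summary": "Guest checkout avoided registration."},
    ]

    # Summarise the thematic content: how often each theme occurs, split by outcome.
    counts = Counter((i["theme"], i["outcome"]) for i in incidents)
    for (theme, outcome), n in sorted(counts.items()):
        print(f"{theme:<12} {outcome:<12} {n}")

Counts like these then feed the interpreting and reporting stage, alongside the incident descriptions themselves.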

Page 19: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Relevance
• various studies have judged the method to be valid and reliable
• capable of generating a comprehensive and detailed description of the chosen domain
• set against more general debates in the social sciences about the validity and reliability of similar qualitative methods

Page 20: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Tip of the …

Page 21: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Reflective process
• identifying effective and ineffective ways of doing something
• factors that help and hinder
• aspects that are critical in a specific situation
• encourages participants to explore new ideas for problem-solving and make recommendations
• can be used with concerned, informed insiders and interested outsiders

Page 22: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Uses
• gathering data for the design or re-design of equipment
• investigating the information needs or information-seeking behaviour of users
• task analysis
• identifying how decisions are made
• performance evaluation and appraisal
• determining critical requirements for a job

Page 23: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Can you recall …
• an incident when …
  • you had no support or backup … ?
  • you felt good about something you had done … ?
  • you realised you did not know enough … ?
  • you were too busy … ?
• the aim of the study was to identify staff development needs

(Harrington, 1992)

Page 24: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Prompts (see the record-form sketch below)
• what device were you using?
• who else was involved?
• how did the task start / proceed / conclude?
• did you know what you were doing at each stage?
• what were your feelings during / after the incident?
• what are your feelings now?
• why was the incident critical for you?
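
One way to capture answers to these prompts on a simple record form is sketched below; the field names are assumptions derived from the prompts on this slide rather than a form prescribed by the technique.

    from dataclasses import dataclass, field

    @dataclass
    class CriticalIncidentRecord:
        """A record form mirroring the interview prompts above."""
        device: str                                           # what device were you using?
        others_involved: list = field(default_factory=list)   # who else was involved?
        task_narrative: str = ""                              # how did the task start / proceed / conclude?
        knew_what_to_do: bool = True                          # did you know what you were doing at each stage?
        feelings_during: str = ""                             # feelings during / after the incident
        feelings_now: str = ""                                # what are your feelings now?
        why_critical: str = ""                                # why was the incident critical for you?

    # Example record from an interview (details invented for illustration).
    record = CriticalIncidentRecord(
        device="mobile phone",
        task_narrative="Tried to renew a library book; the renew button was hidden on a small screen.",
        knew_what_to_do=False,
        why_critical="The book went overdue and I was fined.",
    )
    print(record)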

Page 25: Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)

Benefits
• flexible
• can be applied to multi-user systems
• focus on important issues, e.g. safety critical
• identify cause and severity
• pick up issues that may not be detected by other methods
• can be adapted to questionnaires or interviews
• cost-effective

Be aware that …
• routine issues may not be reported
• not suited to general task analysis
• relies on memory, so focus on recent events