UserTesting 2016 Webinar: Research to Inform Product Design in Agile Environments
TRANSCRIPT
Research to Inform Product Design in Agile Environments
Steve Fadden, Ph.D.
Director, Salesforce Analytics UX Research
Professional Faculty, UC Berkeley School of Information
Presented for UserTesting Webinar, November 10, 2016
Agenda
1. Agile values and challenges
2. Research methods
3. Tips & tricks
Image: http://www.geograph.org.uk/photo/111487
About me
Steve Fadden
@sfadden
linkedin.com/in/stevefadden
Development lifecycles
Images: https://commons.wikimedia.org/wiki/File:Waterfall_model_(1).svg; https://en.m.wikipedia.org/wiki/File:Software_Development_Spiral.svg; https://en.wikipedia.org/wiki/Scrum_(software_development)
Waterfall Spiral Agile / Scrum
Agile values
Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan
Reference: http://www.agilemanifesto.org/
Continuous delivery and change
“Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.”
“Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.”
Reference: http://agilemanifesto.org/principles.html
Sustainable, optimal work
“Agile processes promote sustainable development.”
“The sponsors, developers, and users should be able to maintain a constant pace indefinitely.”
“Simplicity--the art of maximizing the amount of work not done--is essential.”
Reference: http://agilemanifesto.org/principles.html
Self-organization and adjustment
“The best architectures, requirements, and designs emerge from self-organizing teams.”
“At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.”
Reference: http://agilemanifesto.org/principles.html
Challenges
Frequent, short cycles
“Just-in-time” answers
Drive to implement now
Cross-functional teams
Image: https://pixabay.com/en/mountain-biking-mountain-bike-1614129/
Comprehensive research approach
Images: https://commons.wikimedia.org/wiki/File:Light_bulb_(yellow)_icon.svg; https://commons.wikimedia.org/wiki/File:Korean_Traffic_sign_(Left_Turn_and_Right_Turn).svg;
https://pixabay.com/en/chart-line-line-chart-diagram-trend-148256/
Strategic opportunities
Tactical decisions
Performance assessment
Formative research informs decisions
Images: https://commons.wikimedia.org/wiki/File:Light_bulb_(yellow)_icon.svg; https://commons.wikimedia.org/wiki/File:Korean_Traffic_sign_(Left_Turn_and_Right_Turn).svg;
https://pixabay.com/en/chart-line-line-chart-diagram-trend-148256/
Generative Formative Summative
Formative methods
Throughout cycle
Informs decision-making
Varying fidelity
Improvement focus
Faster & lighter
Image: https://commons.wikimedia.org/wiki/File:Following_rough_paths_to_the_Green_Sand_Beach.jpg
Technique 1: Understanding problems
Evidence of problems
Potential opportunities
● Recent events
● Specific details
● Feelings and perceptions
● Future responses
Critical incidents
Image: https://www.flickr.com/photos/vfwnationalhome/12436173623
Reference: Flanagan, J.C. (1954). The Critical Incident Technique, Psychological Bulletin, 51(4), 327-358; http://www.usabilitynet.org/tools/criticalincidents.htm; Image: https://pixabay.com/en/photos/interview/
Critical Incident Process
1. Confirm user profile
2. Identify last time
3. Gather details
a. Description
b. Actions taken
c. Feelings
d. Outcome
e. Future actions
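For teams logging these interviews in a shared repository, the steps above map naturally onto a small record structure. A minimal sketch in Python; the class and field names are my own conventions, not part of Flanagan's method.

```python
from dataclasses import dataclass, asdict

@dataclass
class CriticalIncident:
    """One critical-incident interview record; fields mirror steps 1-3e."""
    user_profile: str    # step 1: confirmed user profile
    recency: str         # step 2: when the incident last happened
    description: str     # step 3a
    actions_taken: str   # step 3b
    feelings: str        # step 3c
    outcome: str         # step 3d
    future_actions: str  # step 3e

# Illustrative entry, loosely based on the sharing example in this deck
incident = CriticalIncident(
    user_profile="occasional ABC user",
    recency="about a week ago",
    description="needed to share a PDF with a friend via ABC",
    actions_taken="dragged the PDF to Files; it opened instead of uploading",
    feelings="surprised that drag-and-drop did not upload",
    outcome="created a folder, uploaded the PDF, shared the link",
    future_actions="would expect drag-and-drop to start an upload",
)
print(len(asdict(incident)), "fields captured")
```

Keeping every session in the same shape makes it easy to scan incidents later for recurring problems and opportunities.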
Example prompt
“Consider the last time you had to share something online. How long ago did this happen? What did you share? Describe the steps you took to share, and highlight any surprises or problems (if any) that happened.”
Example result
● Validates problem
● Identifies opportunities
● Clarifies expectations
● Details scenarios
● Builds empathy
“I needed to share a PDF with a friend, and we use ABC, but I hadn’t used it in a while. I logged in through my browser, dragged the PDF to Files, and then saw the PDF open. I expected ABC to start uploading it. I hit back, created a folder in ABC, uploaded the PDF to it, clicked share to add my friend, and sent her the link.”
Technique 2: Examining concepts
Scenario exploration
Initial confusions
Acceptability
Extensions and opportunities
● Description
● Flow and interaction
● Process illustration: steps,
images, storyboard, video
Reference: Carroll, J.M. (2003). Making Use: Scenario-Based Design of Human-Computer Interactions. MIT Press; Image: https://www.flickr.com/photos/brenneman/5273755180
Scenario exploration process
1. Present overall scenario
a. Ascertain understanding
b. Capture concerns/questions
2. Show steps of flow or interaction, capturing:
a. Concerns, confusions
b. Benefits, positives
c. Open questions
3. Gather final comments at end
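When several participants walk through the same flow, the comments captured in step 2 can be tallied per slide and per category to show where concerns cluster. A hypothetical sketch; the category labels follow the list above, and the data is invented.

```python
from collections import Counter

# (slide_number, category) pairs logged during step 2; data is illustrative
comments = [
    (1, "question"),
    (2, "concern"), (2, "concern"),
    (3, "concern"), (3, "concern"), (3, "benefit"),
    (4, "benefit"),
]

tally = Counter(comments)
slides = sorted({s for s, _ in comments})
# Flag slides where concerns outnumber benefits -- likely redesign candidates
concern_heavy = [
    s for s in slides
    if tally[(s, "concern")] > tally[(s, "benefit")]
]
print(concern_heavy)  # [2, 3] for this sample
```

Even a rough tally like this helps the team prioritize which steps of the flow to revisit first.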
Example prompt: Initial scenario
“Imagine you had a tool that provided the ability to export data from any document that contained numbers. You would be able to select the data you wanted, and the tool would export it to an analysis tool of your choice. Discuss your initial thoughts about this tool, highlighting any questions, concerns, or benefits that come to mind.”
Example prompt: Storyboard review
“The following 4 slides illustrate how you might interact with this tool. Review each slide, and comment on anything you find to be confusing, problematic, useful, or appealing about the concept.”
Example flow (comments gathered after each slide is presented)
[Storyboard: slides 1-4; the final slide shows a progress indicator at 100%]
Example feedback, slide 1
“Makes sense so far. I wonder how the tool will handle tables that are embedded in documents that contain a lot of text, such as labels or superimposed descriptions.”
Example feedback, slide 2
“I’m thinking that this process would require a lot of clicks, even for a small number of columns. It would be better if the tool automatically recognized a lot of this information, and then I could go in and review/modify it.”
Example feedback, slide 3
“I understand this process, but am concerned that people might give different names to the same data. You should embed best practices for naming here. Otherwise, the result could be messy.”
Example feedback, slide 4
“I like that it shows progress, but it seems that it should be pretty fast for documents that don’t have a lot of tables embedded in them. Will we be able to save the mappings? That could save time in the future.”
Example: Final comments
“It’s great that you don’t have to jump around different parts of the system to do this. It’s very valuable to be able to complete this from one place.”
“Hi, I wanted to follow up to reiterate that this is a REALLY COOL idea and it fills a much needed requirement for our use of the product. Please consider me for future studies like this, because we need this functionality!”
Technique 3: Gauging reactions
Initial impressions
Visual appeal
Goals and intentions
Specific content
● Interface preview
● Time limit
● Survey questions
Image: https://commons.wikimedia.org/wiki/File:Claude_Monet,_Impression,_soleil_levant.jpg
First impressions matter
Formed by ~50ms
● Primarily based on visual appeal
● Do not change with additional viewing time
Initial usability impression remains stable
● With 5s, 60s, and no time limit
● Even when site is manipulated to be more/less usable
References: Lindgaard, G., Fernandes, G., Dudek, C., & Brown, J.M. (2006). Attention web designers: You have 50 milliseconds to make a good first impression! Behaviour and Information Technology, 25(2),
115-126; http://usabilitynews.org/visual-appeal-vs-usability-which-one-influences-user-perceptions-of-a-website-more/; http://www.measuringu.com/five-second-tests.php
Impression test process
“View the following interface for a few seconds, and then answer the questions that follow.”
Present instructions → Show interface → Present questions
Present questions after viewing
The interface you just viewed appears:
Very Appealing - - - - - - - - - Very Unappealing
Very Easy - - - - - - - - - Very Hard
Very Efficient - - - - - - - - - Very Inefficient
What is the purpose of the interface?
Ratings and open feedback are helpful
“It looks very plain and almost ugly. It’s obviously some kind of class schedule. I remember seeing assignments and icons that look like standard office software. I could probably figure out how to use it, but I’m not sure I’d want to.”
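Across participants, the three semantic-differential scales can be averaged into a quick summary of first impressions. A minimal sketch, assuming a 1-9 scale where 1 is the positive pole; the sample ratings are invented.

```python
from statistics import mean

# Each participant rates the three scales from 1 (positive pole)
# to 9 (negative pole); these ratings are illustrative only
responses = [
    {"appeal": 7, "ease": 4, "efficiency": 5},
    {"appeal": 8, "ease": 5, "efficiency": 4},
    {"appeal": 6, "ease": 3, "efficiency": 6},
]

summary = {
    scale: round(mean(r[scale] for r in responses), 2)
    for scale in ("appeal", "ease", "efficiency")
}
print(summary)  # {'appeal': 7.0, 'ease': 4.0, 'efficiency': 5.0}
```

Pairing these means with the open-ended "purpose" answers shows not just how the interface scored, but why.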
Technique 4: Evaluating expectations
Image: https://commons.wikimedia.org/wiki/File:HK_TST_港威大廈_The_Gateway_entrance_lobby_interior_night_Sept-2013_glass_door.JPG
Expectation tests
Areas of confusion
Mental model
Names, categorization
● Static screens
● No text/labels
● Basic task information
Expectation process: “Greeking” technique
1. Identify important tasks
2. Create scenario
3. Write “first step” questions
4. Present interface without text
5. Ask for group/category names
Reference: Tullis, T.S. (1998). A method for evaluating Web page design concepts. CHI 98 Conference Summary on Human Factors in Computing Systems
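The "first step" answers from step 3 can be scored for agreement: the share of participants who chose the most common screen region for each task. A hypothetical sketch; the task and region names are invented for illustration.

```python
from collections import Counter

# Region each participant said they would click first, per task;
# tasks, regions, and data are all illustrative
first_clicks = {
    "see calendar": ["main nav", "main nav", "quick tools", "main nav"],
    "edit assignment": ["more details", "main nav", "more details", "more details"],
}

agreement = {}
for task, clicks in first_clicks.items():
    region, count = Counter(clicks).most_common(1)[0]
    agreement[task] = (region, count / len(clicks))

for task, (region, share) in agreement.items():
    print(f"{task}: {region} ({share:.0%} agreement)")
```

Low agreement on a task is a signal that the screen layout does not match users' mental model for that task.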
Example prompt: Scenario and instructions
Scenario: “You are a college instructor using a new online course management tool to schedule and track assignments and communicate with your students.”
Instruction: “Indicate where you would first click to start each task.”
“Greeking” activity: Individual tasks
How would you expect to:
1. See a calendar with class meetings?
2. Edit information about a specific homework assignment?
3. Send a secure email message to a student?
“Greeking” activity: Groupings
Identify and prioritize any groups of functionality
What would you call each group?
Example group names:
“1. Main navigation”
“2. Quick tools”
“3. More details”
Technique 5: Addressing usability problems
Look for problems
Assess effectiveness
Compare to competition
Validate performance
● Benchmark testing
● Discount usability
Both good, but...
Image: https://commons.wikimedia.org/wiki/File:Flickr_-_The_U.S._Army_-_Searching_for_opposing_forces.jpg
Fixing issues as a team
Task completion barriers
Improvement opportunities
Consensus on feasibility
● High fidelity interface
● Representative users
● Key stakeholders
Image: https://commons.wikimedia.org/wiki/File:Jigsaw_puzzle_01_by_Scouten.jpg
Rapid Iterative Testing & Evaluation (RITE)
Reference: Medlock, M.C., Wixon, D., Terrano, M., Romero, R., & Fulton, B. (2002). Using the RITE method to improve products: a definition and a case study, Presented at UPA conference.
Similar to conventional usability evaluation
Emphasis on fixing major problems between sessions
1. Schedule time for interface changes
2. Agree on critical tasks and success criteria
3. Record problems encountered
4. Discuss problems worth fixing
5. Implement fixes and continue
Result: Whole team understands issues better
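The RITE loop above can be tracked with a simple issue log: problems recorded per session (step 3), marked fixed between sessions (step 5), and reviewed for what remains open. A minimal sketch; the entries are invented, loosely echoing the ‘Save’ discussion in this deck.

```python
# Each entry: (session_number, problem, fixed_before_next_session);
# all entries are illustrative
issue_log = [
    (1, "'Save' implies settings will persist", True),
    (1, "Export icon not noticed", False),
    (2, "'Apply' label still ambiguous", True),
]

open_issues = [problem for _, problem, fixed in issue_log if not fixed]
fix_rate = sum(1 for _, _, fixed in issue_log if fixed) / len(issue_log)
print(open_issues)
print(f"{fix_rate:.0%} of recorded problems fixed between sessions")
```

Reviewing this log as a team at each step 4 discussion keeps the "what is worth fixing now" decision explicit and shared.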
Stakeholder: “Every participant expects saving. Is there a design that better conveys that it’s a one-time process?”
Stakeholder: “We could try the term ‘Apply’ plus some descriptive text to see if it changes expectations?”
Stakeholder: “It’s a great idea, but we won’t have the resources to implement saving. This can only be a one-time process.”
Participant: “It says ‘save’ but I’m just modifying the settings here. Unless I can re-use this later?”
Tips & Tricks
Order activities intentionally
Image: https://pixabay.com/en/photos/stack/
Test with each other
Image: http://www.publicdomainpictures.net/view-image.php?image=50475
Anticipate confusion
Image: https://www.flickr.com/photos/mythicseabass/4940173372
Recruit for the research purpose
Image: https://en.wikipedia.org/wiki/Human_billboard
Triangulate for support
Image: https://pixabay.com/en/parachute-paragliding-cat-and-mouse-1625978/
@sfadden on Twitter
slideshare.net/stevefadden1
Thank you!