Find the Balance Between Qualitative and Quantitative Web Site Usability Measures
John Lovett, Senior Analyst, JupiterResearch ([email protected])
Jeff Schueler, President, Usability Sciences ([email protected])
Usability Sciences Overview
• A leading provider of qualitative usability testing research since 1988
– We were doing usability before it was cool
• A leading provider of quantitative user experience research since 2000
– Our focus is the measurement and improvement of the user experience and effectiveness of web sites
– Patented technology: WebIQ®
• Our unique combination of qualitative and quantitative capabilities and methodologies sets us apart from our competitors
• We provide our research services across the country and around the world
Balancing Qualitative and Quantitative Web Site Usability Measures
John Lovett
May 21, 2008
© 2008 JupiterResearch, LLC
Usability is Persistently a Top Challenge for Web Site Operators
Question: Indicate your top 3 challenges in operating your company's primary, externally-facing Web site? (JupiterResearch Web Site Decision-Makers Survey)
• 2004: Improving Usability of the Site (#1 answer). Source: JupiterResearch/ERI Executive Survey, 12/04, n=235 (US only)
• 2005: Improving Usability of the Site (#2 answer). Source: JupiterResearch/ERI Executive Survey, 12/05, n=251 (US only)
• 2006: Improving Usability of the Site (#1 answer). Source: JupiterResearch/ERI Executive Survey, 12/06, n=250 (US only)
Usability Challenges are Universal
[Chart: top challenges (demonstrating return on investment (ROI), prioritizing Web development initiatives, improving usability of site) broken out by company revenue, from "$50M to less than $100M" through "More than $3B"; x-axis: percentage of Web decision makers]
Question: Please indicate your top 3 challenges in operating your company's primary, externally-facing Web site? (Please select up to three.)
Source: JupiterResearch/ERI Executive Survey, 12/06, n=250 (US only)
Understanding Behavior, Satisfaction and Success Requires Different Tools
Usability, Behavior, Intent, Satisfaction, Success
Technologies Counterbalance Expertise and Substantiate Qualitative Assumptions
[Chart: tactics currently used to measure Web site usability, split into qualitative usability resources (working with external usability consultants, formal usability lab testing, conducting focus groups, developing in-house usability expertise) and quantitative usability measures (using A/B testing tools, measuring customer satisfaction, monitoring Web analytics data); x-axis: percentage of Web decision makers]
Question: Which of the following tactics do you currently use to measure website usability? (Please select all that apply.)
Source: JupiterResearch/ERI Executive Survey (12/06), n=250 (US only)
Polling Question:
Which of the following tools do you currently use to measure website usability? (Please select all that apply.)
– Web analytics
– Satisfaction surveys
– Multivariate or A/B testing
– Exit surveys
– In-house expertise
– Focus groups
– Lab testing
– External usability consultants
– Eye tracking
– None of the above
Highly Effective Tactics Require a Mix of Qualitative & Quantitative Methods
[Chart: tactics rated most effective (using A/B testing tools, formal usability lab testing, measuring customer satisfaction, conducting focus groups, developing in-house usability expertise, monitoring Web analytics data), with the qualitative measures highlighted; x-axis: percentage of Web decision makers]
Question: Which of the following tactics have been most effective for improving the usability for your website in the last 12 months? (Please select up to 3.)
Source: JupiterResearch/ERI Executive Survey (12/06), n=250 (US only)
JupiterResearch’s Usability Framework Balances Methods
Source: JupiterResearch (5/08)
• Automated (A/B) testing framework to test hypotheses while expediting time to market
• Traffic and performance analyses to mine clickstream data and monitor for performance impact
• Voice of Customer capture to tie feedback to clickstream and split-path testing
• Traditional usability principles to ground the framework in established "best practices"
Traditional Usability Principles
• Develop a usability process
• Understand, Prioritize, Implement, Optimize
• Cultivate internally – Fortify externally
• Adhere to best practices
• Resist throwing technology at the problem
Traffic & Performance Analyses
• Ensure clickstream visibility
• Find the holes in your conversion funnel
• Capture the What of onsite behavior
• Seek to understand intent
• Deliver reliability and consistency
• Usability exceeds design
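The "find the holes in your conversion funnel" step above can be sketched as a drop-off calculation over aggregated clickstream counts. This is a generic illustration, not anything the deck prescribes; the step names and visit counts are hypothetical:

```python
# Minimal funnel drop-off analysis over aggregated clickstream counts.
# Step names and visit counts are hypothetical, for illustration only.
funnel = [
    ("home", 10000),
    ("product_page", 6200),
    ("cart", 1800),
    ("checkout", 900),
    ("confirmation", 540),
]

def funnel_report(steps):
    """Return (step, reach_rate, drop_from_previous) for each funnel step."""
    report = []
    first = steps[0][1]
    prev = first
    for name, count in steps:
        reach = count / first     # share of entrants reaching this step
        drop = 1 - count / prev   # share lost since the previous step
        report.append((name, reach, drop))
        prev = count
    return report

for name, reach, drop in funnel_report(funnel):
    print(f"{name:>14}: reach {reach:6.1%}  drop-off {drop:6.1%}")
```

The step with the largest drop-off (here, the product page to cart transition) is the "hole" worth investigating with the qualitative methods the deck pairs with analytics.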
Voice of Customer Capture
• Quantify Satisfaction
• Extend further to gauge Success
• Manage unsolicited feedback
• Amass social awareness
• Provide a vehicle for feedback
• Perform damage control
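One common way to "quantify satisfaction" from survey responses is a top-two-box score over a 5-point scale. This is a generic sketch, not the deck's (or WebIQ's) actual scoring method, and the sample ratings are hypothetical:

```python
# Top-two-box satisfaction score from 1-5 survey ratings.
# The ratings list is hypothetical sample data for illustration.
def top_two_box(ratings):
    """Share of respondents rating 4 or 5 on a 1-5 satisfaction scale."""
    if not ratings:
        return 0.0
    return sum(1 for r in ratings if r >= 4) / len(ratings)

ratings = [5, 4, 3, 5, 2, 4, 4, 1, 5, 3]
print(f"Satisfaction (top-two-box): {top_two_box(ratings):.0%}")  # 6 of 10 -> 60%
```

Tracked over time and segmented by clickstream behavior, a score like this turns open-ended feedback into a trendable metric, which is the pairing of attitudes and behavior the next slide argues for.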
Increased Visibility into Attitudes & Behavior Drives Satisfaction
[Chart: customer satisfaction rises with adoption of usability measurement technologies (CEM, VOC, analytics), moving from a lack of visibility into customer interactions to capture of implicit and explicit customer experience data]
Source: JupiterResearch (11/07)
Automated Testing Framework
• Deploy a Control
• Test Alternative Hypotheses
• Use Testing as Segmentation Tool
• Be Cognizant of Demo/Psycho-graphics
• Address Improvement not Problems
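The "deploy a control" and "use testing as a segmentation tool" points above both depend on visitors being assigned to cells deterministically, so the same person sees the same experience on every visit and can be segmented by cell later. A minimal hash-based sketch (one common approach; cell names and weights here are hypothetical):

```python
import hashlib

# Deterministic assignment of visitors to a control or test cell.
# Hashing the visitor id keeps each visitor in the same cell on
# every visit, so behavior can later be segmented by cell.
def assign_cell(visitor_id, experiment,
                cells=("control", "variant_a"), weights=(0.5, 0.5)):
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    point = int(digest[:8], 16) / 0xFFFFFFFF  # uniform in [0, 1]
    cumulative = 0.0
    for cell, weight in zip(cells, weights):
        cumulative += weight
        if point <= cumulative:
            return cell
    return cells[-1]

print(assign_cell("visitor-42", "homepage_hero"))
```

Salting the hash with the experiment name keeps assignments independent across concurrent tests, so one experiment's split does not correlate with another's.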
A Balanced Approach Provides a Holistic Perspective
• Qualitative: paper designs, focus groups, participatory design, card sorting, diary studies, eye tracking
• Quantitative: performance analytics, A/B or multivariate testing, satisfaction indexes, online testing
Source: JupiterResearch (5/08)
[Chart: usability measures (problem identification, clickstream activity, customer feedback, behavioral analysis, focus groups/labs, visual & iterative design, satisfaction scoring) mapped to usability solutions:]
• Performance Monitoring (Gomez, Keynote)
• Web Analytics (Coremetrics, Omniture)
• VOC technologies (ForeSee, iPerceptions)
• Design & Development (BitGroup, Allurent)
• Agencies & Usability Specialists (Usability Sciences, Razorfish)
• Usability Technologies (Morae, Eye Tracking)
• CEM technologies (Coradiant, Tealeaf)
Source: JupiterResearch (5/08)
Understanding the jcp.com Shopper
Ideas for Understanding the Shopper: Competitive Benchmarking, Survey Existing Site Visitors, Persona Development
• jcp.com Scorecard
– Scorecard of end-to-end experience vs. competitors
• Site Survey
– Collect survey and behavioral data to best understand the customer experience
• Customer Feedback
– Town hall meetings
– Call monitoring
Understanding the jcp.com Shopper
Ideas for Understanding the Shopper: Design Walk-Through, Rapid Iterative Testing, Information Architecture, A/B Testing in Design
• User needs
– Use cases and workflows
– Design concepts and wireframes
– Prototyping
• Information Architecture
• Usability Panel
• Usability Testing
• Benchmark Usability Study
– Existing vs. new functionality
Understanding the jcp.com Shopper
Ideas for Understanding the Shopper: Competitive Benchmark Analysis, Usability Testing, A/B Testing in Production
• A/B and Multivariate Testing
– A/B testing: test a factor, such as a graphic, button, or image, against one or more variations to see which is most persuasive.
– Multivariate testing: enables testing many changes simultaneously. Evaluating the impact of combinations of factors and variations often reveals significant interaction effects that can have a dramatic impact on your conversion goals.
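Deciding whether a variation really is "most persuasive" comes down to comparing conversion rates with enough traffic to rule out chance. A standard two-proportion z-test is one way to do that (a generic sketch, not a method the deck specifies; the conversion counts are hypothetical):

```python
import math

# Compare conversion rates of a control and a variant with a
# two-proportion z-test. Counts are hypothetical illustration data.
def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z statistic for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(f"z = {z:.2f}")  # |z| > 1.96 -> significant at the 5% level
```

Multivariate tests need more care than this pairwise check, since testing many factor combinations multiplies the cells and the required sample size, which is why the interaction effects mentioned above take longer to detect.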
Understanding the jcp.com Shopper
Ideas for Understanding the Shopper: Survey Existing Site Visitors, Usability Test Incremental Enhancements
• Continue site surveys
– Continuous measurement of the customer experience
• Detailed clickstream analysis
– Post-implementation of major releases
Questions?
John Lovett, Senior Analyst, JupiterResearch ([email protected])
Jeff Schueler, President, Usability Sciences ([email protected])