
Presentation at the Twintide meeting in Copenhagen, 13th of October 2012

User Centred Evaluation in 'the Wild'

Marta Kristín Lárusdóttir

Overview for the Presentation

• Introduction to the topic

• My research focus

• The 3 studies I have conducted on this topic

• Conclusion

• Future work

What do I mean by 'in the Wild'?

What I mean is Software Development Practice

Scrum is popular in Iceland and Sweden

Scrum is feature oriented

• It is a process for planning software projects

• Usually there are 2–4 weeks between deliveries

[Diagram: Requirements analysis, Design, Implementation, Testing/Evaluation and Functionality of the system plotted over time – Traditional vs. Scrum]

The main rules in Scrum*

1. Work in iterations of no more than a month long

2. Be "done" with something
   – by the end of each iteration
   – to some pre-agreed-upon definition of done, and solicit feedback from your key stakeholders on it

3. Get together and figure out what you're doing
   – at the start of an iteration

4. Reflect on how well you did during the iteration
   – at the end of the iteration

5. Talk a lot during the iteration

* According to Mike Cohn (2012)

User Centred Evaluation

• Evaluation based on the user's perspective

– According to ISO 9241-210 from 2010

• This can include evaluating Usability and UX

• User-centred evaluation can be used to

– a) collect new information about user needs,
– b) provide feedback on strengths and weaknesses of the design solution from the user's perspective,
– c) assess whether user requirements have been achieved,
– d) establish baselines or make comparisons between designs.

Evaluation Methods

• Analytical evaluation

  1. An expert analyses according to supporting material
     • Gathering lists of usability problems

• Empirical evaluation

  2. Measuring users' performance
     • Quantitatively gathering data on time on task, user satisfaction, etc.

  3. Qualitatively, by observing users and asking users their opinion
     • Informal feedback from users
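To make the quantitative, empirical side of these methods concrete, the short Python sketch below shows how measures such as time on task and user satisfaction might be summarised after a test session. It is not from the presentation; the participant records, field names and numbers are hypothetical examples.

```python
# Minimal sketch (hypothetical data): summarising the kind of quantitative
# measures an empirical evaluation collects, e.g. time on task and a
# 1-5 satisfaction rating per participant.
from statistics import mean, stdev

# One record per participant for a single task (invented example values)
sessions = [
    {"participant": "P1", "task": "checkout", "time_s": 95,  "satisfaction": 4},
    {"participant": "P2", "task": "checkout", "time_s": 140, "satisfaction": 3},
    {"participant": "P3", "task": "checkout", "time_s": 118, "satisfaction": 5},
]

times = [s["time_s"] for s in sessions]
ratings = [s["satisfaction"] for s in sessions]

# Report mean and spread of time on task, plus mean satisfaction
print(f"Mean time on task: {mean(times):.1f} s (sd {stdev(times):.1f})")
print(f"Mean satisfaction: {mean(ratings):.2f} / 5")
```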

User Centred Evaluation 'in the Wild'

Usability and User Experience evaluation in software development using Scrum for project management

My Research Focus

• My main research questions:

1. How do IT professionals conduct user centred evaluation in the Scrum development process?

2. How does using Scrum affect their evaluation?

4 papers accepted on this subject

1. Larusdottir, M. K., Bjarnadottir, E., Gulliksen, J.: The Focus on Usability in Testing Practices in Industry. Proceedings of the Human Computer Interaction Symposium at the World Computer Congress 2010, Brisbane, Australia, September 2010.

2. Larusdottir, M. K., Cajander, A., Gulliksen, J.: Informal Feedback Rather Than Performance Measurements – User Centred Evaluation in Scrum Projects. Accepted to the journal Behaviour and Information Technology, October 2012.

3. Larusdottir, M. K., Cajander, A., Gulliksen, J.: The Big Picture of UX is Missing in Scrum Projects. Proceedings of the Ix-Used workshop at NordiCHI, October 2012.

4. Jia, Y., Larusdottir, M. K., Cajander, A.: The Usage of Usability Techniques in Scrum Projects. Proceedings of the HCSE 2012 conference in Toulouse, October 2012.

State-of-the-Art

• The complexity of evaluation should be studied – results from research studies fail the practitioner, Wixon (2003)

• Asking IT professionals, mainly in surveys, about:

  – The usage of evaluation methods
    • Bygstad et al. (2008), Vukelja et al. (2007), Rosenbaum (2000)

  – Frequency of evaluation
    • Bak et al. (2008), Ardito et al. (2011)

  – Purpose of the evaluation
    • Vermeeren et al. (2011), Venturi et al. (2006), Monahan et al. (2008), Bark et al. (2006)

• No study on evaluation according to the process used

Study 1: Survey study in Iceland

• Conducted in 2009

• 25 participants from 18 companies

Results – Survey on Scrum Projects

Testing technique | Yes, a lot | Yes, some | So and so | Little | No, not at all
Unit/component testing | 22% | 35% | 26% | 13% | 4%
Integration testing | 17% | 35% | 31% | 13% | 4%
System testing | 39% | 30% | 22% | 9% | 0%
Acceptance testing | 30% | 44% | 13% | 13% | 0%
Usability testing | 4% | 22% | 35% | 35% | 4%
Alpha testing | 4% | 13% | 17% | 17% | 48%
Beta testing | 9% | 22% | 9% | 17% | 44%
Performance/load testing | 0% | 26% | 26% | 35% | 13%
Security testing | 4% | 22% | 8% | 39% | 26%

Results – Survey on Scrum Projects

Testing technique | Lack of training/knowledge | Lack of budget | Lack of time | Other | N/A | N
Unit/component testing | 36% | 0% | 32% | 5% | 27% | 22
Integration testing | 11% | 0% | 42% | 0% | 47% | 19
System testing | 7% | 0% | 47% | 0% | 47% | 15
Acceptance testing | 7% | 0% | 27% | 7% | 60% | 15
Usability testing | 20% | 15% | 35% | 10% | 20% | 20
Alpha testing | 0% | 11% | 11% | 10% | 68% | 19
Beta testing | 0% | 11% | 17% | 11% | 61% | 18
Performance/load testing | 26% | 11% | 32% | 0% | 32% | 19
Security testing | 47% | 5% | 16% | 0% | 32% | 19

Study 2: Interview study in Sweden

• Conducted in 2010

• 21 participants working for 14 companies using Scrum

Results: Interviews

Evaluation method used, by professional role (types of evaluation ranging from empirical quantitative, through empirical qualitative, to analytical qualitative):

Professional Role | N | Measuring user performance and surveys | Observing users | Asking users their opinions | Feedback from user surrogates | Inspection evaluation
Usability experts | 5 | 2 | 5 | 4 | 1 | 5
Interaction designers | 7 | 1 | 5 | 6 | 3 | 3
Business analysts | 4 | 0 | 1 | 4 | 0 | 1
Developers | 2 | 0 | 0 | 1 | 1 | 2
Scrum managers | 3 | 1 | 2 | 3 | 2 | 1
Total | 21 | 4 | 13 | 18 | 7 | 12

Study 2 Results: Interviews in Sweden

Purpose of the evaluation, by professional role:

Professional Role | N | Feedback on Context of Use | Feedback on User Requirements | Feedback on Design
Usability experts | 5 | 3 | 4 | 5
Interaction designers | 7 | 5 | 6 | 6
Business analysts | 4 | 3 | 4 | 3
Developers | 2 | 1 | 0 | 2
Scrum managers | 3 | 0 | 2 | 3
Total | 21 | 12 | 16 | 19

Study 2: Two success stories

• The big picture for UX is Missing in Scrum projects

– A vital activity is visioning in the pre-phase

• There is a lack of responsibility for UX

• The Scrum context affects the usability professionals' work

– Scrum is feature oriented

– Hard to find time for the User Centred Evaluation

• Informal cooperation emphasised in Scrum

– Much informal evaluation is conducted

Study 3: Usability techniques in Scrum

• Conducted in 2011 in Sweden

• 49 participants

Study 3 Results: Survey in Sweden


Main conclusions

• Formal user centred evaluation not that common

– But it gets a high rating from the practitioners

• Informal evaluation is conducted by many

– Especially evaluating in the design phase

– Especially in the results from the interview study

• Heuristic evaluation not commonly used

– But half of the participants in the interviews do expert reviews

• Participants do not name the methods/approaches they are using

Future work

• How is the deployment managed in Scrum projects?

• How is UCD integrated in Scrum projects?

• How is UCD conducted in the games industry?

• How are user requirements specified before production starts?
