AMOST 2009
Experimental Comparison of Code-Based and Model-Based Test Prioritization
Bogdan Korel, Computer Science Department, Illinois Institute of Technology, Chicago, USA
George Koutsogiannakis, Computer Science Department, Illinois Institute of Technology, Chicago, USA


Slide 1: Title and authors (as above)

Slide 2: Outline

- Introduction
- Test prioritization
- Code-based test prioritization
- System modeling
- Model-based test prioritization
- Measuring the effectiveness of test prioritization
- Experimental study
- Conclusions

Slide 3: Outline (same as Slide 2)

Slide 4: Introduction

- During maintenance of evolving software systems, their specification and implementation are changed.
- Regression testing validates the changes made to the system.
- Existing regression testing techniques:
  - Code-based
  - Specification-based

Slide 5: Introduction

- During regression testing, after the modified part of the system has been tested, the modified system needs to be retested using the existing test suite.
- Retesting the system may be very expensive.
- Testers are interested in detecting faults in the system as early as possible during the retesting process.

Slide 6: Outline (same as Slide 2)

Slide 7: Test Prioritization

- We consider test prioritization with respect to early fault detection.
- The goal is to increase the likelihood of revealing faults early during execution of the prioritized test suite.

Slide 8: Test Prioritization

Let TS = {t1, t2, …, tN} be a test suite.

Question: in which order should the tests be executed?

1. t1, t2, …, tN-1, tN
2. tN, tN-1, …, t2, t1
3. …

Slide 9: Test Prioritization

Suppose test t2 is the only test in TS that fails.

1. t1, t2, …, tN-1, tN: early fault detection
2. tN, tN-1, …, t2, t1: late fault detection

Slide 10: Prioritization Methods

Existing test prioritization methods:
- Random prioritization
- Code-based prioritization: tests are ordered according to some criterion, e.g., so that code coverage is achieved at the fastest rate
- Model-based test prioritization: information about the system model is used to prioritize the test suite for system retesting

Slide 11: Prioritization Methods

Perform an experimental study to compare:
- Code-based prioritization
- Model-based test prioritization

Slide 12: Outline (same as Slide 2)

Slide 13: Code-based Test Prioritization

The idea of code-based test prioritization is to use the source code of the system to prioritize the test suite.

Slide 14: Code-based Test Prioritization

- The original system is executed on the whole test suite.
- Information about the execution of the original system is collected.
- The collected information is used to prioritize the test suite for the modified system.
- Executing the test suite on the original system may be expensive.

Slide 15: System retesting

[Diagram: the original implementation is modified, yielding the modified implementation; the existing test suite is used to retest the modified implementation.]

Slide 16: Code-based Test Prioritization

[Diagram: code-based test prioritization uses the original implementation and the test suite to produce a prioritized test suite, which is run against the modified implementation.]

Slide 17: Code-based Test Prioritization

[Diagram: the test suite is executed on the original implementation to collect test execution information; the prioritization algorithm uses this information to produce the prioritized test suite.]

Slide 18: Code-based Test Prioritization

Several code-based test prioritization methods:
- Total statement coverage
- Additional statement coverage
- Total function coverage
- Additional function coverage
- …

Slide 19: Code-based Test Prioritization

Information collected for each test during execution of the original system:
- Total statement coverage: the number of statements executed
- Additional statement coverage: the list of statements executed
- Total function coverage: the number of functions executed
- Additional function coverage: the list of functions executed
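The per-test information listed above can be collected with standard tracing. The following is a small illustration using Python's `sys.settrace`, not the instrumentation used in the study; `classify` is a made-up system under test:

```python
import sys

def trace_statements(fn, *args):
    """Run fn(*args) and record the set of executed line numbers
    (a stand-in for the 'list of statements executed' per test)."""
    executed = set()

    def tracer(frame, event, arg):
        # 'line' events fire once per executed source line in fn's frame.
        if event == "line" and frame.f_code is fn.__code__:
            executed.add(frame.f_lineno)
        return tracer

    sys.settrace(tracer)
    try:
        fn(*args)
    finally:
        sys.settrace(None)
    return executed

def classify(x):          # toy system under test
    if x > 0:
        return "positive"
    return "non-positive"

# Each 'test' yields its executed-statement set (additional coverage)
# and its size (total coverage).
lines_pos = trace_statements(classify, 1)
lines_neg = trace_statements(classify, -1)
print(len(lines_pos), len(lines_neg))
```

Total statement coverage keeps only `len(...)` per test, while additional statement coverage keeps the sets themselves so a greedy algorithm can compute how much new coverage each test adds.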

Slide 20: Code-based Test Prioritization

Several code-based test prioritization methods:
- Total statement coverage
- Additional statement coverage (Heuristic #1)
- Total function coverage
- Additional function coverage
- …

Slide 21: Additional Statement Coverage

- Gives each statement the same opportunity to be executed during software retesting.
- A higher priority is assigned to the test that covers the largest number of not-yet-executed statements.

Slide 22: Additional Statement Coverage

Executed statements for each test:

t1: S1, S2, S3
t2: S1, S5, S8, S9
t3: S1, S5, S7
t4: S1, S5, S3, S4
t5: S1, S2, S7
t6: S1, S2
t7: S1, S2, S4
t8: S1, S2, S3, S4, S7
t9: S1, S6
t10: S1, S2

Slides 23-30: Additional Statement Coverage (worked example)

Step 1: t8 covers the largest number of not-yet-covered statements (S1, S2, S3, S4, S7), so it is selected first.
S: t8
Covered statements: S1, S2, S3, S4, S7

Step 2: of the remaining tests, t2 adds the most new statements (S5, S8, S9), so it is selected next.
S: t8, t2
Covered statements: S1, S2, S3, S4, S5, S7, S8, S9

Step 3: t9 adds the only still-uncovered statement (S6), so it is selected next.
S: t8, t2, t9
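The selection steps above follow a greedy loop. A minimal sketch (not the authors' implementation; it assumes every test covers at least one statement, and resets coverage when no test adds anything new, a common convention for this heuristic):

```python
def additional_coverage_prioritize(coverage):
    """Greedy 'additional statement coverage' prioritization:
    repeatedly pick the test covering the most not-yet-covered
    statements; when no test adds new coverage, start a new round."""
    remaining = dict(coverage)
    covered = set()
    order = []
    while remaining:
        # Pick the test that adds the most new statements.
        best = max(remaining, key=lambda t: len(remaining[t] - covered))
        if not remaining[best] - covered:
            covered = set()   # everything reachable is covered: reset
            continue
        order.append(best)
        covered |= remaining.pop(best)
    return order

# Test data from the slides.
tests = {
    "t1": {"S1", "S2", "S3"}, "t2": {"S1", "S5", "S8", "S9"},
    "t3": {"S1", "S5", "S7"}, "t4": {"S1", "S5", "S3", "S4"},
    "t5": {"S1", "S2", "S7"}, "t6": {"S1", "S2"},
    "t7": {"S1", "S2", "S4"}, "t8": {"S1", "S2", "S3", "S4", "S7"},
    "t9": {"S1", "S6"}, "t10": {"S1", "S2"},
}
print(additional_coverage_prioritize(tests)[:3])  # ['t8', 't2', 't9']
```

The first three selections reproduce the slides: t8 (5 new statements), then t2 (adds S5, S8, S9), then t9 (adds S6).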

Slide 31: Outline (same as Slide 2)

Slide 32: System Modeling

- State-based modeling is used for state-based systems, i.e., systems characterized by a set of states and transitions between states that are triggered by events.
- System modeling is very popular for state-based systems such as control systems, communications systems, embedded systems, …

Slide 33: System Modeling

Several modeling languages have been developed to model state-based software systems:
- EFSM: Extended Finite State Machine
- SDL: Specification and Description Language
- VFSM: Virtual Finite State Machine
- Statecharts
- …

Slide 34: System Modeling (same as Slide 33)

Slide 35: Extended Finite State Machine

An EFSM consists of:
- States
- Transitions

Slide 36: EFSM Transition

[Diagram: State 1 → State 2, labeled Event(p)[Condition]/Action(s): when Event arrives with parameter p and Condition holds, the transition fires, the Action(s) are executed, and the machine moves from State 1 to State 2.]
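The Event(p)[Condition]/Action(s) notation can be read operationally. A minimal executable sketch (hypothetical class and field names, not tied to any particular EFSM tool; the source and target states chosen for the example transition are assumptions):

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Transition:
    # Event(p)[Condition]/Action(s): source --event--> target,
    # guarded by `condition`, executing `action` when it fires.
    source: str
    target: str
    event: str
    condition: Callable[[dict, int], bool]
    action: Callable[[dict, int], None]

@dataclass
class EFSM:
    state: str
    vars: dict = field(default_factory=dict)

    def fire(self, transitions, event, p=0):
        """Fire the first enabled transition for `event`, if any."""
        for t in transitions:
            if (t.source == self.state and t.event == event
                    and t.condition(self.vars, p)):
                t.action(self.vars, p)
                self.state = t.target
                return t
        return None  # no enabled transition: the event is ignored

# A transition shaped like T4 of the sample cruise-control model:
# Set(c)[(c >= n) && (c <= m)] / r = c; CCS = activated
t4 = Transition(
    source="S1", target="S2", event="Set",
    condition=lambda v, c: v["n"] <= c <= v["m"],
    action=lambda v, c: v.update(r=c, CCS="activated"),
)
m = EFSM(state="S1", vars={"n": 40, "m": 65, "r": 0, "CCS": "on"})
m.fire([t4], "Set", 50)
print(m.state, m.vars["r"], m.vars["CCS"])  # S2 50 activated
```

With Set(70), the guard (40 <= 70 <= 65) fails, so the transition does not fire and the machine stays in its current state.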

Slides 37-40: EFSM Transition (the same diagram, repeated)

Slide 41: Sample System Model

[EFSM of a cruise control system (CCS). States: Start, S1, S2, S3, Exit. Transitions:
- T1: On() / CCS=on; n=40; m=65; r=0
- T2: SensorSpeed(c) [c >= r] / display(c)
- T4: Set(c) [(c >= n) && (c <= m)] / r=c; CCS=activated
- T7: Coast(c) [c >= m] / r=r-5; display(r)
- T9: Brake() / CCS=suspended
- T10: Accelerate(c) / r=c; CCS=suspended
- T11: Resume(c) [(c >= n) && (c < m)] / r=c; CCS=activated
- T13, T14, T15: Off() / CCS=off]

Slide 42: State-Based Models

- We assume that models are executable, i.e., the model contains enough detail to be executed.
- An input t (a test) to a model is a sequence of events together with the input values associated with those events.

Slide 43: System Model

Input events: On(), Set(50), SensorSpeed(50), Coast(70), Brake(), Resume(55), Accelerate(50), Off()

[Same EFSM diagram as Slide 41.]

Slides 44-48: System Model (the same slide, repeated)

Slide 49: System Model

Input (test): On(), Set(50), SensorSpeed(50), Coast(70), Brake(), Resume(55), Accelerate(50), Off()

Transition sequence: T1, T2, T4, T7, T9, T11, T10, T15

[Same EFSM diagram as Slide 41.]

Slide 50: Outline (same as Slide 2)

Slide 51: Model-based Test Prioritization

The idea of model-based test prioritization is to use the system model(s) to prioritize the test suite. Two cases are distinguished:
- the model is modified
- the model is not modified

Slide 52: System retesting – model is modified

[Diagram: the original model is modified into the modified model, and the original implementation is modified into the modified implementation; the test suite is used to retest the modified implementation.]

Slide 53: Model-based Test Prioritization

[Diagram: model-based test prioritization uses the original model, the modified model, and the test suite to produce a prioritized test suite, which is run against the modified implementation.]

Slide 54: Model-based Test Prioritization (same as Slide 51)

Slide 55: System retesting – model is not modified

[Diagram: the model is unchanged while the implementation is modified into the modified implementation; the test suite is used to retest the modified implementation.]

Slide 56: Model-based Test Prioritization – model is not modified

[Diagram: model-based test prioritization uses the (unchanged) model and the test suite to produce a prioritized test suite, which is run against the modified implementation.]

Slide 57: Model-based Test Prioritization

[Diagram: the test suite is executed on the model to collect test execution information; the prioritization algorithm uses this information together with the marked model elements to produce the prioritized test suite.]

Slide 58: [Same EFSM diagram as Slide 41.]

Slide 59: Source code

[Same EFSM diagram as Slide 41, shown alongside the corresponding source code.]

Slide 60: Source code related to Brake is modified

[Same EFSM diagram as Slide 41.]

Slide 61: (same as Slide 60)

Slide 62: Source code related to Brake is modified; source code related to Coast is modified

[Same EFSM diagram as Slide 41.]

Slide 63: (same as Slide 62)

Slide 64: Model-based Test Prioritization

- The model is executed on the whole test suite.
- Information about the execution of the model is collected.
- The collected information is used to prioritize the test suite.
- Executing the test suite on the model is inexpensive (very fast) compared to executing the system.

Slide 65: Model-based Test Prioritization

Model-based test prioritization methods:
- Selective test prioritization
- Model-based prioritization based on:
  - the number of executed marked transitions
  - the list of executed marked transitions
- Model dependence-based test prioritization (based on the sequence of executed transitions)
- …

Slide 66: Model-based Test Prioritization (same as Slide 65)

Slide 67: Selective Test Prioritization

The idea of selective test prioritization:
- Assign high priority to tests that execute at least one marked transition in the model.
- Assign low priority to tests that do not execute any marked transition in the model.

Slide 68: Selective Test Prioritization

- During system retesting, high-priority tests are executed first; low-priority tests are executed later.
- High-priority tests are placed in a random order, and low-priority tests are likewise placed in a random order.

[Diagram: the test suite is split into TSH (high-priority tests) and TSL (low-priority tests).]
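Selective test prioritization amounts to a two-bucket shuffle. A small sketch (the input format, mapping each test to the set of marked transitions it executes, is an assumption for illustration):

```python
import random

def selective_prioritize(executed_marked, rng=random):
    """Split tests into TSH (execute >= 1 marked transition) and
    TSL (execute none); shuffle each bucket, TSH first."""
    high = [t for t, marked in executed_marked.items() if marked]
    low = [t for t, marked in executed_marked.items() if not marked]
    rng.shuffle(high)
    rng.shuffle(low)
    return high + low  # TSH followed by TSL

# Hypothetical data: t1 and t3 touch marked transitions, t2 and t4 do not.
executed = {"t1": {"T9"}, "t2": set(), "t3": {"T7", "T9"}, "t4": set()}
order = selective_prioritize(executed)
# t1 and t3 (in some order) come before t2 and t4 (in some order)
```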

Slide 69: Model-based Test Prioritization

Several model-based test prioritization methods:
- Selective test prioritization
- Model-based prioritization based on:
  - the number of executed marked transitions
  - the list of executed marked transitions (Heuristic #2)
- Model dependence-based test prioritization (based on the sequence of executed transitions)
- …

Slide 70: Heuristic #2

- Gives each marked transition the same opportunity to be executed during software retesting.
- A higher priority is assigned to a test that executes marked transitions that have been executed the fewest times so far during retesting.
- The heuristic balances the number of executions of marked transitions by keeping a counter for each marked transition.

Page 71: AMOST 20091 Experimental Comparison of Code-Based and Model-Based Test Prioritization Bogdan Korel Computer Science Department Illinois Institute of Technology

AMOST 2009 71

Heuristics #2Executed marked transitions t1: T1, T2, T3 count(T1)=0 t2 : T3, T4, T5 count(T2)=0 t3 : T3, T4 count(T3)=0

t4 : T5 count(T4)=0t5 : T1 count(T5)=0t6 : T1, T2

t7 : T2, T4

t8 : T2, T3, T4

t9 : t10 :



Heuristic #2 example (after selecting t8, which executes T2, T3, T4)

Test  Executed marked transitions    Counters
t1    T1, T2, T3                     count(T1) = 0
t2    T3, T4, T5                     count(T2) = 1
t3    T3, T4                         count(T3) = 1
t4    T5                             count(T4) = 1
t5    T1                             count(T5) = 0
t6    T1, T2
t7    T2, T4
t9    (none)
t10   (none)

S: t8


Heuristic #2 example (after selecting t2, which executes T3, T4, T5)

Test  Executed marked transitions    Counters
t1    T1, T2, T3                     count(T1) = 0
t3    T3, T4                         count(T2) = 1
t4    T5                             count(T3) = 2
t5    T1                             count(T4) = 2
t6    T1, T2                         count(T5) = 1
t7    T2, T4
t9    (none)
t10   (none)

S: t8, t2
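The counter updates in the example above can be checked with a few lines of Python: selecting a test simply increments the counter of every marked transition it executes (a sketch, not the authors' tool).

```python
# Counters for the five marked transitions, all initially zero.
counts = {"T1": 0, "T2": 0, "T3": 0, "T4": 0, "T5": 0}

# Selecting t8 (T2, T3, T4) and then t2 (T3, T4, T5), as in S: t8, t2.
for covered in ({"T2", "T3", "T4"}, {"T3", "T4", "T5"}):
    for t in covered:
        counts[t] += 1

print(counts)
```

This reproduces the counter values shown after t8 and t2 have been selected: count(T1) = 0, count(T2) = 1, count(T3) = 2, count(T4) = 2, count(T5) = 1.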


Outline

Introduction
Model-based testing
Test prioritization
Code-based test prioritization
Model-based test prioritization
Measuring the effectiveness of test prioritization
Experimental study
Conclusions


Measuring the Effectiveness of Test Prioritization

• Test prioritization methods may generate many different solutions (prioritized test sequences) for a given test suite.

• A factor that may influence the resulting prioritized test sequence is, for example, the order in which tests are processed during the prioritization process


Measuring the Effectiveness of Test Prioritization

• In order to compare different test prioritization methods, several measures have been introduced
• The rate of fault detection is a measure of how rapidly a prioritized test sequence detects faults. It is a function of the percentage of faults detected in terms of the fraction of the test suite executed


Measuring the Effectiveness of Test Prioritization

• In our study we concentrated on one fault d
• To compare different prioritization methods, we use the concept of the most likely position of the first failed test that detects fault d
• The most likely position represents the average (most likely) position of the first failed test that detects fault d over all possible prioritized test sequences that may be generated by a test prioritization method (for a given system under test and test suite)


Measuring Effectiveness of Early Fault Detection

The most likely (average) position of the first failed test that detects fault d:

MLP(d) = ( Σ_{i=1}^{N} i · R(i, d) ) / M

• R(i, d): number of prioritized test sequences for which the first failed test is in position i
• M: number of all possible prioritized test sequences
• d: fault
• N: number of tests in the test suite


Measuring Effectiveness of Early Fault Detection

The most likely relative position in test suite TS of the first failed test that detects fault d:

RP(d) = MLP(d) / N

0 < RP(d) ≤ 1
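For a toy test suite (not taken from the study), MLP(d) and RP(d) can be computed exactly by enumerating all M = N! orderings; the test names below are illustrative.

```python
from itertools import permutations

tests = ["t1", "t2", "t3", "t4"]   # N = 4 tests
failed = {"t2", "t4"}              # tests that detect fault d

N = len(tests)
# Position (1-indexed) of the first failed test in every possible ordering.
positions = [next(i for i, t in enumerate(seq, start=1) if t in failed)
             for seq in permutations(tests)]

M = len(positions)                 # M = 4! = 24 prioritized sequences
MLP = sum(positions) / M           # same as sum of i * R(i, d), divided by M
RP = MLP / N

print(MLP, RP)                     # 1.666..., 0.41666...
```

For random ordering this matches the known closed form (N + 1) / (k + 1) = 5/3 for k = 2 failed tests out of N = 4.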


Measuring Effectiveness of Early Fault Detection

For some heuristics (e.g., random prioritization), an analytical approach can be used to compute MLP precisely

However, for many test prioritization methods, deriving a precise formula for RP(d), the most likely relative position of the first failed test that detects fault d, may be very difficult

Therefore, we implemented a randomized approach to estimating RP(d) for all five heuristic methods


Measuring Effectiveness of Early Fault Detection

The estimation randomly generates prioritized test sequences according to a given test prioritization heuristic

For each generated sequence, the position of the first failed test in the sequence is determined

After a large number of test sequences has been generated, the estimated most likely position is computed
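A minimal sketch of this randomized estimation for random prioritization, where the analytical answer (N + 1) / (k + 1) is known so the estimate can be checked; the constants and names are illustrative, not from the study.

```python
import random

random.seed(0)                       # reproducible estimate

N, k = 100, 5                        # suite size and number of failed tests
failed = set(range(k))               # tests 0..k-1 detect the fault
SAMPLES = 20_000                     # "large number of test sequences"

total = 0
for _ in range(SAMPLES):
    seq = random.sample(range(N), N)                   # one random ordering
    total += next(i for i, t in enumerate(seq, 1) if t in failed)

mlp_estimate = total / SAMPLES       # estimated most likely position
analytical = (N + 1) / (k + 1)       # exact value for random ordering
print(mlp_estimate, analytical)      # the two should be close (about 16.8)
```

The same sampling loop works for any heuristic: replace the random ordering with a sequence generated by the heuristic under study.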


Outline

Introduction
Model-based testing
Test prioritization
Code-based test prioritization
Model-based test prioritization
Measuring the effectiveness of test prioritization
Experimental study
Conclusions


Experimental Study

The goal of the experimental study is to compare the effectiveness of early fault detection of:
– Code-based test prioritization (Heuristic #1)
– Model-based test prioritization (Heuristic #2)

The most likely relative position of the first failed test that detects fault d is used as the measure of the effectiveness of early fault detection


Experimental Study

Table 1. System models used in the experiment

Model Name        TN   SN   LOC   TSN   SDN
ATM               25    8   609    834   21
Cruise Control    20    5   633   1000   22
Fuel Pumps        28   13   795    922   19
TCP-Dialer        49   17  1025   1000   12
ISDN              87   20  1416    900   19
Print Token       89   11   550   1439    7
Vending Machine   41    8   750   1658   10

TN: number of transitions
SN: number of states
LOC: number of lines of code of the implementation
TSN: number of tests in the test suite
SDN: number of seeded faults


Experimental Study

We introduced incorrect modifications into the implementations

For each modification we identified and marked the corresponding transitions

For each implementation we created 7-22 incorrect versions

Each incorrect version had 1-10 failed tests


RP Boxplots for the Experimental Study

[Boxplot figure, left panel: ATM (RP axis 0 to 0.6); right panel: Cruise Control (RP axis 0 to 0.6); each panel shows R, H1, H2]

R: Random prioritization
H1: Heuristic #1 prioritization (code-based)
H2: Heuristic #2 prioritization (model-based)


RP Boxplots for the Experimental Study

[Boxplot figure, left panel: Fuel Pump (RP axis 0 to 0.6); right panel: ISDN (RP axis 0 to 0.7); each panel shows R, H1, H2]

R: Random prioritization
H1: Heuristic #1 prioritization (code-based)
H2: Heuristic #2 prioritization (model-based)


RP Boxplots for the Experimental Study

[Boxplot figure, left panel: TCP-Dialer (RP axis 0 to 0.8); right panel: Print-Tokens (RP axis 0 to 0.35); each panel shows R, H1, H2]

R: Random prioritization
H1: Heuristic #1 prioritization (code-based)
H2: Heuristic #2 prioritization (model-based)


RP Boxplots for the Experimental Study

[Boxplot figure: Vending Machine (RP axis 0 to 1.2), showing R, H1, H2]

R: Random prioritization
H1: Heuristic #1 prioritization (code-based)
H2: Heuristic #2 prioritization (model-based)


RP Boxplots for the Experimental Study

Cumulative data for all models

[Boxplot figure: RP distributions over all models combined (RP axis 0 to 1.2), showing R, H1, H2]

R: Random prioritization
H1: Heuristic #1 prioritization (code-based)
H2: Heuristic #2 prioritization (model-based)


Conclusions

Model-based test prioritization is less expensive than code-based prioritization
– Execution of the model is inexpensive compared to the execution of the system
– Cost of model development must still be accounted for


Conclusions

Model-based prioritization is sensitive to correct marking of transitions when the model is not modified
– Correct identification of the transitions (marked transitions) related to source code modifications is important

This is not an issue when models are modified


Conclusions

The small experimental study suggests that model-based prioritization may be as effective in early fault detection as code-based prioritization, if not better

Code-based test prioritization uses information related to the original system


Future Work

Automated mapping of source code changes to a model

Experimental study on larger models and systems with multiple faults

Experimental study to compare more code-based methods with model-based methods


Questions?