
Model-Driven Software Verification (MDSV)

Gerard J. Holzmann & Rajeev Joshi
JPL Laboratory for Reliable Software, Caltech

(Formal Methods for Developing Reliable Software Systems)
Master in Software Engineering & Artificial Intelligence

Computer Science Department, University of Malaga

Juan Antonio Martin Checa, 2011

“Learn from yesterday, live for today, hope for tomorrow.
The important thing is not to stop questioning.”

- Albert Einstein

Index of contents

1. Introduction
2. Model Checking with Embedded C Code
3. Two Sample Applications
4. Related Work
5. Conclusions

1. Introduction

Software Engineering: Software Verification (SV) asks whether the software satisfies its requirements.

- Dynamic verification (testing, experimentation): checks the software's behaviour during execution
  - Test in the small: a single function/class
  - Test in the large: a group of classes (module, integration, system)
  - Acceptance test: functional, non-functional

- Static verification (analysis): checks that the software meets its requirements by inspection, without executing it
  - Code conventions verification
  - Bad practices (anti-pattern) detection
  - Software metrics calculation
  - Formal verification (e.g. MODEL CHECKING)

1. Introduction

Model Checking: classical approach

- System model: high-level, manually generated
  - model abstraction reduces complexity
  - written in the same language as the model checker (MC)
  - requires knowledge of both the MC and the application
    - limited knowledge of the MC limits the scope of the verification
    - limited knowledge of the application undermines the validity of the verification

1. Introduction

Model Checking: Model-Driven Software Verification approach

- Model extraction: automatic generation of the system model (SM)
  - the SM is treated as a 'black box' (no knowledge of the application needed)
  - model abstraction reduces complexity

- Test harness: written in the same language as the MC
  - drives the application through all relevant states
  - mapping table (AM): source statements and data manipulations relevant for the test
  - test driver (TD): input / output
  - requirements (SP): properties to check

1. Introduction

Goal:

“A powerful extension of the SPIN model checker that allows the user to directly define data abstractions in the logic verification of application level programs.”

“A new verification method that [...] can avoid the need to manually construct a verification model, while still retaining the capability to define powerful abstractions that can be used to reduce verification complexity.”

2. Model Checking with Embedded C Code

2.1. Introduction
2.2. Tracking Without Matching
2.3. Validity of Abstractions
2.4. Sufficient Conditions for Soundness

2.1. Introduction

SPIN 4.0+ allows embedded C/C++ code within the model

- primitives that connect the application to the system model (SM): c_decl, c_track, c_code, c_expr

- (model) state info:
  - within the primitives
  - within the C source code (optional)

2.1. Introduction

Primitives:

- c_decl: defines state. Introduces the types and names of external C data objects referred to in the model.
- c_track: defines state. Defines which of the C data objects should be considered to hold state info relevant to the verification process.
- c_code: defines state transitions. Encloses an arbitrary fragment of C code used to effect the desired state transition.
- c_expr: defines state transitions. Evaluates an arbitrary side-effect free C expression to compute a boolean truth value used to determine the executability of the statement itself.

Primitives: example

c_decl {
    extern float x;
    extern void fiddle(void);
};

c_track "&x" "sizeof(float)";

init {
    do
    :: c_expr { x < 10.0 } -> c_code { fiddle(); }
    :: else -> break
    od
}

2.1. Introduction

Primitives (extensions):
- more flexible than plain PROMELA
- allow new datatypes to be introduced into verification models
- SPIN is used as usual

c_track:
- state tracking: allows the values of data objects to be restored to their previous states when backtracking during the depth-first search.
- state matching: allows recognizing when a state is revisited during the search.

2.2. Tracking Without Matching

c_track:
- state tracking: restore the values of data objects.
- state matching: recognize revisited states.

Cases of interest (EDO: External Data Object):
- the EDO contains no state info (a minimal sketch follows below)
- the EDO contains state info, but in too much detail
  - state tracking: retain all the details necessary for backtracking
  - state matching: use abstractions

c_track extension (SPIN 4.1+):
  c_track "&x" "sizeof(float)" "Matched";
  c_track "&x" "sizeof(float)" "UnMatched";

- Matched: the value of the EDO is stored on the search stack (SS) and in the state descriptor (SD)
- UnMatched: the value of the EDO is stored on the search stack (SS) only (not saved to the SD)
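For the first case above, an EDO that holds no verification-relevant state info, a minimal sketch (the variable name is illustrative, not from the slides) is to track the object UnMatched only, so it is restored on backtracking but never influences state matching:

c_decl {
    extern long calls_made;   /* execution detail only: carries no state info for verification */
};

/* restored when the search backtracks, but never part of the state descriptor */
c_track "&calls_made" "sizeof(long)" "UnMatched";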

2.3. Validity of Abstractions

c_track extension (SPIN 4.1+):
  c_track "&x" "sizeof(float)" "Matched";    value of the EDO stored on the search stack (SS) and in the state descriptor (SD)
  c_track "&x" "sizeof(float)" "UnMatched";  value of the EDO stored on the search stack (SS) only

This allows data that is relevant for execution, but not for verification, to be included in the model.

Two main uses:
- data hiding: track data without saving it to the SD (hide the data from the SD)
  [UnMatched]
- data hiding + abstraction (see the sketch below):
  1. hide selected data from the SD
  2. add abstraction functions, i.e. abstract data representations
  3. track the new abstract data, saving it to the SD
  [UnMatched + Abstraction + Matched]
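A minimal sketch of the data-hiding-plus-abstraction recipe (the data object, the abstraction and all names below are assumptions for illustration, not taken from the slides): the concrete data is tracked UnMatched, an abstraction function computes an abstract representation, and only that abstract value is tracked Matched.

c_code {
    int samples[100];    /* concrete data: too detailed to use for state matching */
    int abs_nonzero;     /* abstract representation: number of non-zero samples   */

    void update_abstraction(void)
    {   int i;
        abs_nonzero = 0;
        for (i = 0; i < 100; i++)
            if (samples[i] != 0)
                abs_nonzero++;
    }
}

/* step 1: hide the concrete data from the SD (tracked for backtracking only) */
c_track "&samples[0]" "sizeof(samples)" "UnMatched";

/* step 3: track the abstract value and save it to the SD */
c_track "&abs_nonzero" "sizeof(int)" "Matched";

active proctype driver()
{
    do
    :: c_code { samples[0]++; update_abstraction(); }   /* step 2: re-apply the abstraction after each transition */
    :: c_expr { abs_nonzero > 0 } -> break
    od
}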

2.4. Sufficient Conditions for Soundness

Model checker (MC): starting from the initial state, explore the set of reachable states and check the properties.

Exploring a state:
1. enumerating the state's successors (Suc)
2. determining which states in Suc are potentially relevant
3. recording newly encountered states in a data structure (stack)

Symmetric states relation (~):
- Relevant state: s is relevant if none of its symmetric states has been visited yet
- Encountered states: those already visited at least once
- Explored states: encountered states whose complete successor set Suc has already been visited
- NOTE: with no abstraction, the symmetric states relation (~) is the identity (=), and s is relevant if it has not been encountered yet

When conditions (1) and (2) are both satisfied, the abstraction preserves logical soundness:

∀ w, y, z: (w ∼ y) ∧ (y → z) ⇒ (∃ x: (w → x) ∧ (x ∼ z))    (1)

∀ x, y: P(x) ∧ (x ∼ y) ⇒ P(y)    (2)

where P( ) is a proposition.
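A small worked note (an assumption about the common case, not stated on the slide): when the symmetry relation ~ is induced by an abstraction function, condition (2) holds for free for any property that depends only on the abstract value; condition (1) must still be checked for the specific abstraction (e.g. the board symmetries used for Tic Tac Toe in Section 3.1).

% assumption: ~ is induced by an abstraction function \alpha
w \sim y \;\triangleq\; \alpha(w) = \alpha(y)

% if the checked proposition depends only on the abstract value, P(x) = Q(\alpha(x)),
% then condition (2) follows directly:
P(x) \wedge (x \sim y)
  \;\Rightarrow\; Q(\alpha(x)) \wedge \alpha(x) = \alpha(y)
  \;\Rightarrow\; Q(\alpha(y))
  \;\Rightarrow\; P(y)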

3. Two Sample Applications

3.1. Tic Tac Toe
3.2. JPL's Mars Exploration Rovers (MER)

3.1. Tic Tac Toe

Goal: obtaining a line of three "x" or "o" marks

3.1. Tic Tac Toe

Key idea: introduce an abstraction to exploit board symmetries (see the sketch below)
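A minimal sketch of how such a symmetry abstraction could be expressed with the primitives above (illustrative only, not the paper's actual harness code): the concrete board is tracked UnMatched, while a canonical form of the board, the lexicographically smallest of its eight rotations/reflections, is tracked Matched, so that symmetric boards match as the same state.

c_code {
    char board[9];    /* concrete board: 0 = empty, 1 = 'x', 2 = 'o' */
    char canon[9];    /* canonical board under the 8 board symmetries */

    /* the 8 symmetries of the square, as index permutations of the 3x3 board */
    int sym[8][9] = {
        {0,1,2,3,4,5,6,7,8},   /* identity          */
        {2,5,8,1,4,7,0,3,6},   /* rotate 90         */
        {8,7,6,5,4,3,2,1,0},   /* rotate 180        */
        {6,3,0,7,4,1,8,5,2},   /* rotate 270        */
        {2,1,0,5,4,3,8,7,6},   /* mirror left-right */
        {6,7,8,3,4,5,0,1,2},   /* mirror top-bottom */
        {0,3,6,1,4,7,2,5,8},   /* transpose         */
        {8,5,2,7,4,1,6,3,0}    /* anti-transpose    */
    };

    void canonicalize(void)
    {   int s, i, less;
        char tmp[9];
        for (i = 0; i < 9; i++) canon[i] = board[i];
        for (s = 1; s < 8; s++)
        {   for (i = 0; i < 9; i++) tmp[i] = board[sym[s][i]];
            less = 0;
            for (i = 0; i < 9; i++)
                if (tmp[i] != canon[i]) { less = (tmp[i] < canon[i]); break; }
            if (less)
                for (i = 0; i < 9; i++) canon[i] = tmp[i];
        }
    }
}

/* concrete board: needed for execution, hidden from state matching */
c_track "&board[0]" "9" "UnMatched";

/* canonical board: the abstract state used for state matching */
c_track "&canon[0]" "9" "Matched";

Each move made by the test harness would then be followed by a c_code { canonicalize(); } step, so that the state descriptor always holds the canonical board.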

3.2. JPL's Mars Exploration Rovers (MER)

- System: a module from the flight software of JPL's MER
- Threads: 11 (each for a specific application)
- Resources: 15 (shared, with access controlled by an arbiter)
- Target of verification: the ARBITER
  - prevents potential conflicts between resource requests
  - enforces priorities (e.g. communication > driving)

3.2. JPL's Mars Exploration Rovers (MER)

ARBITER: priority (User U0) > priority (User U1)

3.2. JPL's Mars Exploration Rovers (MER)

ARBITER:

- Source code: 3,000 lines (ANSI C)
- Lookup table: conflicting combinations of resource requests, and priorities
- Problem: 11 users (threads) and 15 resources, so full-scale exhaustive verification has very high complexity (large search spaces)
- Solution 1 (limited): SPIN bitstate (supertrace) search, pruning the search to stay within physical memory
- Solution 2 (exhaustive): divide & conquer; the global problem is not verified as a whole

3.2. JPL's Mars Exploration Rovers (MER)

Solution 2 (exhaustive): divide & conquer
- Key idea: subproblem = {3 users + 3 resources}, chosen at random; many repetitions of subproblems approximate the global problem

Three setups (a rough harness sketch follows this list):
1. hand-built SPIN model: 245 lines of PROMELA + 77 lines for the arbiter lookup table
2. state info (4,400 bytes) tracked and matched (c_track) with no abstraction; a hand-built test harness of 110 lines of PROMELA surrounds the arbiter code; with the hashcompact state compression option enabled, 127 bytes per state
3. abstraction (restriction: only the essential state info is recorded)
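As a rough illustration of setup 2 (every name here is a placeholder: res_request, res_cancel and arbiter_state stand in for the real MER flight-code interface, which the slides do not show), the harness is a small PROMELA driver that non-deterministically exercises the arbiter from each user process while the arbiter's C state is tracked and matched:

#define USERS 3

c_decl {
    /* hypothetical arbiter entry points and state; the real flight code differs */
    extern void res_request(int user, int resource);
    extern void res_cancel(int user, int resource);
    extern unsigned char arbiter_state[4400];
};

/* setup 2: track and match the full arbiter state, no abstraction */
c_track "&arbiter_state[0]" "4400" "Matched";

active [USERS] proctype user()
{
    int me, r;
    me = _pid;
    do
    :: if   /* pick one of the 3 resources non-deterministically */
       :: r = 0
       :: r = 1
       :: r = 2
       fi;
       if   /* request it or release it */
       :: c_code { res_request(Puser->me, Puser->r); }
       :: c_code { res_cancel(Puser->me, Puser->r); }
       fi
    od
}

The properties to check (e.g. absence of conflicting grants, priority enforcement) would be added on top of this driver as assertions or never claims.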

4. Related Work

Several different approaches exist to the direct verification of implementation-level C code. Examples:

1. Verisoft:
   - partial-order reduction theory (stateless search)
   - the search along a given path is stopped when a user-defined depth is reached
   - pro: low memory use
   - con: requires code instrumentation
   - con: no state space is maintained, so no state-space techniques are available (systematic depth-first search, verification of liveness properties, abstraction)

2. CMC tool:
   - captures as much state info as possible
   - pro: stores it in the state space using aggressive compression
   - con: does not distinguish relevant state info

5. Conclusions

Goal:

“A powerful extension of the SPIN model checker that allows the user to directly define data abstractions in the logic verification of application level programs.”

“A new verification method that [...] can avoid the need to manually construct a verification model, while still retaining the capability to define powerful abstractions that can be used to reduce verification complexity.”

5. Conclusions

Model Checking: classical approach vs. Model-Driven Software Verification approach

5. Conclusions

SPIN 4.0+ allows embedded C/C++ code within the model

- primitives that connect the application to the system model: c_decl, c_track, c_code, c_expr
- (model) state info: within the primitives, and optionally within the C source code


5. Conclusions

Tic Tac Toe / JPL's Mars Exploration Rovers (MER)

References

Holzmann, G.J., Joshi, R. Model-Driven Software Verification. In Proc. 11th SPIN Workshop, Barcelona, Spain, pp. 77-92, 2004.


References

Computer Science, Caltech. http://www.cs.caltech.edu/people.html
Formal Verification. Wikipedia. http://en.wikipedia.org/wiki/Formal_verification
Gerard J. Holzmann. http://spinroot.com/gerard/
JPL Laboratory for Reliable Software (LaRS). http://lars-lab.jpl.nasa.gov/
Model checking. Wikipedia. http://en.wikipedia.org/wiki/Model_checking
NASA JPL - Jet Propulsion Laboratory. http://www.jpl.nasa.gov/
Rajeev Joshi. http://rjoshi.org/bio/
Software verification. Wikipedia. http://en.wikipedia.org/wiki/Software_verification
SPIN model checker. Wikipedia. http://en.wikipedia.org/wiki/SPIN_model_checker
Tic-tac-toe. Wikipedia. http://en.wikipedia.org/wiki/Tic-tac-toe

You might be thinking...

Now you can impress your friends talking about MDSV...

Please, ask!

“By learning you will teach, by teaching you will learn.”

- Latin Proverb

“You see things; and you say, 'Why?' But I dream things that never were; and I say, ‘Why not?’”

- George Bernard Shaw
