
Page 1:

Programming Languages and Environments Department

IBM Labs in Haifa © 2004 IBM Corporation

Model Based Software Testing

Alan Hartman

www.haifa.il.ibm.com/projects/verification/gtcb/

Summer Seminar October 14 2004

Page 2:

IBM Labs in Haifa – Programming Languages and Environments


Outline

- Motivation
- Model Driven Testing Tools
  - Process
  - Technology
    - Test Generation
    - Test Execution
    - Test Analysis
  - Deployments
- The Path to Model Driven SW Engineering

Page 3:

Defect Cost Over Time

[Chart: percentage of defects found in each phase vs. cost to repair a defect in that phase. 85% of defects are introduced during coding.]

Cost to repair a defect, by phase:
- Coding: $25
- Unit Test: $130
- Function Test: $250
- Field Test: $1,000
- Post-Release: $14,000 (an APAR costs $15,000-40,000)

Page 4:

Downtime Costs (per Hour)

- Brokerage operations $6,450,000
- Credit card authorization $2,600,000
- Ebay (1 outage, 22 hours) $225,000
- Amazon.com $180,000
- Package shipping services $150,000
- Home shopping channel $113,000
- Catalog sales center $90,000
- Airline reservation center $89,000
- Cellular service activation $41,000
- On-line network fees $25,000
- ATM service fees $14,000

Page 5:

Testing Problem

- Today, 80% of testers' effort goes into making testing possible, and only 20% into making it meaningful
- Most defects discovered in system test could have been discovered in function test
- Cost of developing and supporting private test automation solutions in each lab
- Gap between developer and tester environments
- Gap between unit, function and system test
- System Under Test complexity

Page 6:

History of Tools and Haifa Projects

- GOTCHA
  - Began 1997
  - Text based modeling language
  - Grew out of hardware testing technology
- AGEDIS
  - Began 2000
  - UML based modeling language
  - Focused on distributed applications
- Modelware
  - Began September 2004
  - MOF based modeling
  - Eclipse SW engineering project – for an integrated MDD lifecycle

Page 7:

Model Based Testing Process

[Diagram: the model based testing process. Legend distinguishes data, tools, the system under test, and bugs found.]

1. Modelling (Editor): System Specs, Design, and Code feed a State Machine Model
2. Generation (GOTCHA): State Machine Model → Abstract Test Suite
3. Translation (Spider): Abstract Test Suite → Test Scripts, using the Interface Design
4. Execution (Spider Executor): the Test Scripts drive the System Under Test through its Interface
5. Execution results, including bugs found, are written to a Log
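The five-step process above can be sketched end to end. This is an illustrative Python sketch, not the actual Editor/GOTCHA/Spider tooling; all names, the toy model, and the script format are invented for the example.

```python
# Illustrative sketch of the five-step process (hypothetical names,
# not the real Editor/GOTCHA/Spider implementations).

# 1. Modelling: a state machine model as states plus labelled transitions.
MODEL = {
    "states": ["idle", "open", "closed"],
    "initial": "idle",
    "transitions": [  # (source, method, target)
        ("idle", "open_file", "open"),
        ("open", "write", "open"),
        ("open", "close_file", "closed"),
    ],
}

def generate_abstract_tests(model):
    """2. Generation: one abstract test per transition (a tiny coverage criterion)."""
    return [[src, method, dst] for (src, method, dst) in model["transitions"]]

def translate(abstract_test):
    """3. Translation: an abstract step becomes an executable script line."""
    src, method, dst = abstract_test
    return f"call {method}(); expect state == {dst!r}"

def execute(script, sut_state_after):
    """4./5. Execution: run the script against the SUT and log the verdict."""
    expected = script.rsplit("== ", 1)[1].strip("'\"")
    return {"script": script, "verdict": "pass" if sut_state_after == expected else "bug"}

suite = [translate(t) for t in generate_abstract_tests(MODEL)]
log = [execute(s, actual) for s, actual in zip(suite, ["open", "open", "closed"])]
```

The point of the sketch is the separation of concerns: the model is the single source of truth, and the abstract suite stays independent of the concrete scripting language.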

Page 8:

Benefits of Model Based Testing

- Starting from specification
  - Involves testers early in the development process
  - Teams testers with developers
  - Forces testability into product design
- Building a behavioural model and test interface
  - Finds design and specification bugs before code exists
  - The model is the test plan, and is easily maintained
- Automated test suite generation
  - Coverage is guaranteed, increasing testing thoroughness
  - Zero test suite maintenance costs
- Automated test suite execution
  - Finds code and interface bugs
  - Includes a framework for testing distributed applications
  - Reduces test execution costs

Page 9:

AGEDIS Architecture

[Diagram: the AGEDIS toolchain, surrounded by GUI & productivity aids (Editor, Visualizer).]

Model + Generation Directives → Compiler → Intermediate Format → Simulator → Generator → Abstract Test Suite → Test Suite Execution (guided by Execution Directives) → Execution Trace → Analyzers

Page 10:

GOTCHA Coverage directed Test Generation

- Behavior model describes:
  - Data types
  - Variables
  - Behavior rules (methods)
- Generate tests to cover the input
  - Cover the behavior rules
  - Cover the rule parameter combinations
  - Cover transitions between parameter combinations
- Generate tests to cover the behavior
  - Cover variable combinations
  - Cover transitions between variable combinations
- Interactive test generation
  - Walk through the model
  - Record and play the walk-through
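The coverage-directed idea above can be sketched in miniature: a behavior model whose state is a tuple of variables and whose behavior rules transform it, explored so that every reachable state and every (state, rule, next-state) transition becomes a coverage task. The model and all names are invented for illustration; this is not GOTCHA's actual specification language.

```python
from collections import deque

# Toy behavior model (hypothetical, for illustration only): the state is
# (counter, door_open); behavior rules are methods that transform it.
INITIAL = (0, False)

def rules(state):
    counter, door_open = state
    if not door_open:
        yield ("open", (counter, True))
    if door_open:
        yield ("close", (counter, False))
    if door_open and counter < 2:
        yield ("inc", (counter + 1, door_open))

def explore(initial):
    """Breadth-first traversal of the reachable state space, collecting
    every (state, rule, next_state) transition as a coverage task."""
    seen, frontier, transitions = {initial}, deque([initial]), []
    while frontier:
        state = frontier.popleft()
        for rule, nxt in rules(state):
            transitions.append((state, rule, nxt))
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen, transitions

states, transitions = explore(INITIAL)
```

Even this three-variable-value toy yields 6 states and 8 transitions; real models grow much faster, which is why the coverage criteria on this slide matter.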

Page 11:

AGEDIS Test Generator

- Based on GOTCHA and TGV
- GOTCHA
  - uses a textual specification language
  - explicit traversal of the state space
  - extensive coverage criteria
- TGV
  - language independent simulator
  - focus on distributed applications
  - explicit test purposes as sequences of interactions
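The last point can be made concrete with a toy: treating a test purpose as an ordered sequence of interactions, and keeping only the model traces in which those interactions occur in order. This is an illustrative sketch, not TGV's actual machinery; the traces and purpose are invented.

```python
# Illustrative: a test purpose as an ordered sequence of interactions.
# A trace satisfies it if the interactions occur in order (not
# necessarily consecutively). Not the actual TGV algorithm.
def satisfies(trace, purpose):
    it = iter(trace)
    # "step in it" advances the iterator, so this checks for an
    # in-order subsequence.
    return all(step in it for step in purpose)

traces = [
    ["connect", "login", "query", "logout"],
    ["connect", "query", "logout"],
]
purpose = ["login", "logout"]
matching = [t for t in traces if satisfies(t, purpose)]
```

Filtering generated traces against a purpose like this is one way a generator can aim tests at a specific scenario instead of the whole state space.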

Page 12:

Dealing with State Explosion

- The state machine model contains states defined by the values of state variables, and arcs defined by method calls
- Coverage tasks are equivalence classes of states, arcs, and paths
- The aim of a test generator is to select a set of execution paths to serve as test cases
- Exhaustive exploration of all execution paths is infeasible
  - Parameters to methods are selected by sophisticated sampling techniques
  - Test cases are selected by careful randomization of the coverage task representatives
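The last bullet can be sketched simply: group concrete test cases into their coverage tasks (equivalence classes) and pick one randomized representative per task instead of running every member. All names and the task assignment here are hypothetical, and the seeding is just to keep the selection reproducible.

```python
import random

# Hypothetical illustration: 100 concrete cases fall into 4 coverage
# tasks (equivalence classes); we run one representative per class.
test_cases = [(f"case{i}", f"task{i % 4}") for i in range(100)]

def select_representatives(cases, seed=0):
    rng = random.Random(seed)  # seeded so the selection is reproducible
    by_task = {}
    for case_id, task in cases:
        by_task.setdefault(task, []).append(case_id)
    # One randomly chosen representative per coverage task.
    return {task: rng.choice(members) for task, members in by_task.items()}

reps = select_representatives(test_cases)
```

This is the essence of the trade-off on this slide: 4 executions instead of 100, with every coverage task still exercised once.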

Page 13:

Test Execution Engine

- Distributed support for major platforms (Windows, Unix)
- Direct execution in Java, C, C++, command line, sockets
- Translation to existing test harnesses
- Automated comparison with predicted results (traceable back to the ATS)
- Synchronous & asynchronous support:
  - Environment to SUT interactions
  - SUT to Environment interactions
- Add invariant operations, e.g. setup and cleanup
- Test object multiplication and stepwise synchronization
- Interactive and batch execution
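The "automated comparison with predicted results" point can be sketched: each executed step is checked against the prediction carried in the abstract test suite (ATS), and every verdict keeps the originating ATS step id so failures are traceable. The data shapes and names below are invented for illustration.

```python
# Hypothetical sketch of verdict assignment: actual responses are compared
# against the predictions from the abstract test suite (ATS), and each
# verdict records the ATS step id for traceability.
def compare(ats_steps, observed):
    """ats_steps: list of (step_id, expected); observed: list of actual results."""
    verdicts = []
    for (step_id, expected), actual in zip(ats_steps, observed):
        verdicts.append({
            "ats_step": step_id,  # traceability back to the ATS
            "verdict": "pass" if actual == expected else "fail",
            "expected": expected,
            "actual": actual,
        })
    return verdicts

report = compare([("s1", "OK"), ("s2", "OK"), ("s3", "DENIED")],
                 ["OK", "ERROR", "DENIED"])
```

Keeping the step id in every verdict is what lets a failure in a long generated run be mapped straight back to the abstract step, and from there to the model.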

Page 14:

SUT – System Under Test

Distributed Components

[Diagram: hosts connected by a network; each host runs processes, and each process contains objects.]

Page 15:

System Overview

[Diagram: a Main Host runs the Test Suite Driver (TSD); Hosts 1-3 each run a Host Manager (HM), which controls Process Controllers (PC), which in turn control Objects (O).]

Legend: TSD – Test Suite Driver; HM – Host Manager; PC – Process Controller; O – Object

Page 16:

[Diagram: a Process Controller communicating across the network with SUT Objects, either directly or through Object Proxies.]

The Process Controller may interact:
- directly with the SUT Object
- indirectly via Object Proxies created by the tester
  - Java and C++ proxy support
  - wizards for creating proxy templates
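The proxy arrangement can be sketched in miniature (here in Python for consistency with the other sketches, although the slide mentions Java and C++ support): the proxy exposes the same calls as the SUT object and forwards them, so the Process Controller is oblivious to which one it holds. All class and method names are invented for the example.

```python
# Hypothetical in-process sketch of the object proxy idea.
class SUTObject:
    def deposit(self, amount):
        return f"deposited {amount}"

class ObjectProxy:
    """Tester-written proxy: exposes the same interface as the SUT object
    and forwards calls to it, recording each interaction on the way."""
    def __init__(self, target):
        self._target = target
        self.log = []
    def deposit(self, amount):
        self.log.append(("deposit", amount))  # record the interaction
        return self._target.deposit(amount)   # forward to the real object

def controller_run(obj):
    # The controller works identically with the real object or a proxy.
    return obj.deposit(100)

direct = controller_run(SUTObject())
proxy = ObjectProxy(SUTObject())
via_proxy = controller_run(proxy)
```

In a real deployment the proxy would also marshal the call across the network; the logging here stands in for whatever adaptation the tester needs.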

Page 17:

AGEDIS Test Analysis Technology

- Defect cluster analysis
  - Analysis of defect traces
  - Clustering by similarity measures to extract a defect signature
- Coverage analysis of the test suite and execution trace
  - Which parts of the model have been explored
  - Which data variables have taken which values
  - Analysis of uncovered combinations
- Feedback to test generation
  - Synthesis of test purposes to recreate a defect signature and/or coverage hole
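Defect clustering by similarity can be sketched with a toy: group defect traces whose event sets are similar (Jaccard similarity here, chosen for simplicity), then take the intersection of a cluster as its "signature". This is an illustrative sketch, not the AGEDIS algorithm; the traces and threshold are invented.

```python
# Illustrative sketch (not the AGEDIS algorithm): cluster defect traces
# by set similarity, then intersect each cluster to get a signature.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def cluster(traces, threshold=0.5):
    clusters = []
    for trace in traces:
        for c in clusters:
            if jaccard(trace, c[0]) >= threshold:  # compare to representative
                c.append(trace)
                break
        else:
            clusters.append([trace])  # no similar cluster: start a new one
    return clusters

def signature(cluster_traces):
    """Events common to every trace in the cluster."""
    sig = set(cluster_traces[0])
    for t in cluster_traces[1:]:
        sig &= set(t)
    return sig

traces = [
    ["open", "write", "crash"],
    ["open", "write", "flush", "crash"],
    ["login", "timeout"],
]
clusters = cluster(traces)
```

A signature extracted this way is exactly the kind of artifact the third bullet feeds back to the generator as a synthesized test purpose.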

Page 18:

Deployments

- Service Processor Controller
- Wireless
- Storage Controller
- Mainframe
- Messaging middleware
- Data Base Application
- Call Center Application

Page 19:

File System

- Retest of functions
  - Modelling and translation by testers
- Comparison
  - Original test: 18 bugs, 12 PM
  - Pilot test: 15 original bugs + 2 escapes, 10 PM (INCLUDING learning curve)
- Conclusions:
  - Efficient way to free the tester for creative testing
  - Replaces a large part of the manual test case writing

DEFECTS BY SEVERITY

  Severity    #    %
  1           0     0
  2          10    58.8
  3           6    35.2
  4           1     5.8

DEFECTS BY ODC TRIGGERS

  Trigger       #    %
  Coverage      6    35.2
  Variation     1     5.8
  Sequencing    8    47.0
  Interaction   1     5.8
  Load          1     5.8

Page 20:

DataBase Application

- A large number of DB update methods
- Three tier Websphere application
- COBOL code with a Java testing interface
- Rapid deployment!
  - 5 person team working within 2 weeks
  - Automated model template from COBOL
  - Wizard for execution interface
  - SQL code generated
- Bugs: 21 code, 19 error handling, 39 documentation