
Page 1:

Illinois Institute of Technology

CS487

Software Engineering

Testing - Part II

Software Testing Strategies

Mr. David A. Lash

© Carl J. Mueller, 2000

Page 2:

Why a Software Testing Strategy?

Some test strategy issues:

– Should we develop a formal test plan?

– Should we test the entire program as a whole or in various pieces?

– When should the customer be included?

Testing often accounts for more effort than development. Usually the strategy is defined in an overall testing strategy and procedure document (often by an independent test group).

Page 3:

Elements Of Software Test Strategy

Unit Testing - Tests the implemented source code of the component. Uses White Box.

Integration Testing - focuses on the design, testing and integration of components. Black box and limited White Box.

Validation Testing - Looks at the requirements and validates the software against them. Black Box Testing.

System Testing - Looks at software and other system elements tested as a whole. (Other hardware, people, databases, processes.)

Page 4:

Unit Testing

Unit Testing - Module test

– Interfaces are tested to ensure information flows into and out of the program correctly.

– Data structures are tested.

– Independent paths (basis paths) are exercised.

– Error-handling paths are tested; it is a mistake to miss these.

Use drivers (accept test data and pass it to the module under test) and stubs (replace the modules it calls); a minimal sketch follows the figure below.

[Figure: test cases feed a Driver, which invokes the Module under test; the Module calls Stubs; results flow out.]
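A minimal Python sketch of the driver/stub arrangement above (the compute_total module, its tax-service dependency, and the 10% rate are hypothetical illustrations, not from the slides):

```python
import unittest

def compute_total(items, tax_service):
    """Module under test: sums item prices and applies tax."""
    subtotal = sum(price for _, price in items)
    return subtotal + tax_service.tax_for(subtotal)

class TaxServiceStub:
    """Stub: stands in for the real tax service with a canned rate."""
    def tax_for(self, amount):
        return round(amount * 0.10, 2)  # fixed 10% rate for the test

class ComputeTotalDriver(unittest.TestCase):
    """Driver: feeds test cases to the module and checks the results."""
    def test_total_includes_tax(self):
        items = [("widget", 10.00), ("gadget", 20.00)]
        self.assertEqual(compute_total(items, TaxServiceStub()), 33.00)

    def test_empty_order(self):
        self.assertEqual(compute_total([], TaxServiceStub()), 0.00)

if __name__ == "__main__":
    unittest.main()
```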

Page 5:

Integration

A systematic technique to uncover problems while putting the modules together. Composed of two activities:

– Building: compile and link all of the modules together.

– Testing: verify the interfaces function as specified.

Avoid the “big bang” approach (a.k.a. non-incremental integration); it can yield chaotic results.

Page 6:

Implementation Process

[Figure: Implement Unit 1 … Implement Unit n each produce an object; Integration produces an executable; Verification produces a report.]

Page 7:

Approaches to Integration Testing

Top-down (depth-first or breadth-first)
– Test higher-level modules first
– Requires stubs (not just returns)

Bottom-up
– Test lower-level modules first
– Requires drivers (require logic)

Sandwich
– Test the highest-level modules in conjunction with the lowest-level, meeting in the middle
– Uses both stubs and drivers

Page 8:

Example System

[Figure: example module hierarchy. M1 at the top; M2, M3, and M4 below M1; M5 and M6 on the next level; M7 and M8 at the bottom.]

Page 9:

Top Down (depth) Example

Depth First

– Integrate M1; use stubs for M2, M3, and M4. Test all M1 interfaces.

– Next, test M2, using operational stubs for M3, M4, M5, and M6. Test all M2 interfaces.

– Repeat for M5, then M8, M6, and M3...

[Module hierarchy diagram repeated from Page 8.]

Page 10:

Top Down (depth) Example

Integrate M1; use stubs for M2, M3, and M4. Test all M1 interfaces.

Next, test M2, using operational stubs for M3, M4, M5, and M6. Test all M2 interfaces.

Replace stub M5 with module M5, using operational stubs for M3, M4, M6, and M8. Test all the interfaces in M5.

Repeat for M8, M6, and M3...

[Module hierarchy diagram repeated from Page 8.]
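A minimal Python sketch of the stub-replacement step: M1 is first tested against an operational stub for M5, then the stub is swapped for the real module. The function bodies are hypothetical illustrations, not from the slides.

```python
def m5_stub(x):
    """Operational stub: returns a canned value so M1 can run."""
    return 42

def m5_real(x):
    """Real lower-level module, integrated in a later step."""
    return x * 2 + 2

def m1(x, m5=m5_stub):
    """Top-level module; its dependency on M5 is injected so the
    stub can be swapped for the real module without editing M1."""
    return m5(x) + 1

# Step 1: test M1's interfaces against the stub.
assert m1(20) == 43

# Step 2: replace the stub with the real M5 and re-test the interface.
assert m1(20, m5=m5_real) == 43
```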

Page 11:

Top Down (breadth) Example

Breadth-first order:

– M1, M2, M3, M4

– Next level would be M5, M6, ...

Top-down testing can have problems when low-level modules are needed (delay full tests, develop better stubs, or go bottom-up).

[Module hierarchy diagram repeated from Page 8.]

Page 12:

Bottom Up

Lower items first, using drivers (see the driver sketch below):

– Using an M8 driver, test module M8

– Using an M7 driver, test module M7

– Using an M6 driver, test module M6

– Using an M5 driver, test module M5

[Module hierarchy diagram repeated from Page 8.]
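A minimal Python sketch of a driver for the lowest-level module; the m8 function here is a hypothetical stand-in, not from the slides:

```python
def m8(values):
    """Lowest-level module: returns the largest non-negative value."""
    return max((v for v in values if v >= 0), default=0)

def m8_driver():
    """Driver: supplies test data to M8 and checks the results."""
    cases = [
        ([3, -1, 7], 7),   # mixed input
        ([-5, -2], 0),     # all negative -> default
        ([], 0),           # empty input
    ]
    for data, expected in cases:
        result = m8(data)
        assert result == expected, f"m8({data}) = {result}, expected {expected}"
    print("all M8 driver cases passed")

if __name__ == "__main__":
    m8_driver()
```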

Page 13:

Bottom Up Example II

In reality, lower items are clustered together into sub-functions (a.k.a. builds).

[Figure: three module clusters M1, M2, M3, each exercised by its own driver (Driver 1, Driver 2, Driver 3).]

Page 14:

Sandwich Example

First Round

– Top-down: use operational stubs for M2, M3, and M4. Test all the interfaces in M1.

– Bottom-up: using an M8 driver, test module M8; using an M7 driver, test module M7; using an M6 driver, test module M6.

[Module hierarchy diagram repeated from Page 8.]

Page 15:

Regression Testing

The reexecution of some subset of tests to ensure changes have not propagated errors.

Tests conducted when previously tested software is redelivered:

– Continued testing after a repair

– A new version

Partial re-testing of old features when testing a maintenance release.

Might include tests of features/functions that were unchanged during the current release.

Capture/playback tools enable re-execution of tests previously run (see the sketch below).
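A minimal capture/playback-style sketch in Python: outputs accepted in an earlier release are recorded, then re-checked after each change. The function under test and the file name are hypothetical assumptions, not from the slides.

```python
import json

def function_under_test(x):
    """Hypothetical stand-in for the feature being regression-tested."""
    return x * x

def capture(inputs, path="regression_cases.json"):
    """Record inputs and current (accepted) outputs to a file."""
    cases = [{"input": x, "expected": function_under_test(x)} for x in inputs]
    with open(path, "w") as f:
        json.dump(cases, f)

def playback(path="regression_cases.json"):
    """Re-execute the recorded cases against the new version."""
    with open(path) as f:
        cases = json.load(f)
    failures = [c for c in cases if function_under_test(c["input"]) != c["expected"]]
    print(f"{len(cases) - len(failures)}/{len(cases)} regression cases passed")
    return not failures

if __name__ == "__main__":
    capture([0, 1, 5, -3])   # run against the old version
    assert playback()        # re-run after each change
```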

Page 16:

Verification

(1) The process of determining whether or not the products of a given phase of the software development cycle fulfill the requirements established during the previous phase

(2) Formal proof of program correctness

(3) The act of reviewing, inspecting, testing, checking, auditing, or otherwise establishing and documenting whether or not items, processes, services and documents conform to specified requirements

Page 17:

Functional Verification

Verification of the Functional Design Specification

Should this be done by developers? It is normally done by the user or a testing group.

Can be combined with integration testing

Page 18:

Validation

Are we building the right product? Does it meet the requirements?

Page 19:

Validation Phase

[Figure: Requirements Specification → Design Specification → Implementation → Software → Validation → Report]

Page 20:

Validation

The process of evaluating software at the end of the software development process to ensure compliance with software requirements.

Test plans define the classes of tests to be completed.

Page 21:

Validation Activities

System – A process that attempts to demonstrate that a program or system does not meet its original requirements and objectives as stated in the requirements.

Acceptance – The process of comparing the end product to the current needs of its end users.

Regression – The re-execution of some subset of tests that have already been conducted.

Page 22:

Validation Process

[Figure: a New Version undergoes Regression, System, and Acceptance testing, then Deploy; a Defect at any stage means re-starting validation or the next iteration.]

Page 23:

System Tests

Functional Requirements – The process of attempting to detect discrepancies between a program’s function and its specification.

Volume – Determines whether the program can handle the required volumes of data, requests, etc.

Load/Stress – Identifies the peak load conditions at which the program will fail to handle required processing loads within required time spans (see the sketch below).

Security – Attempts to show that the program’s security requirements can be subverted.
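A minimal load/stress sketch in Python: the same batch of requests is run at increasing concurrency and timed. The handle_request function and the levels chosen are illustrative assumptions, not from the slides.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(i):
    """Hypothetical stand-in for the operation under load."""
    time.sleep(0.001)  # simulate work
    return i

def measure(concurrency, requests=200):
    """Run the requests at a given concurrency; return the wall time."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        list(pool.map(handle_request, range(requests)))
    return time.perf_counter() - start

if __name__ == "__main__":
    for concurrency in (1, 10, 50):
        elapsed = measure(concurrency)
        print(f"concurrency={concurrency:3d}: {elapsed:.3f}s for 200 requests")
```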

Page 24:

System Tests

Recovery – Determines whether the system or program meets its requirements for recovery.

Performance – Determines whether the program meets the performance requirements.

Resource Usage – Determines whether the program uses resources at levels that exceed requirements.

Configuration – Determines whether the program operates properly when the software or hardware is configured as required.

Page 25:

System Tests

Compatibility – Determines whether the compatibility objectives of the program have been met.

Installability – Identifies the ways in which the installation procedures lead to incorrect results.

Serviceability – Identifies conditions under which the program’s serviceability requirements won’t be met.

Reliability – Determines whether the system meets its reliability and availability requirements.

Page 26:

System Tests

Usability – Identifies those operations that will be difficult or inconvenient for the users; covers publications, documentation, and manual procedures.

Page 27:

Rules for High-Level Tests

1. Tests are run on operational production hardware and software.

2. Tests must stress the system significantly.

3. All interfaced systems must be in operation throughout the test.

4. The test duration should run a full operational cycle.

5. Tests should exercise the system over a full range of inputs, both legal and illegal.

6. All major functions and interfaces should be exercised.

7. If the entire test run cannot be completed, a new run should include a complete restart.

Page 28:

Acceptance Testing

Depends on the type of product and the situation.

When working with a non-technical user, the following are recommended:

– First, train the user on the system with a number of test cases.

– Use a comparison test: re-process data in parallel with the existing system (a sketch follows).
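A minimal Python sketch of such a comparison test: the same records are processed by the old and new systems and any disagreements are reported. Both system functions are hypothetical stand-ins, not from the slides.

```python
def old_system(record):
    """Existing system's computation (hypothetical)."""
    return record["qty"] * record["price"]

def new_system(record):
    """Replacement system's computation (hypothetical)."""
    return record["price"] * record["qty"]

def compare(records):
    """Re-process every record through both systems and diff results."""
    mismatches = []
    for r in records:
        old, new = old_system(r), new_system(r)
        if old != new:
            mismatches.append((r, old, new))
    print(f"{len(records) - len(mismatches)}/{len(records)} records match")
    return mismatches

if __name__ == "__main__":
    batch = [{"qty": 2, "price": 9.99}, {"qty": 5, "price": 1.50}]
    assert compare(batch) == []
```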

Page 29:

Acceptance Testing

Product Development

– Alpha testing: performed by selected end users, usually inside the development environment.

– Beta testing: performed by a subset of actual customers outside the company, before the software is made generally available.

– Field trial: a large number of users, but not a general release.

Page 30:

Acceptance Testing

Work for Hire

– Training program: evaluate based on comments and the users’ general reaction to the system; a usability test.

– Parallel operation: comparison testing; the new software is used after the old system processes.

– Controlled deployment: a representative monitors the software at the site.