
Software Quality Engineering
CS410

Class 2

Software Process Models


SW Process Models

• A process model is:
– Methodology: 1) A body of methods, rules, and postulates (accepted theory or statement) employed by a discipline. 2) A particular procedure or set of procedures. (Webster's 9th Collegiate Dictionary, 1988)
– Life Cycle: a synonym for methodology

• A process model is not:
– A method


SW Process Models (cont.)

• Method - A step-by-step technical approach for performing one or more of the major activities identified in an overall methodology for software development.

• For example: Structured analysis and Object-Oriented analysis are both methods for carrying out the analysis phase of a software development project.


SW Process Models (cont.)

• Common Process Models
– Waterfall
– Prototyping
  • Rapid Throwaway
  • Evolutionary
– Spiral
– Iterative
– Object-Oriented
– Cleanroom


Waterfall
• First attempt at controlling SW development

• Encourages requirements gathering, ordered stages, and documentation

• Divide-and-conquer approach

• Clearly defined stages:
– Requirements
– Design
– Code
– Test
– Maintenance


Waterfall (cont.)
• Deliverables at each stage

• Entry/Exit criteria for each stage

• Feedback between stages

• Emphasis on documents

• Waterfall model:
– Figure 1.2, p. 6
– Figure 2.1, p. 15


Waterfall (cont.)

• Requirements Gathering/Analysis Stage
– Meetings with customers
– Feedback forms
– Trade shows
– Conferences
– Internal requirements (ISO, etc.)
– Requirements document is the deliverable


Waterfall (cont.)
• Architectural Design Stage
– Ensure operational consistency with product line(s) and standards
– Architecture:
  • The structure of the components of a program/system, their relationships, and the principles and guidelines governing their design and evolution over time
  • Gross organization and control structure
  • Protocols for communication, synchronization, and data access


Waterfall Sample

Source: MS Project


High-Level Design (HLD)
• Develop external functions and interfaces (see the interface sketch after this slide)
– User interfaces
– Application Programming Interface (API)
– System interfaces
– Inter-component interfaces
– Data structures

• Design control structure

• Ensure functional requirements are met
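One way HLD interface decisions get recorded is as abstract interface definitions that components code against. A minimal sketch in Python, assuming a hypothetical ReportStore inter-component interface (the name and methods are illustrative, not from the course text):

from abc import ABC, abstractmethod

class ReportStore(ABC):
    # Hypothetical inter-component interface fixed at HLD time.
    # Callers depend only on this contract; the storage component
    # can change its implementation without breaking them.

    @abstractmethod
    def save(self, report_id: str, body: bytes) -> None:
        """Persist a report under the given identifier."""

    @abstractmethod
    def load(self, report_id: str) -> bytes:
        """Return the stored report, or raise KeyError if missing."""

Freezing such contracts at HLD time is what lets the later LLD and Code stages proceed on separate components in parallel.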


High-Level Design (HLD)

• Ensure components fit into system/product structure (feedback to Architecture stage)

• Ensure component design is complete

• Ensure external functions can be accomplished (feedback to Requirements stage)


Low Level Design (LLD)

• Identify parts, modules, macros, include files, etc. that will be written or changed

• Finalize detail level of design

• Verify changes in HLD (feedback to HLD)

• Verify correctness/completeness of HLD (feedback to HLD)

• Create component test plans (from requirements and design specs)


Code Stage

• Transform LLD into coded parts
– Code modules, macros, includes, messages, etc.
– Create component test cases
– Verify design (feedback to LLD and HLD)


Unit Test Stage

• Verify code against LLD/HLD

• Execute all new and/or changed code
– Branch coverage
– Statement coverage
– Logic coverage

• Verify error messages, error handling, return codes, input parameters

• May require stubs and drivers
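A stub stands in for a component the unit under test calls but which is not yet available, while the test runner acts as the driver. A minimal sketch using Python's unittest; the function and class names (apply_discount, PriceLookupStub) are hypothetical:

import unittest

def apply_discount(price_lookup, item, percent):
    # Unit under test: computes a discounted price via a
    # dependency (price_lookup) that may not exist yet.
    base = price_lookup.price_of(item)
    return round(base * (1 - percent / 100.0), 2)

class PriceLookupStub:
    # Stub standing in for the real pricing component.
    def price_of(self, item):
        return 100.00

class ApplyDiscountTest(unittest.TestCase):
    def test_ten_percent_off(self):
        self.assertEqual(apply_discount(PriceLookupStub(), "x", 10), 90.00)

    def test_zero_percent_is_identity(self):
        self.assertEqual(apply_discount(PriceLookupStub(), "x", 0), 100.00)

if __name__ == "__main__":
    unittest.main()  # the test runner serves as the driver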


Component Test
• Test the combined software parts that make up a component after the parts have been integrated into a library (Configuration Management, CM)

• Objective is to test:
– External user interfaces (do they meet requirements?)
– Inter-component interfaces
– APIs


Component Test (cont.)
– Functionality
– Intra-component (module) interfaces
– Error recovery
– Messages
– Concurrency (multi-tasking)
– Shared resources
– Existing functionality (regression)


System-Level Test
• Four portions:
– System Test, Regression Test, Performance Test, Usability Test

• System Test
– Test concurrency
– Stress test
– Test overall system stability and completeness


System-Level Test (cont.)
• Regression Test
– Test original (unchanged) functions

• Performance Test
– Validate performance of the system/product against requirements
– Record performance metrics
– Establish baselines (i.e., benchmarks); see the sketch after this list

• Usability Test
– Test for usability requirements
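Recording metrics and comparing each run against a stored baseline can be as simple as the following Python sketch; the metric, file name, and 10% tolerance are illustrative assumptions, not values from the course text:

import json
import time

TOLERANCE = 0.10  # assumption: flag runs more than 10% slower than baseline

def measure(fn, runs=5):
    # Best-of-N wall-clock time: a simple performance metric.
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    return best

def check_against_baseline(name, seconds, path="baseline.json"):
    # Compare a recorded metric to the stored baseline; create the
    # baseline (benchmark) on the first run.
    try:
        with open(path) as f:
            baseline = json.load(f)
    except FileNotFoundError:
        baseline = {}
    if name in baseline and seconds > baseline[name] * (1 + TOLERANCE):
        raise AssertionError(f"{name}: {seconds:.4f}s vs baseline {baseline[name]:.4f}s")
    baseline.setdefault(name, seconds)
    with open(path, "w") as f:
        json.dump(baseline, f)

check_against_baseline("sort_10k", measure(lambda: sorted(range(10000, 0, -1))))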


Goal of Testing

• Verification
– Verify that the system/product we built meets all of the user requirements for usability, performance, reliability, etc., that the intrinsic quality is high, and that standards have been met

• Validation
– Validate that the requirements that drove the process were correct


Waterfall (cont.)
• Advantages:
– Better than an ad hoc approach
– Process, requirements, entry/exit criteria, and designs are all documented

• Disadvantages:
– Assumes that requirements will not change
– Heavy emphasis on documents
– Performance problems detected late in the cycle
– Rework is costly
– Feedback is not formalized


Prototyping Model

• Designed to deal with changing or unknown customer requirements.

• A prototype is a partial implementation of the product expressed either logically or physically with all external interfaces presented.

• Prototype is ‘used’ by customer to help develop requirements


Prototyping Model

• Prototyping steps:
– Requirements gathering and analysis
– Quick design
– Build prototype
– Customer evaluation
– Refinement of design and prototype (and requirements)
– Decision: iterate or accept

• Figure 2.2 p. 20


Prototyping Model

• Key to success: quick turnaround and low cost

• Throwaway Prototyping
– Good for:
  • High-risk projects
  • Complex problems
  • Performance concerns
– A "quick and dirty" approach
– Once the customer is satisfied, development of the "real thing" begins


Prototyping Model

• Evolutionary Prototyping
– Some requirements are known
– Each iteration evolves (refines) the prototype
– May or may not become the final product


Prototyping Model
• Advantages
– Good for interface design
– Good when the requirements/problem are not understood
– Problems detected/corrected early
– Requirements refined and validated

• Disadvantages
– Hard to know when to stop
– Could be costly
– Tempting to use the prototype as the product (in the throwaway approach)


Spiral Model

• Refinement of the Waterfall Model

• Focus is on Risk Management and prototyping

• Idea: A sequence of steps (cycle) is executed for each level of elaboration. A risk analysis is performed in each cycle.

• Prototyping is applied to the high-risk areas

• Figure 2.3 p. 23
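The per-cycle risk analysis is commonly summarized as risk exposure (probability of the risk times the loss if it occurs), used to rank which areas get prototyped first. A minimal sketch; the example risks and numbers are invented for illustration:

# Risk exposure = probability of the risk occurring x cost (loss) if it does.
# Rank risks each spiral cycle; prototype the highest-exposure areas first.
risks = [
    # (name, probability 0-1, estimated loss in person-days) -- illustrative
    ("unproven database engine", 0.4, 60),
    ("vague reporting requirements", 0.7, 30),
    ("GUI framework learning curve", 0.2, 20),
]

def exposure(risk):
    name, probability, loss = risk
    return probability * loss

for name, p, loss in sorted(risks, key=exposure, reverse=True):
    print(f"{name}: exposure = {p * loss:.1f} person-days")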


Spiral Model

• Advantages
– Risk driven
– Incorporates prototypes
– May encourage reuse
– Eliminates bad alternatives early
– Incorporates maintenance
– Allows quality objectives to be incorporated


Spiral Model

• Disadvantages
– Immature process
– Looser checkpoints (compared to Waterfall, with its documents and entry/exit criteria)
– Requires a good understanding of risk analysis and good risk-driven decisions


Iterative Model

• Idea: Start with a subset of the requirements and develop a subset of the product that meets them. After the product is analyzed, another iteration begins and the process repeats with a new requirements subset.

• Goal: Provide the user a system/product that meets evolving customer needs, with a design improved through feedback and learning.


Iterative Model

• Combination of Waterfall and Prototyping
– Non-iterative portion like the Waterfall Model
– Iterative portion like the Prototyping Model

• Similar to the Spiral Model

• Key elements
– Test suite developed at each iteration
– Each iteration is verified and validated

• Figure 2.4 p. 26


Iterative Model

• Advantages
– Complexity broken down
– Problems detected early
– Allows evolution of requirements
– Smaller development teams

• Disadvantages
– Risk analysis not formalized
– Less parallelism


Object-Oriented Model

• Paradigm shift away from data and control toward objects, which incorporate both data and methods

• Eight-step process (Branson and Herness)

• 1. Model the essential system
– User view
– Essential activities
– Identify the essential model


Object-Oriented Model

• 2. Derive candidate essential classes
– Class and method selection based on the essential model identified in step 1

• 3. Constrain the essential model
– The model must be constrained to work within the limits of the target implementation environment

• 4. Derive additional classes
– Derive classes/methods specific to the implementation environment (i.e., environmentals)


Object-Oriented Model

• 5. Synthesize classes
– Organize classes into a class hierarchy (see the sketch after these steps)

• 6. Define interfaces
– Interfaces and class definitions are written based on the class hierarchy

• 7. Complete design
– Complete the design to include logic, system interactions, and method invocations

• 8. Implement solution
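Steps 5 and 6, synthesizing a hierarchy and writing interfaces from it, might look like the minimal Python sketch below; the Account classes are hypothetical examples, not taken from Branson and Herness:

from abc import ABC, abstractmethod

class Account(ABC):
    # Step 5: essential classes organized into a hierarchy.
    def __init__(self, balance: float = 0.0):
        self.balance = balance

    # Step 6: interface written from the hierarchy; subclasses supply
    # the policy, and callers depend only on this definition.
    @abstractmethod
    def monthly_fee(self) -> float: ...

    def apply_monthly_fee(self) -> None:
        self.balance -= self.monthly_fee()

class CheckingAccount(Account):
    def monthly_fee(self) -> float:
        return 5.00

class SavingsAccount(Account):
    def monthly_fee(self) -> float:
        return 0.00

acct = CheckingAccount(100.0)
acct.apply_monthly_fee()   # balance is now 95.0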


Object-Oriented Model

• Phases
– Analysis: steps 1-2
– Design: steps 3-6 (can be iterated)
– Implementation: steps 7-8

• Reviews are performed to enhance control of the project

• Prototypes can be used for the essential model


Object-Oriented Model
• Advantages
– OO may promote higher reuse levels
– Promising new technology
– Works well on small projects

• Disadvantages
– No commonly recognized OO model
– Training
– Paradigm shifts
– Process needs to mature


Cleanroom Model

• Statistically (mathematically) based

• Based on correctness verification and incremental development

• Developers must ‘prove’ designs/code rather than test designs/code

• Statistical testing replaces unit testing

• Figure 2.5 p. 33
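Cleanroom's statistical testing draws test cases from an operational profile, i.e., weighted by expected field use, so test results support reliability estimates. A minimal sketch; the operations and their probabilities are invented for illustration:

import random

# Operational profile: expected-use probabilities (illustrative numbers).
profile = {
    "open_file": 0.50,
    "save_file": 0.30,
    "print_file": 0.15,
    "export_pdf": 0.05,
}

def draw_test_cases(n, seed=42):
    # Sample operations in proportion to expected field usage, so
    # reliability estimates from test outcomes reflect real use.
    rng = random.Random(seed)
    ops, weights = zip(*profile.items())
    return rng.choices(ops, weights=weights, k=n)

print(draw_test_cases(10))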


Cleanroom Model

• Advantages
– Developers are motivated to "get it right the first time" - no unit test safety net
– Promises significant quality improvements

• Disadvantages
– All (statistical) testing is based on expected use
– Learning curve, training requirements
– Limited use in industry
– Questions regarding scalability


Pop Quiz
In 27 words or less:

• Define:
– CMM
– Defect Rate
– ISO 9000
– Spiral Model
– DPP

• What are the advantages of doing a Malcolm Baldrige Quality assessment?


Review

• CMM

• Waterfall Model
– High-Level vs. Low-Level Design
– Unit vs. Component Testing

• Prototyping

• Spiral Model


Software Quality Engineering
CS410

Class 3

DPP, Process Maturity, Quality Standards


Defect Prevention Process
• A process for continually improving a software development process

• It is not a software methodology or model

• Three main steps:
– Analyze existing defects or errors to trace the root cause
– Suggest preventive actions to eliminate the defect root cause
– Implement the preventive actions


Defect Prevention Process
• Four key elements:

– Causal analysis meeting - analyze defects for each stage of the development process, identify root cause, identify prevention actions, look for trends

– Action team - prioritize and implement action items

– Stage kickoff meeting - level setting, emphasis on quality improvement through action items and process improvements, pitfalls identified and discussed


Defect Prevention Process (cont.)
– Action tracking and data collection - record all problems, root causes, and actions (and action status); a record sketch follows below

• DPP should be done at every stage (unlike a postmortem, which comes at the end of the entire project)

• DPP addresses ISO 9000 corrective action

• DPP can be used with any development model, but may be more effective with some (e.g., iterative)
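Action tracking needs little more than a structured record per defect, linking root cause to action status. A minimal Python sketch; the field values are invented examples:

from dataclasses import dataclass, field

@dataclass
class DefectRecord:
    # One row in a DPP action-tracking database.
    defect_id: str
    stage: str            # development stage where the defect was found
    root_cause: str       # outcome of the causal analysis meeting
    actions: list = field(default_factory=list)   # (action, status) pairs

    def open_actions(self):
        return [a for a, status in self.actions if status != "done"]

rec = DefectRecord("D-102", "Code", "ambiguous interface spec",
                   [("update HLD template", "done"),
                    ("add interface review checklist", "open")])
print(rec.open_actions())   # ['add interface review checklist']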


Process Maturity Frameworks and Quality Standards

• Attempt to measure the implementation maturity of a software development process (i.e., how well it is executed) and analyze the quality management systems in place

• Capability Maturity Model (CMM)
• Software Productivity Research (SPR) assessment
• Malcolm Baldrige process/assessment
• ISO 9000


Capability Maturity Model
• Developed by the Software Engineering Institute (SEI) at Carnegie Mellon University

• A (process) maturity assessment framework

• Based on an 85-item questionnaire

• Five levels of process maturity (pp. 40-41):
1. Initial
2. Repeatable
3. Defined
4. Managed
5. Optimizing


SPR Assessment (p. 43)

• Developed by Software Productivity Research (SPR) Inc.

• Based on a 400-question questionnaire

• Questions are rated 1-5, and the overall process rating is expressed on a 1-5 scale

• Questions focus on:
– Strategic corporate issues
– Tactical project issues

• Quality, Productivity, Customer satisfaction


Malcolm Baldrige Assessment
• Malcolm Baldrige National Quality Award (MBNQA) - "most prestigious quality award in the United States"

• Seven categories of examination criteria:
– Leadership

– Information and analysis

– Strategic quality planning

– Human resource utilization

– Quality assurance of products and service

– Quality results

– Customer Satisfaction


Malcolm Baldrige Assessment (cont.)
• 28 examination items
• 1000 possible points awarded
• Scoring is based on three evaluation dimensions:

– Approach - methods used to address the item

– Deployment - extent to which the approach is applied

– Results - outcomes and effects

• Winners need to score 70% or better to be considered

• Evaluation feedback/suggestions are provided regardless of score
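With 28 items worth 1000 points in total and a 70% bar for contention, the score roll-up might look like the sketch below. The per-item point weights here are invented for illustration; the real examination assigns specific points to each item:

# Hypothetical roll-up of Baldrige-style item scores.
# Each item: (max points, fractional score from Approach/Deployment/Results).
items = [
    (90, 0.80),   # e.g., a Leadership item -- weights are illustrative
    (75, 0.65),   # e.g., an Information and Analysis item
    (60, 0.90),   # e.g., a Quality Results item
    # ... remaining items would bring the possible total to 1000
]

earned = sum(max_pts * pct for max_pts, pct in items)
possible = sum(max_pts for max_pts, _ in items)
print(f"{earned:.0f}/{possible} = {earned / possible:.0%}")
print("contender" if earned / possible >= 0.70 else "below the 70% bar")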


ISO 9000

• International Organization for Standardization
– Standards 9000, 9001, 9002, 9003

• A set of standards for quality assurance

• Heavy influence in Europe

• Focus:
– Process quality
– Process implementation


ISO 9000

• Goal - to achieve certification/registration

• A formal audit is performed

• Trial runs can be performed by independent consulting firms - the goal is to get feedback and suggestions

• First-time failure rate is 60%-70%

• 20 elements in the guideline (p. 46)

• Heavy emphasis on documentation (p. 47)


ISO 9000

• ISO focuses on metrics (statistical techniques); a tracking sketch follows the goals below

• Product Metrics Goals:
– Collect data and report values on a regular basis

– Identify current metric performance level

– Take action if performance level is unacceptable

– Establish improvement goals

• Process Metrics Goals:
– Determine if quality objectives are being met

– Track adherence to process model and methods

– Track defect prevention effectiveness
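The product-metrics loop above - collect, report, compare to an acceptable level, act, and set improvement goals - reduces to a few lines. A sketch; the metric, threshold, and sample data are assumptions for illustration:

# Product-metric loop: collect, report, compare to the acceptable
# level, and flag when action is needed (per the ISO goals above).
ACCEPTABLE_DEFECT_RATE = 0.5   # assumed threshold: defects per KLOC

def report(period, defects, kloc):
    rate = defects / kloc
    status = "OK" if rate <= ACCEPTABLE_DEFECT_RATE else "ACTION NEEDED"
    print(f"{period}: {rate:.2f} defects/KLOC [{status}]")
    return rate

history = [report(p, d, k) for p, d, k in
           [("2024-Q1", 12, 30.0), ("2024-Q2", 9, 28.0)]]  # sample data

# Improvement goal: beat the current average by an assumed 10%.
goal = sum(history) / len(history) * 0.9
print(f"improvement goal: {goal:.2f} defects/KLOC")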
