
Best Practices for Software Process and Product Measurement

Dr. Bill Curtis
Executive Director, CISQ
SVP & Chief Scientist, CAST Research Labs

Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213

Instructor: Dr. Bill Curtis

Dr. Bill Curtis is Senior Vice President & Chief Scientist of CAST Software, where he heads CAST Research Labs. CAST is a leading provider of technology for measuring the structural quality of business applications. He leads the production of CAST's annual CRASH Report on the global state of structural quality in business applications. He is also the Director of the Consortium for IT Software Quality (CISQ), a Special Interest Group of the Object Management Group (OMG), where he leads an international effort to establish standards for automating measures of software quality characteristics. Dr. Curtis is a Fellow of the Institute of Electrical and Electronics Engineers (IEEE) for his career contributions to software process improvement and measurement.

Dr. Curtis is a co-author of the Capability Maturity Model for Software (CMM), the People CMM, and the Business Process Maturity Model. He was also a member of the Product Suite Team for CMMI. He is a globally recognized expert in software measurement, process improvement, and workforce development, and has been a leader in the application of statistics to software engineering. He was the Chief Process Officer at Borland Software Corporation. Until its acquisition by Borland, he was a co-founder and Chief Scientist of TeraQuest in Austin, Texas, the premier provider of CMM-based services. He is a former Director of the Software Process Program in the Software Engineering Institute at Carnegie Mellon University. Prior to joining the SEI, Dr. Curtis directed research on advanced user interface technologies and the software design process at MCC, the US's fifth-generation computer research center in Austin, Texas. Prior to MCC, he developed a global software productivity and quality measurement system at ITT's Programming Technology Center. Prior to ITT, he evaluated software development methods in GE Space Division, worked in organizational effectiveness at Weyerhaeuser, and taught statistics at the University of Washington. He has co-authored four books, is on the editorial boards of several journals, and has published over 150 papers on software development and management. Dr. Curtis earned a PhD from Texas Christian University and an MA from The University of Texas. He earned his BA from Eckerd College in mathematics, psychology, and theater.

Agenda

I.   Measuring Software Size and Productivity ... Break
II.  Adopting Measures to Development Methods ... Lunch
III. Measuring Software Improvement Programs ... Break
IV.  Measuring Software Product Quality ... Adjourn
V.   References

© 2017. The presentation material in these tutorial notes is copyrighted by Dr. Bill Curtis. It is to be used for the benefit of the individuals attending this tutorial. It may not be copied or distributed to others without written permission from Dr. Curtis. For further information, please contact:

Dr. Bill Curtis, PO Box 126079, Fort Worth, Texas 76126-0079, USA
curtis@acm.org

® Capability Maturity Model, Capability Maturity Model Integration, People Capability Maturity Model, CMM, and CMMI are registered in the US Patent and Trademark Office.

Section 1: Measuring Software Size and Productivity

1. Practical software measurement
2. Size and functional measures
3. Measuring software productivity
4. Analyzing software productivity

Why Does Measurement Languish?

• Unreliable data
• Too many measures
• Poorly matched to process maturity
• Badly defined measures
• Use in personal appraisals

Measurement Process Guidance

• Based on decades of software measurement best practice
• Best guidance for starting a measurement program
• Compatible with ISO 15939, Software Measurement Process
• Free guides & licensed training and consulting available
• http://psmsc.com

McGarry et al. (2002), Practical Software Measurement

Measurement Categories

Category              | Issue                                          | Example measure
Schedule & Progress   | Milestone completion; actual vs. planned dates | Earned value; actual vs. planned completions
Resources & Cost      | Personnel effort; facilities used              | Person-hours; test equipment & hours
Product Size          | Size of coded product; size of documentation   | Function Points; pages of manuals
Product Quality       | Functional quality; structural quality         | Defects per KLOC; violations of reliability rules
Process Performance   | Efficiency; compliance                         | Defect detection efficiency; audit results
Product Value         | Return on investment; risk avoidance           | ∆ increase in target revenue; cost of system outage
Customer Satisfaction | Account growth; self-sufficiency               | Repeat business within 1 year; number of helpdesk calls

McGarry et al. (2002), Practical Software Measurement, p. 37


Productivity Analysis Objectives

• Improvement
• Estimation
• Benchmarking
• Managing vendors

Productivity Analysis Measures

Primary measures:
• Size: instructions, functions, requirements
• Effort: hours, roles, phases

Adjustment measures:
• Quality: functional, structural, behavioral
• Demographics: application, project, organization

Software Productivity

Software Productivity = Size of software produced / Total effort expended to produce it

Release Productivity = Size of software developed, deleted, or modified / Total effort expended on the release
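The two ratios can be written as a pair of small functions; the function names and sample figures (240 function points over 9,600 person-hours) are illustrative, not from the tutorial.

```python
def software_productivity(size_produced, effort_hours):
    """Size of software produced / total effort expended to produce it."""
    return size_produced / effort_hours

def release_productivity(size_developed, size_deleted, size_modified, release_effort_hours):
    """Size of software developed, deleted, or modified / total effort on the release."""
    return (size_developed + size_deleted + size_modified) / release_effort_hours

# Illustrative figures: 240 function points produced in 9,600 person-hours
print(software_productivity(240, 9600))          # FPs per person-hour
print(release_productivity(150, 20, 70, 9600))   # counts deleted and modified work too
```

Note that release productivity credits deleted and modified functionality, which plain size-over-effort silently ignores.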

Software Size Measures

Instructions (Lines of Code): Most frequently used. Different definitions of a line can cause counts to vary by 10x. Smaller programs often accomplish the same functionality with higher-quality coding.

Requirements-based (Use Case Points, Story Points): Use Case Points have not become widely used and need more development. Story points are subjective to each team and are susceptible to several forms of bias.

Functions (Function Points): Popular in IT. Several counting schemes exist (IFPUG, NESMA, Mark II, COSMIC, etc.). Manual counting is expensive and subjective; certified counters can differ by 10%. Automated FPs are taking root.

How Many Lines of Code?

#define LOWER 0     /* lower limit of table */
#define UPPER 300   /* upper limit */
#define STEP  20    /* step size */

main()  /* print a Fahrenheit-Celsius conversion table */
{
    int fahr;
    for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
        printf("%4d %6.1f\n", fahr, (5.0/9.0)*(fahr-32));
}

[Histogram: number of votes for each count of the lines of code in this program; answers varied widely. Park (1992)]

Function Point Estimation

[Scatter plot: unadjusted Function Points (0-8,000) vs. EIs + EOs (0-1,000); regression y = 7.79x + 43.50, R² = .95]

A functional view of software: entries, exits, reads, and writes against a data store.

• Functional size can be estimated from external inputs (EIs) and outputs (EOs)
• Upfront functional analysis provides a basis for good estimates
• A repository of FP data provides a basis for calibrating estimates

Ebert & Dumke (2007), Software Measurement, p. 188
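A regression of this kind can be reproduced with ordinary least squares; the calibration points below are fabricated to illustrate the fit (they lie on y = 7.8x + 40), not Ebert & Dumke's data.

```python
def fit_line(xs, ys):
    """Ordinary least squares: returns (slope, intercept) for y = slope*x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical calibration data: (EIs + EOs, counted unadjusted function points)
eis_eos = [100, 250, 400, 600, 900]
ufp     = [820, 1990, 3160, 4720, 7060]

slope, intercept = fit_line(eis_eos, ufp)
ufp_estimate = slope * 500 + intercept   # estimate for a new app with 500 EIs + EOs
```

A real calibration repository would also report R² so teams know how much trust to place in early estimates.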

Effort: the Weakest Measure

Effort data is unreliable and inconsistent:

• After-the-fact estimates: memory lapses, time-splicing, inconsistency
• Under-reporting: contract issues, HR issues, impressions
• Lack of normalization: roles included, phases included, hours in a person-year

How Quality Affects Productivity

Assumption: productivity is a stable number.
Reality: productivity is unstable, and tends to decline.

Incremental increases in technical debt drive a continuing decrease in productivity below the original baseline, unless you take action.

Carry-forward Rework

Release N: Develop N + Rework N. Unfixed defects from release N escape into release N+1.
Release N+1: Develop N+1 + Rework N+1 + Rework N+0. Unfixed defects from releases N and N+1 escape into release N+2.
Release N+2: Develop N+2 + Rework N+2 + Rework N+1 + Rework N+0.

Rework on defects carried forward from earlier releases consumes a growing share of each release's effort, squeezing out new development.

Example of Quality Impact

Project A (Plodders):
• 20 developers, 3 months; $120k per FTE
• 3 FPs per staff-month; 180 FPs delivered → $3,333/FP cost
• 2 critical violations per FP; $500 per fix; cost for 360 fixes = $180k
• Total Cost to Own = $780k → $4,333/FP of TCO

Project B (Better, Faster, Cheaper):
• 20 developers, 3 months; $120k per FTE
• 4 FPs per staff-month; 240 FPs delivered → $2,500/FP cost
• 5 critical violations per FP; $500 per fix; cost for 1,200 fixes = $600k
• Total Cost to Own = $1,200k → $5,000/FP of TCO

On delivered cost per FP, Project B is 25% more productive. However, on total cost of ownership per FP, Project A is 13.4% more productive.
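The slide's arithmetic can be replayed with a short helper; the function name and parameters are ours, the figures are the slide's.

```python
def cost_per_fp(devs, months, annual_cost_per_fte, fps_per_staff_month,
                critical_violations_per_fp, cost_per_fix):
    """Return ($/FP for development only, $/FP for total cost to own).
    TCO adds the future cost of fixing critical violations left in the code."""
    staff_months = devs * months
    dev_cost = staff_months * annual_cost_per_fte / 12
    fps = fps_per_staff_month * staff_months
    fix_cost = fps * critical_violations_per_fp * cost_per_fix
    return dev_cost / fps, (dev_cost + fix_cost) / fps

# Project A ("plodders") vs. Project B ("better, faster, cheaper")
a_dev, a_tco = cost_per_fp(20, 3, 120_000, 3, 2, 500)   # ~$3,333/FP dev, ~$4,333/FP TCO
b_dev, b_tco = cost_per_fp(20, 3, 120_000, 4, 5, 500)   # $2,500/FP dev, $5,000/FP TCO
```

The ranking flips once fix costs are included, which is the point of quality-adjusting productivity.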

Quality-Adjusted Productivity

Size: Automated Enhancement Points capture the size of both functional and non-functional code segments.

Effort & cost: Automated Technical Debt captures the corrective effort in future releases for defects injected in this release.

Quality-adjusted productivity must add the future effort to fix bugs into productivity calculations. It feeds estimation, benchmarks, value & ROI, etc.

Best Practices for Analysis

1) Segment baselines
2) Beware sampling effects
3) Understand variation
4) Evaluate demographics
5) Investigate distributions
6) Account for maturity effects
7) Beware external data sets

1. Evaluate Demographics

[Chart: productivity (in Function Points) by application demographic: routine reports, COBOL apps, web-based apps, custom ERP, innovative mobile]

2. Segment Baselines

Segment                 Year   Projects   Productivity
Total Corporate         1981      28         2342
                        1980      21         1939
Telecommunications      1981      14         1811
                        1980      12         1458
Engineering & Defense   1981       8         2965
                        1980       6         2739
Business Applications   1981       6         3054
                        1980       3         1813

Multiple baselines are usually the most valid.

3. Beware Sampling Effects

[Chart: lines of code per person-year (0-2,500) across 1980, 1981, and 1982 baselines. A baseline containing lots of small projects differs sharply from one containing lots of large projects.]

4. Understand Variation

Product factors (R² = .16): timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size.

Project factors (R² = .49): concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation.

Full model R² = .65; 35% of the variation in productivity is not explained.

Vosburgh, Curtis et al. (1984), Proceedings of ICSE 1984

5. Investigate Distributions

[Chart: relative productivity (% of mean, -100% to +250%) at low, medium, and high usage of software engineering practices]

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution with External Data Sets

[Chart: productivity vs. size, an external data set (DACS) vs. us]

Before benchmarking against an external data set, question its data definitions, data collection, data validity, data age, and demographics.

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Measuring Project Progress

[Chart 1: planned vs. actual code growth (0K-300K), Jan-Jun, with planned completion of coding] "The measures say we are on schedule."

[Chart 2: same axes; actual code growth keeps rising past the planned completion of coding] "But the code keeps growing. We should have measured the requirements growth."

[Chart 3: defect reports (total, open, closed; 0-150), Jan-Jun] "And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: person-hours of work or rework caused by requirements added, changed, and deleted, by number of requirements affected]

Phase Defect Removal Model

All defect densities are per KSLOC.

Phase        Escapes in   Injected   Subtotal   Removal eff.   Removed   Escapes at exit
Rqmts            0           1.2        1.2         .00           0           1.2
Design           1.2        18.0       19.2         .76          14.6         4.6
Code & UT        4.6        15.4       20.0         .71          14.2         5.8
Integration      5.8         0          5.8         .67           3.9         1.9
System Test      1.9         0          1.9         .58           1.1         0.8
Operation        0.8

Kan (1995), Metrics and Models in Software Quality Engineering
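The table's arithmetic can be replayed as a small model: each phase adds its injected defects to the escapes from the phase before, removes a fraction equal to its removal effectiveness, and passes the rest on. Densities are taken from the table above; the function name is ours.

```python
def phase_defect_model(injected, effectiveness):
    """Propagate defect densities (per KSLOC) through development phases.
    injected[i]   -- defects injected during phase i
    effectiveness[i] -- fraction of present defects removed in phase i
    Returns a list of (subtotal, removed, escapes) per phase."""
    escapes, rows = 0.0, []
    for inj, eff in zip(injected, effectiveness):
        subtotal = escapes + inj          # escapes from prior phase plus new injections
        removed = subtotal * eff          # removed by this phase's reviews/tests
        escapes = subtotal - removed      # passed on to the next phase
        rows.append((subtotal, removed, escapes))
    return rows

# Rqmts, Design, Code & UT, Integration, System Test (from the table)
rows = phase_defect_model([1.2, 18.0, 15.4, 0.0, 0.0],
                          [0.0, 0.76, 0.71, 0.67, 0.58])
final_escapes = rows[-1][2]   # defects per KSLOC escaping into operation (~0.8)
```

Raising any single phase's effectiveness in the model shows how escapes into operation shrink, which is what the tutorial's improvement sections quantify.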

Scrum Board

• The Scrum board displays the path to completion for each story by showing task status
• Story points are an estimate of days for each story
• Agile estimating is by feature (story) rather than by task
• "Done" does not always mean all tasks are finished
• Kanban methods restrict the work in progress to a limited number of paths

Burndown Chart and Velocity

• A burndown chart displays story completion by day
• A burndown chart indicates actual versus estimated progress
• Velocity is the rate at which stories are completed
• Velocity indicates sustainable progress
• Velocity results provide one source of historical data for improving estimates
• Story points without the context of productivity factors are dangerous for estimating
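Velocity-based forecasting can be sketched in a few lines; the sprint history is invented, and the naive forecast assumes velocity stays stable, which the last bullet above warns against taking for granted.

```python
import math

def velocity(history):
    """Velocity = average story points completed per sprint, from historical data."""
    return sum(history) / len(history)

def sprints_remaining(backlog_points, history):
    """Naive forecast: remaining backlog divided by historical velocity, rounded up."""
    return math.ceil(backlog_points / velocity(history))

history = [21, 25, 18, 24]               # hypothetical: points completed in the last 4 sprints
print(velocity(history))                 # 22.0 points per sprint
print(sprints_remaining(110, history))   # 5 sprints to burn down a 110-point backlog
```

A more honest forecast would carry a range (e.g., best and worst historical velocity) rather than a single number.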

Team Velocity

[Charts over a 20-day period, for analyzing velocity: burndown chart (story points remaining by day), story points per hour, hours per story point, and item cycle times (stories, hours, hours per story point)]

Measuring and Managing Flow

[Cumulative Flow Diagram: backlogged, in-progress, and released items over time; the horizontal gap between bands shows days to delivery, and outliers stand out]

Anderson (2010), Kanban

Trends Over Sprints

[Charts: story points delivered (0-80) and technical debt (0-4) across 12 sprints]

Track and correlate multiple measures across sprints to build predictive models.

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects.

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring.

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process:
• PSP0: current process, time recording, defect recording, defect type standards
• PSP0.1: coding standard, size measurement, process improvement proposal

Personal Planning:
• PSP1: size estimating, test report
• PSP1.1: task planning, schedule planning

Personal Quality Management:
• PSP2: code reviews, design reviews
• PSP2.1: design templates

Cyclic Personal Process:
• PSP3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Is Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates (R²s range from .70 to .87), rather than developing a single team estimate averaged over individuals (R² = .53).

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

[Chart: where defects are found: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%]

Fagan & Davis (2005)

TSP Benefits

Dramatic reductions in:
• Delivered defects
• Days per KLOC
• Schedule deviation
• Test defects/KLOC
• Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5, Optimizing: innovation management
Level 4, Quantitatively Managed: capability management
Level 3, Defined: process management
Level 2, Managed: project management
Level 1, Initial: inconsistent management

Chrissis, Konrad & Shrum (2011), CMMI

Enhanced Maturity Framework

Level 5, Innovating: innovation management
Level 4, Optimized: capability management
Level 3, Standardized: organizational management
Level 2, Stabilized: work unit management
Level 1, Initial: inconsistent management

Maturity Level Transformation

Level 1: Ad hoc, inconsistent development practices (relationship culture)
Level 2: Project managers stabilize projects (local culture)
Level 3: Organization develops standard processes (common culture)
Level 4: Process and results managed statistically (precision culture)
Level 5: Proactive, opportunistic improvements (empowered, agile culture)

The focus of improvement moves from individuals, through projects, to the organization as maturity increases.

Measurement by Level

Level 5, exploratory data: ∆t = N × ROI
Level 4, predictive data: X̄ ± 1.96σ/√n
Level 3, process data: 8.3 defects per review
Level 2, management data: Effort = α·LOC^β
Level 1, haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
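The Level 4 interval above can be computed directly; a minimal sketch with made-up review data (the sample values are ours, and the normal approximation is assumed).

```python
import math

def ci95(samples):
    """95% confidence interval for the mean: x-bar +/- 1.96*s/sqrt(n),
    using the sample standard deviation and a normal approximation."""
    n = len(samples)
    mean = sum(samples) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))
    half = 1.96 * s / math.sqrt(n)
    return mean - half, mean + half

# Hypothetical Level 3/4 process data: defects found in 8 design reviews
low, high = ci95([7, 9, 8, 10, 8, 7, 9, 8])
```

An observation falling outside such an interval is a prompt to investigate, which is the essence of Level 4 capability management.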

Average Improvement Results

Improvement benefit           Orgs   Annual median   Annual range
Productivity growth             4        35%            9-67%
Pre-test defect detection       3        22%            6-25%
Time to market (reduction)      2        19%           15-23%
Field error reports (reduction) 5        39%           10-94%
Return on investment            5        5.0:1        4:1-8.8:1

Schedule Performance

[Chart: schedule performance index (0-1.6) by CMM maturity level, levels 1-3; Air Force study of contractors]

Schedule Performance Index = budgeted cost of work performed / budgeted cost of work scheduled (< 1 behind schedule, > 1 ahead)

Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (0-2.5) by CMM maturity level, levels 1-3; Air Force study of contractors]

Cost Performance Index = budgeted cost of work performed / actual cost of work performed (< 1 overrun, > 1 underrun)

Flowe & Thordahl (1994)
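Both earned-value indices are simple ratios; the dollar figures below are illustrative, not from the study.

```python
def spi(bcwp, bcws):
    """Schedule Performance Index = BCWP / BCWS (< 1 means behind schedule)."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost Performance Index = BCWP / ACWP (< 1 means cost overrun)."""
    return bcwp / acwp

# Illustrative project: $400k of planned work earned, $500k scheduled, $450k spent
print(spi(400, 500))   # 0.8  -> behind schedule
print(cpi(400, 450))   # ~0.89 -> over budget
```

Note both indices share the same numerator (earned value), so a project can be behind schedule yet under budget, or vice versa.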

Crosby's Cost of Quality Model

Project cost divides into performance cost and cost of quality; cost of quality divides into conformance (prevention + appraisal) and nonconformance.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Appraisal: reviews (system, requirements, design, test plan, test scripts), walkthroughs, testing (first time), independent verification and validation (first time), audits.

Prevention: training, policies, procedures and methods, tools, planning, quality improvement projects, data gathering and analysis, root cause analysis, quality reporting.

Nonconformance: fixing defects, reworking any document, updating source code, re-reviews, re-tests, lab costs for re-tests, patches (internal and external), engineering changes, change control boards, external failures and fines, customer support, help desks.

Crosby (1979), Quality Is Free; Dion (1993)

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   Level   Performance   Nonconformance   Appraisal + Prevention
1988     1        34%             41%             22% (15% + 7%)
1990     2        55%             18%             27% (15% + 12%)
1992     3        66%             11%             23%
1994     4        76%              6%             18%

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988-1994; scale 0-4]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988-1994; scale 0-1.2]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: quarterly budgeted vs. actual cost, 1988-1994; over-runs of up to 40-50% shrink toward zero as maturity increases]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0-3,000) and % of reuse (0-100%), 1987-1994]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." (Watts Humphrey)

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus: process improvement (Six Sigma). Product focus: product improvement (Design for 6σ). Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first huge benefits, then large, good, okay, and small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost." (Jackson, 2009)

"The correctness of the code is rarely the weakest link." (Jackson, 2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." (Spinellis, 2006)

CAST's Application Intelligence Platform

Application analysis evaluates 1,200+ coding and architectural rules, plus application meta-data, producing quality measurements for Transferability, Changeability, Robustness, Performance, and Security.

Examples of detected violations:
• Expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, NetWeaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

Level 1, Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards, within a single language/technology layer.

Level 2, Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities, within one technology (e.g., Java, JSP, ASP.NET, APIs).

Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function points, effort estimation, data access control, SDK versioning, and calibration across technologies. Data flow and transaction risk span the full stack (e.g., EJB, Hibernate, Spring, Struts, .NET, COBOL, PL/SQL, T-SQL, Oracle, SQL Server, DB2, Sybase, IMS, messaging).

Architecturally Complex Defects

An architecturally complex defect is a structural flaw involving interactions among multiple components that reside in different application layers.

• They are the primary cause of operational problems and take 20x as many fixes to correct.
• They account for only 8% of total application defects but 52% of total repair effort; code unit-level violations account for the other 92% of defects and 48% of effort.
• Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system; it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated; it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
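One way to approximate such an index is severity weighted by reach in the dependency graph; the graph, severities, and scoring rule below are illustrative assumptions, not CAST's actual algorithm.

```python
from collections import deque

def reach_count(graph, start):
    """Number of components transitively affected by `start` (breadth-first search)."""
    seen, queue = {start}, deque([start])
    while queue:
        for nbr in graph.get(queue.popleft(), ()):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return len(seen) - 1  # exclude the starting component itself

def propagated_risk_index(violations, graph):
    """Rank violation ids by severity x breadth of impact, highest risk first."""
    scored = [(sev * reach_count(graph, comp), vid) for vid, comp, sev in violations]
    return [vid for score, vid in sorted(scored, reverse=True)]

# Hypothetical system: edges point from a component to the components it affects
graph = {"B": ["C", "D", "E"], "C": ["F"], "A": ["C"], "D": [], "E": []}
violations = [("342", "A", 7), ("284", "B", 7)]   # (violation id, component, severity)
ranking = propagated_risk_index(violations, graph)  # "284" outranks "342"
```

Even with equal severities, the violation in the widely-depended-on component ranks first, matching the slide's argument.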

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

• Customers care most about the dependability of their transactions.
• The risk of a transaction is determined by the number and severity of violations along its path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.
• Transactions can be ranked by risk to prioritize their remediation.
• The Transaction Risk Index can be further tuned using the operational frequency of each transaction.
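A ranking of this kind can be sketched by summing violation severities along each transaction path, optionally weighted by operational frequency; the transaction names and severities are invented for illustration.

```python
def transaction_risk_index(transactions, frequency=None):
    """Rank transaction names by total violation severity along their paths,
    optionally weighted by how often each transaction runs in operation."""
    freq = frequency or {}
    scores = {t: sum(sevs) * freq.get(t, 1) for t, sevs in transactions.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical transactions with the severities of violations on their paths
txns = {"pay_invoice": [9, 7, 5], "view_balance": [3], "transfer": [8, 4]}

print(transaction_risk_index(txns))                        # unweighted ranking
print(transaction_risk_index(txns, {"view_balance": 10}))  # frequency changes priorities
```

The second call shows the tuning step from the slide: a low-severity transaction executed very often can still deserve early remediation.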

Risk-Based Prioritization

Violation   Health Factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3

Violations are prioritized by severity and Propagated Risk Index, with flags for violations in a high-risk transaction and for frequency of path execution.

Static Analysis in the US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

In a study of structural quality measures and maintenance effort across 20 customers of a large global system integrator, an increase of 24% in the Total Quality Index (TQI) decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter: production incidents (0-180) vs. Technical Quality Index (1.00-4.00) for business-critical applications at a large international investment bank; annual losses fall from over $2,000,000 for the lowest-quality applications to $50,000 and then $5,000 as the index rises]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor                 Apps    Robust   Security   Perform   Change   Transfer
Technology (Java-EE)   1606      20        3         16        26        7
Industry                677       5        5          7         4        6
App type                510       1        1          1         1        1
Source                  580       1        1                     4
Shore                   540
Maturity                 72      28       25         15        24       12
Method                  208      10        6         14         6
Team size               186       7        5          8         8
# of users              262       4        3          5         4

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. The debt is the structural quality problems in production code.

• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Liability: business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
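Curtis et al. (2012) size the principal from violation counts, the share judged necessary to fix, and the cost of fixing them; the sketch below follows that shape, with every parameter value invented for illustration (not CISQ or CAST figures).

```python
def technical_debt_principal(violations_by_severity, pct_to_fix, hours_per_fix, rate_per_hour):
    """Principal = sum over severities of
    (violation count x share to be fixed x hours per fix x $ per hour)."""
    return sum(count * pct_to_fix[sev] * hours_per_fix[sev] * rate_per_hour
               for sev, count in violations_by_severity.items())

# Placeholder parameters for a hypothetical application
violations    = {"high": 200, "medium": 1500, "low": 4000}
pct_to_fix    = {"high": 0.5, "medium": 0.25, "low": 0.1}   # share worth remediating
hours_per_fix = {"high": 2.0, "medium": 1.0, "low": 0.5}
debt = technical_debt_principal(violations, pct_to_fix, hours_per_fix, 75)  # dollars
```

Because the parameters are explicit, an organization can recalibrate them from its own defect-fix history rather than accept anyone else's defaults.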

Tech Debt by Quality Factor

Transferability 40%, Changeability 30%, Robustness 18%, Security 7%.

70% of technical debt is in IT cost (Transferability, Changeability); 30% is in business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

Section 5: References

References (1)

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20(9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29(6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10(4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

117

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Boston: Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

118




Instructor: Dr. Bill Curtis

Dr. Bill Curtis is Senior Vice President and Chief Scientist of CAST Software, where he heads CAST Research Labs. CAST is a leading provider of technology for measuring the structural quality of business applications. He leads the production of CAST's annual CRASH Report on the global state of structural quality in business applications. He is also the Director of the Consortium for IT Software Quality (CISQ), a Special Interest Group of the Object Management Group (OMG), where he leads an international effort to establish standards for automating measures of software quality characteristics. Dr. Curtis is a Fellow of the Institute of Electrical and Electronics Engineers (IEEE) for his career contributions to software process improvement and measurement.

Dr. Curtis is a co-author of the Capability Maturity Model for Software (CMM), the People CMM, and the Business Process Maturity Model. He was also a member of the Product Suite Team for CMMI. He is a globally recognized expert in software measurement, process improvement, and workforce development, and has been a leader in the application of statistics to software engineering. He was the Chief Process Officer at Borland Software Corporation. Until its acquisition by Borland, he was a co-founder and Chief Scientist of TeraQuest in Austin, Texas, the premier provider of CMM-based services. He is a former Director of the Software Process Program in the Software Engineering Institute at Carnegie Mellon University. Prior to joining the SEI, Dr. Curtis directed research on advanced user interface technologies and the software design process at MCC, the US's fifth-generation computer research center in Austin, Texas. Prior to MCC, he developed a global software productivity and quality measurement system at ITT's Programming Technology Center. Prior to ITT, he evaluated software development methods in GE Space Division, worked in organizational effectiveness at Weyerhaeuser, and taught statistics at the University of Washington. He has co-authored four books, is on the editorial boards of several journals, and has published over 150 papers on software development and management. Dr. Curtis earned a PhD from Texas Christian University and an MA from The University of Texas. He earned his BA from Eckerd College in mathematics, psychology, and theater.

Agenda

I. Measuring Software Size and Productivity (slide 3), then Break
II. Adopting Measures to Development Methods (slide 33), then Lunch
III. Measuring Software Improvement Programs (slide 61), then Break
IV. Measuring Software Product Quality (slide 89), then Adjourn
V. References (slide 115)

© 2017. The presentation material in these tutorial notes is copyrighted by Dr. Bill Curtis. It is to be used for the benefit of the individuals attending this tutorial. It may not be copied or distributed to others without written permission from Dr. Curtis. For further information please contact:

Dr. Bill Curtis, PO Box 126079, Fort Worth, Texas 76126-0079, USA
curtis@acm.org

® Capability Maturity Model, Capability Maturity Model Integration, People Capability Maturity Model, CMM, and CMMI are registered in the US Patent and Trademark Office.

Section 1: Measuring Software Size and Productivity

1. Practical Software Measurement
2. Size and functional measures
3. Measuring software productivity
4. Analyzing software productivity

4

Why Does Measurement Languish?

Unreliable data

Too many measures

Poorly matched to process maturity

Badly defined measures

Use in personal appraisals

Measurement Process Guidance

• Based on decades of software measurement best practice
• Best guidance for starting a measurement program
• Compatible with ISO 15939, Software Measurement Process
• Free guides; licensed training and consulting available
• http://psmsc.com

6 McGarry et al (2002) Practical Software Measurement

Measurement Categories

Category                Issues                                    Example measures
Schedule & Progress     Milestone completion;                     Earned value;
                        actual vs. planned dates                  actual vs. planned completions
Resources & Cost        Personnel effort; facilities used         Person-hours; test equipment & hours
Product Size            Size of coded product;                    Function Points; pages of manuals
                        size of documentation
Product Quality         Functional quality; structural quality    Defects per KLOC; violations of reliability rules
Process Performance     Efficiency; compliance                    Defect detection efficiency; audit results
Product Value           Return on investment; risk avoidance      ∆ increase in target revenue; cost of system outage
Customer Satisfaction   Account growth; self-sufficiency          Repeat business within 1 year; number of helpdesk calls

McGarry et al. (2002). Practical Software Measurement, p. 37.

Productivity Analysis Objectives

• Estimation
• Benchmarking
• Improvement
• Managing vendors

Productivity Analysis Measures

Primary measures:
• Size: instructions, functions, requirements
• Effort: hours, roles, phases

Adjustment measures:
• Quality: functional, structural, behavioral
• Demographics: application, project, organization

Software Productivity

Software Productivity = Size of software produced / Total effort expended to produce it

Release Productivity = Size of software developed, deleted, or modified / Total effort expended on the release

Software Size Measures

Instructions (Lines of Code): most frequently used. Different definitions of a line can cause counts to vary by 10x. Smaller programs often accomplish the same functionality with higher-quality coding.

Requirements-based (Use Case Points, Story Points): Use Case Points have not become widely used and need more development. Story points are subjective to each team and are susceptible to several forms of bias.

Functions (Function Points): popular in IT. Several counting schemes (IFPUG, NESMA, Mark II, COSMIC, etc.). Manual counting is expensive and subjective; certified counters can differ by 10%. Automated FPs taking root.

How Many Lines of Code?

    #define LOWER 0    /* lower limit of table */
    #define UPPER 300  /* upper limit */
    #define STEP  20   /* step size */

    main() /* print a Fahrenheit-Celsius conversion table */
    {
        int fahr;
        for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
            printf("%4d %6.1f\n", fahr, (5.0/9.0)*(fahr-32));
    }

[Histogram: when asked how many lines of code this program contains, respondents' counts varied widely; number of votes plotted against number of lines counted.]

Park (1992)

Function Point Estimation

[Scatter plot: Unadjusted Function Points (0-8000) vs. EIs + EOs (0-1000), with regression y = 7.79x + 43.50, R² = .95.]

Functional view of software: entries and exits flow between the software and its data stores via reads and writes.

• Functional size can be estimated from external inputs and outputs
• Upfront functional analysis provides a basis for good estimates
• A repository of FP data provides a basis for calibrating estimates

Ebert & Dumke (2007). Software Measurement, p. 188.
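The regression above can be turned into a quick early-phase estimator. A minimal sketch, assuming the slope and intercept read off the chart (7.79 and 43.50 as reconstructed here); in practice you would recalibrate both from your own FP repository:

```python
def estimate_unadjusted_fp(eis_plus_eos, slope=7.79, intercept=43.50):
    """Estimate Unadjusted Function Points from the count of external
    inputs plus external outputs, using a linear fit of historical data."""
    return slope * eis_plus_eos + intercept

# An application with 500 external inputs + outputs
print(estimate_unadjusted_fp(500))  # ~3938.5 unadjusted function points
```

The point is the calibration discipline, not these particular coefficients: refit the line whenever new counted projects enter the repository.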


Effort: the Weakest Measure

Effort data is unreliable and inconsistent:

• After-the-fact estimates: memory lapses, time-splicing, inconsistency
• Under-reporting: contract issues, HR issues, impressions
• Lack of normalization: roles included, phases included, hours in a person-year

How Quality Affects Productivity

Assumption: productivity is a stable number.

Reality: productivity is unstable, tending to decline. Incremental increases in technical debt cause a continuing decrease from the original productivity baseline, unless you take action.

Carry-forward Rework

Release N: Develop N + Rework N; unfixed defects from release N carry forward.

Release N+1: Develop N+1 + Rework N+1 + Rework N+0; unfixed defects from releases N and N+1 carry forward.

Release N+2: Develop N+2 + Rework N+2 + Rework N+1 + Rework N+0.

Each release spends a growing share of its effort reworking defects left unfixed in earlier releases.

Example of Quality Impact

Project A ("Plodders"):
– 20 developers, 3 months, $120k per FTE
– 3 FPs per staff month; 180 FPs delivered, a $3,333/FP cost
– 2 critical violations per FP at $500 per fix; cost for 360 fixes = $180k
– Total Cost to Own = $780k, i.e. $4,333/FP of TCO

Project B ("Better, Faster, Cheaper"):
– 20 developers, 3 months, $120k per FTE
– 4 FPs per staff month; 240 FPs delivered, a $2,500/FP cost
– 5 critical violations per FP at $500 per fix; cost for 1,200 fixes = $600k
– Total Cost to Own = $1,200k, i.e. $5,000/FP of TCO

On delivered cost, Project B is 25% more productive. However, on total cost of ownership, Project A is 13.4% more productive.
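The arithmetic behind this comparison is easy to reproduce. A minimal sketch using the slide's numbers (function names are illustrative):

```python
def tco_per_fp(fps_delivered, dev_cost, violations_per_fp, cost_per_fix=500):
    """Total cost of ownership per FP = (development cost + future fix cost) / FPs."""
    fix_cost = fps_delivered * violations_per_fp * cost_per_fix
    return (dev_cost + fix_cost) / fps_delivered

# 20 developers for 3 months at $120k per FTE-year
dev_cost = 20 * (3 / 12) * 120_000          # $600k for either project

print(tco_per_fp(240, dev_cost, 5))          # Project B: 5000.0 $/FP of TCO
print(tco_per_fp(180, dev_cost, 2))          # Project A: ~4333.3 $/FP of TCO
```

Ranking flips once the deferred fix cost is counted: B wins on delivered cost per FP, A wins on cost of ownership per FP.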

Quality-Adjusted Productivity

Automated Enhancement Points: size of both functional and non-functional code segments.

Automated Technical Debt: corrective effort in future releases for defects injected in this release.

You must add the future effort to fix bugs into the effort and cost side of productivity calculations; the quality-adjusted result then feeds estimation, benchmarks, value and ROI analysis, etc.

Best Practices for Analysis

1) Evaluate demographics
2) Segment baselines
3) Beware sampling effects
4) Understand variation
5) Investigate distributions
6) Account for maturity effects
7) Beware external data sets

1. Evaluate Demographics

[Chart: productivity vs. Function Points for different application types — routine reports, COBOL apps, web-based apps, custom ERP, innovative mobile. Productivity comparisons are only meaningful within comparable demographic segments.]

2. Segment Baselines

                         Year   Projects   Productivity
Total Corporate          1981      28          2342
                         1980      21          1939
Telecommunications       1981      14          1811
                         1980      12          1458
Engineering & Defense    1981       8          2965
                         1980       6          2739
Business Applications    1981       6          3054
                         1980       3          1813

Multiple baselines are usually the most valid.

3. Beware Sampling Effects

[Chart: lines of code per person-year (0-2500) for the 1980, 1981, and 1982 baselines. A baseline dominated by lots of small projects differs sharply from one dominated by lots of large projects.]

4. Understand Variation

Variance in productivity explained: project factors 49%, product factors 16%, not explained 35% (R²product = .16, R²project = .49, R²model = .65).

Product factors: timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size.

Project factors: concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation.

Vosburgh, Curtis et al. (1984). Proceedings of ICSE 1984.

5. Investigate Distributions

[Chart: relative productivity (% of mean, from -100% to +250%) for low, medium, and high usage of software engineering practices.]

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution for External Data Sets

[Chart: productivity vs. size, an external data set (DACS) compared with our own.]

Before benchmarking against external data, check: data definitions, data collection, data validity, data age, demographics.

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

33

Measuring Project Progress

[Chart 1: planned vs. actual code growth (0K-300K, Jan-Jun) against the planned completion of coding.] "The measures say we are on schedule."

[Chart 2: the same chart later in the project; actual growth keeps climbing past the plan.] "But the code keeps growing. We should have measured the requirements growth."

[Chart 3: defect reports (total, open, closed) against the completion of coding.] "And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: person-hours of work or rework each month (Feb-Aug) for three series — requirements added, requirements changed, and requirements deleted — plotted against the number of requirements affected.]

Phase Defect Removal Model

PHASE         Escapes from      Defect        Subtotal   Removal          Defects    Escapes at
              previous phase    injection     /KSLOC     effectiveness    removed    phase exit
              /KSLOC            /KSLOC                                    /KSLOC     /KSLOC
Rqmts              0               1.2          1.2          0%              0          1.2
Design             1.2            18.0         19.2         76%             14.6        4.6
Code & UT          4.6            15.4         20.0         71%             14.2        5.8
Integration        5.8             0            5.8         67%              3.9        1.9
Sys Test           1.9             0            1.9         58%              1.1        0.8
Operation          0.8

Kan (1995). Metrics and Models in Software Quality Engineering.
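The table's propagation logic is simple to compute: each phase adds injected defects to the escapes from the phase before, then removes a fraction of that subtotal. A minimal sketch using the table's numbers:

```python
def phase_defect_model(phases):
    """Propagate defects/KSLOC through phases. Each phase injects defects,
    then removes removal_effectiveness of the running subtotal; the rest
    escapes to the next phase."""
    escapes = 0.0
    for name, injected, removal_effectiveness in phases:
        subtotal = escapes + injected
        removed = subtotal * removal_effectiveness
        escapes = subtotal - removed
    return escapes

phases = [("Rqmts",       1.2, 0.00),
          ("Design",     18.0, 0.76),
          ("Code & UT",  15.4, 0.71),
          ("Integration", 0.0, 0.67),
          ("Sys Test",    0.0, 0.58)]

print(round(phase_defect_model(phases), 1))  # 0.8 defects/KSLOC escape to operation
```

Rerunning the model with different removal effectiveness values shows where an extra review or test investment cuts field defects the most.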

SCRUM Board

• The Scrum board displays the path to completion for each story by showing task status
• Story points are an estimate of days for each story
• Agile estimating is by feature (story) rather than by task
• 'Done' does not always mean all tasks are finished
• Kanban methods restrict the work in progress to a limited number of paths

Burndown Chart and Velocity

• A burndown chart displays story completion by day, indicating actual versus estimated progress
• Velocity is the rate at which stories are completed; it indicates sustainable progress
• Velocity results provide one source of historical data for improving estimates
• Story points without the context of productivity factors are dangerous for estimating
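The velocity arithmetic above can be sketched in a few lines (names are illustrative; real teams would also track the spread, not just the average):

```python
def velocity(points_completed_per_sprint):
    """Average story points completed per sprint."""
    return sum(points_completed_per_sprint) / len(points_completed_per_sprint)

def sprints_remaining(backlog_points, recent_velocities):
    """Forecast remaining sprints from observed velocity rather than a
    single optimistic sprint."""
    return backlog_points / velocity(recent_velocities)

history = [30, 34, 26, 30]           # points completed in the last four sprints
print(velocity(history))             # 30.0 points per sprint
print(sprints_remaining(120, history))  # 4.0 sprints to burn a 120-point backlog
```

This is exactly the "historical data for improving estimates" the slide mentions: the forecast is only as good as the stability of the team's velocity.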

Analyzing Velocity

[Charts over 20 days: story points per hour; hours per story point; item cycle times (stories, hours, hours per story point); and the corresponding burndown chart.]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in-progress, and released items over time; the horizontal gap between the bands is days to delivery, with outliers visible.]

Anderson (2010). Kanban.
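The relationship the cumulative flow diagram makes visible is commonly summarized by Little's Law (a standard Kanban/queueing result, not stated on the slide): average time in process equals work in progress divided by throughput. A minimal sketch:

```python
def avg_cycle_time(avg_wip, throughput_per_day):
    """Little's Law: average days from 'in progress' to 'released'
    = average work in progress / average completions per day."""
    return avg_wip / throughput_per_day

# e.g. 12 items typically in progress, 3 items released per day
print(avg_cycle_time(12, 3))  # 4.0 days to delivery
```

This is why Kanban's WIP limits shorten delivery times: with throughput roughly fixed, cutting WIP cuts cycle time proportionally.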

Trends Over Sprints

[Charts: story points delivered per sprint (0-80 over 12 sprints) and technical debt per sprint (0-4).]

Track and correlate multiple measures across sprints; use them to build predictive models.

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects.

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring.

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process:
• PSP 0: current process, time recording, defect recording, defect type standards
• PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning:
• PSP 1: size estimating, test report
• PSP 1.1: task planning, schedule planning

Personal Quality Management:
• PSP 2: code reviews, design reviews
• PSP 2.1: design templates

Cyclic Personal Process:
• PSP 3: cyclic development

Humphrey (1997). Introduction to the Personal Software Process.

Team Software Process (TSP)

• Built from the personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000). Introduction to the Team Software Process.

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates (R²s range from .70 to .87), rather than developing a single team estimate averaged over individuals (R² = .53).

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%. Consequently, test changes from defect detection into correctness verification.

Fagan (2005)

TSP Benefits

Dramatic reductions in:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006). TSP: Leading a Development Team.

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

61

CMMI Maturity Transitions

Level 5 (Optimizing): innovation management
Level 4 (Quantitatively Managed): capability management
Level 3 (Defined): process management
Level 2 (Managed): project management
Level 1 (Initial): inconsistent management

Chrissis, Konrad & Shrum (2005). CMMI.

Enhanced Maturity Framework

Level 5 (Innovating): innovation management
Level 4 (Optimized): capability management
Level 3 (Standardized): organizational management
Level 2 (Stabilized): work unit management
Level 1 (Initial): inconsistent management

Maturity Level Transformation

Level 5: proactive, opportunistic improvements (empowered, agile culture)
Level 4: process and results managed statistically (precision culture)
Level 3: organization develops standard processes (common culture)
Level 2: project managers stabilize projects (local culture)
Level 1: ad hoc, inconsistent development practices (relationship culture)

[Diagram: the transformation builds from individual trust, through project and organizational discipline, to empowered improvement.]

Measurement by Level

Level 5: exploratory data, e.g. ∆t = N × ROI
Level 4: predictive data, e.g. mean ± 1.96σ/√n
Level 3: process data, e.g. 8.3 defects per review
Level 2: management data, e.g. Effort = a × LOC^b
Level 1: haphazard data, e.g. "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.

Average Improvement Results

Improvement benefit          Orgs   Annual median   Annual range
Productivity growth            4        35%            9-67%
Pre-test defect detection      3        22%            6-25%
Time to market                 2       ↓19%           15-23%
Field error reports            5       ↓39%           10-94%
Return on investment           5       5.0:1          4:1-8.8:1

Schedule Performance

[Chart: schedule performance index by CMM maturity level 1-3 (behind schedule below 1, ahead above 1), from an Air Force study of contractors.]

Schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled

Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index by CMM maturity level 1-3 (overrun below 1, underrun above 1), from an Air Force study of contractors.]

Cost performance index = budgeted cost of work performed / actual cost of work performed

Flowe & Thordahl (1994)
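The two earned-value indices used in these charts are simple ratios. A minimal sketch (dollar figures are illustrative):

```python
def spi(bcwp, bcws):
    """Schedule performance index = budgeted cost of work performed /
    budgeted cost of work scheduled. > 1 ahead of schedule, < 1 behind."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost performance index = budgeted cost of work performed /
    actual cost of work performed. > 1 under budget, < 1 overrun."""
    return bcwp / acwp

print(spi(900_000, 1_000_000))  # 0.9 -> behind schedule
print(cpi(900_000, 750_000))    # 1.2 -> under budget
```

The maturity-level charts plot exactly these ratios: both indices converge toward 1.0 (and their spread shrinks) as contractor maturity rises.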

Crosby's Cost of Quality Model

Project cost = performance cost + cost of quality (conformance + nonconformance).

Performance: generating plans and documentation; development (requirements, design, code, integration).

Conformance / Prevention: training, policies, procedures and methods, tools, planning, quality improvement projects, data gathering and analysis, root cause analysis, quality reporting.

Conformance / Appraisal: reviews (system, requirements, design, test plan, test scripts), walkthroughs, first-time testing, first-time independent verification and validation, audits.

Nonconformance: fixing defects, reworking any document, updating source code, re-reviews, re-tests, lab costs for re-tests, patches (internal and external), engineering changes, change control boards, external failures and fines, customer support, help desks.

Crosby (1979). Quality Is Free; Dion (1993).

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   Level   Performance   Nonconformance   Appraisal   Prevention
1988     1         34%            41%            15%          7%
1990     2         55%            18%            15%         12%
1992     3         66%            11%            23% (combined appraisal + prevention)
1994     4         76%             6%            18% (combined; appraisal 7.5%)

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline (scale 0-4), 1988-1994.]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline (scale 0-1.2), 1988-1994.]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: quarterly budgeted vs. actual cost, 1988-1994; early overruns of up to 40% converge toward budget as maturity increases.]

Haley (1995)

OMRON's Reuse Gains

[Chart: 1987-1994, program size in steps (0-3000) against percentage of reuse (0-100%); reuse rises steadily over the period.]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

89

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." (Watts Humphrey)

CMMI's product quality problem:
1. Assessments do not verify product quality
2. A compliance focus undermines CMMI's primary assumption
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI
4. The assumptions underlying control charts are violated by software data

Six Sigma's Challenge

Process focus: process improvement, Six Sigma. Product focus: product improvement, Design for Six Sigma. A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, then large, good, okay, small. Ultimately we run out of projects with significant enough benefits to continue the program. Now what? What is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

Jackson (2009); Spinellis (2006)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application metadata, producing quality measurements for five health factors: Robustness, Performance, Security, Changeability, Transferability.

Example detected violations (grouped here by the health factor they affect):
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, Javascript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

Level 1, Unit (developer level): a single language/technology layer; code style and layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2, Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function points, effort estimation, data access control, SDK versioning, calibration across technologies.

[Diagram: data flow and transaction risk traced across the stack — JSP, ASP.NET, and APIs at the top; Java, .NET, Struts, Spring, Hibernate, EJB, COBOL, and messaging in the middle; Oracle PL/SQL, SQL Server and Sybase T-SQL, DB2, and IMS at the data layer.]

Architecturally Complex Defects

Primary cause of operational

problems

20x as many

fixes to correct

Architecturally Complex Defect

A structural flaw involving interactions among multiple components that reside in different application layers

[Charts: architecturally complex defects are 8% of total app defects (code unit-level violations: 92%), but account for 52% of total repair effort (unit-level violations: 48%)]

Most architecturally complex defects involve an architectural hotspot ndash a badly-built component that should be replaced

Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost
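The Propagated Risk Index can be sketched as severity weighted by how far a violation's impact propagates through the dependency graph. The sketch below is illustrative only: the toy graph, the severity weights, and the multiplication are assumptions for exposition, not CAST's actual algorithm.

```python
from collections import deque

def impact_breadth(deps, component):
    """Count components transitively dependent on `component`
    (the breadth of impact if its violation misbehaves)."""
    # deps maps each component to the components that depend on it
    seen, queue = set(), deque([component])
    while queue:
        node = queue.popleft()
        for dependent in deps.get(node, []):
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return len(seen)

def propagated_risk_index(severity, deps, component):
    """Rank-order score: violation severity weighted by breadth of impact."""
    return severity * impact_breadth(deps, component)

# Component B sits deep in the call graph; A is a leaf dependency.
deps = {"B": ["X", "Y"], "X": ["Z", "W"], "Y": ["Z"], "A": ["Z"]}
pri_284 = propagated_risk_index(7, deps, "B")  # violation #284, widely propagated
pri_342 = propagated_risk_index(9, deps, "A")  # violation #342, narrowly propagated
```

Even with a lower severity, violation #284 outranks #342 because its impact reaches far more of the system, which is the slide's point about remediation priority.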

Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entry/exit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer
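The Transaction Risk Index described above can be sketched as the sum of violation severities along a transaction's path, optionally weighted by operational frequency. The paths and severities below are invented for illustration; only the shape of the calculation follows the slide.

```python
def transaction_risk_index(violations_on_path, frequency=1.0):
    """Sum of violation severities along the transaction path,
    optionally weighted by how often the transaction executes."""
    return frequency * sum(severity for _id, severity in violations_on_path)

# Hypothetical paths through the UI, business-logic, and data layers
t1 = [("148", 9), ("062", 8), ("284", 7)]   # entry -> logic -> storage
t2 = [("173", 4), ("359", 3)]
ranked = sorted([("T1", transaction_risk_index(t1)),
                 ("T2", transaction_risk_index(t2))],
                key=lambda pair: pair[1], reverse=True)
```

Sorting transactions by this score gives the remediation priority the slide describes; passing a measured `frequency` per transaction tunes the ranking further.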

Risk-Based Prioritization

Violation  Health Factor    Severity
#148       Security         9
#371       Robustness       9
#062       Performance      8
#284       Robustness       7
#016       Changeability    7
#241       Security         5
#173       Transferability  4
#359       Transferability  3

Prioritization also weighs each violation's Propagated Risk Index, whether it lies in a high-risk transaction, and the frequency of path execution.

Static Analysis in US DoD

John Keane (2013)

Testing finds <1/3 of structural defects

Static analysis before testing is 85% efficient

With testing and static analysis combined, test schedules are reduced by 50%

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

A TQI increase of 24% decreased corrective maintenance effort by 50%

Total Quality Index

Reducing Operational Losses

[Scatter plot: production incidents (0-180) vs. Technical Quality Index (1.00-4.00); annual losses fall from over $2,000,000 at low TQI to $50,000 and $5,000 at higher TQI]

Large international investment bank; business-critical applications

Factors Affecting Structural Quality

2017 CRASH Report (request from info@castsoftware.com)

Percentage of variation in structural quality scores accounted for by various factors:

Factor               | # apps | Robust | Security | Perform | Change | Transfer
Technology (Java-EE) |  1606  |   20   |    3     |   16    |   26   |    7
Industry             |   677  |    5   |    5     |    7    |    4   |    6
App type             |   510  |    1   |    1     |    1    |    1   |    1
Source               |   580  |        |          |    1    |    1   |    4
Shore                |   540  |        |          |         |        |
Maturity             |    72  |   28   |   25     |   15    |   24   |   12
Method               |   208  |   10   |          |    6    |   14   |    6
Team size            |   186  |    7   |          |    5    |    8   |    8
# of users           |   262  |    4   |          |    3    |    5   |    4

The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt: principal borrowed, plus interest on the debt
Business Risk: liability from the debt, plus opportunity cost

Principal: cost of fixing problems remaining in the code after release that must be remediated

Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Liability: business costs related to outages, breaches, corrupted data, etc.

Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership

Today's talk focuses on the principal

Curtis et al (2012) IEEE Software
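Curtis et al. (2012) estimate the principal from the violations retained for remediation, the effort to fix them, and the labor rate. A minimal sketch of that shape of calculation; the tier counts, fix percentages, hours per fix, and hourly rate below are all assumed for illustration, not figures from the paper.

```python
def technical_debt_principal(violations, pct_to_fix, hours_per_fix, rate):
    """Principal = violations kept for remediation x effort per fix x labor rate."""
    return violations * pct_to_fix * hours_per_fix * rate

# Illustrative severity tiers (all numbers are assumptions)
tiers = [
    dict(violations=200, pct_to_fix=0.50, hours_per_fix=2.5),  # high severity
    dict(violations=500, pct_to_fix=0.25, hours_per_fix=1.0),  # medium severity
    dict(violations=900, pct_to_fix=0.10, hours_per_fix=0.5),  # low severity
]
rate = 75  # assumed $ per hour
principal = sum(technical_debt_principal(t["violations"], t["pct_to_fix"],
                                         t["hours_per_fix"], rate)
                for t in tiers)
```

Summing over severity tiers keeps the estimate honest about the fact that only a fraction of detected violations will ever be scheduled for remediation.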

Tech Debt by Quality Factor

Transferability 40%, Changeability 30%, Robustness 18%, Security 7%

70% of Technical Debt is in IT Cost (Transferability, Changeability)

30% of Technical Debt is in Business Risk (Robustness, Performance, Security)

Health Factor proportions are mostly consistent across technologies

Curtis et al (2012) IEEE Software

Join CISQ: www.it-cisq.org

Section 5: References

115

References 1

Anderson DJ (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili V, Trendowicz A, Kowalczyk M, Heidrich J, Seaman C, Münch J & Rombach D (2014). Aligning Organizations Through Measurement. Heidelberg: Springer.

Boehm B (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm B et al (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis MB, Konrad M & Shrum S (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn M (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby P (1979). Quality Is Free. New York: McGraw-Hill.

Curtis B, Sappidi J & Szynkarski A (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion R (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker R (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne K (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe RM & Thordahl JB (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

116

References 2

Haley T, Ireland B, Wojtaszek E, Nash D & Dion R (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert C & Dumke R (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan E & Davis N (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton N (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus D & Herron D (2001). Function Point Analysis. Boston: Addison-Wesley.

George E (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb J, Carleton A, Rozum J, Siegel J & Zubrow D (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey W (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey W (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey W (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson D (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan SH (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni B (2000). IEEE Software, 16 (4).

117

References 3

Larman C (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon T (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry J et al (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk M, Weber C, Curtis B & Chrissis M (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman W (2000). IEEE Software, July/August issue.

Poppendieck M & Poppendieck T (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto K, Kishida K & Nakakoji K (1996). Cultural adaptation of the CMM. In Fuggetta A & Wolf A (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha M (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber K & Beedle M (2002). Agile Software Development with Scrum.

Spinellis D (2006). Code Quality. Addison-Wesley.

Van Epps S & Marshall R (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu JD (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort: Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3



Agenda

I Measuring Software Size and Productivity 3 Break

II Adopting Measures to Development Methods 33 Lunch

III Measuring Software Improvement Programs 61 Break

IV Measuring Software Product Quality 89 Adjourn

V References 115

© 2017. The presentation material in these tutorial notes is copyrighted by Dr. Bill Curtis. It is to be used for the benefit of the individuals attending this tutorial. It may not be copied or distributed to others without written permission from Dr. Curtis. For further information, please contact:

® Capability Maturity Model, Capability Maturity Model Integration, People Capability Maturity Model, CMM, and CMMI are registered in the US Patent and Trademark Office.

Dr. Bill Curtis, PO Box 126079, Fort Worth, Texas 76126-0079, USA

curtis@acm.org

Section 1: Measuring Software Size and Productivity

1 Practical Software Measurement

2 Size and functional measures

3 Measuring software productivity

4 Analyzing software productivity

4

Why Does Measurement Languish?

Unreliable data

Too many measures

Poorly matched to process maturity

Badly defined measures

Use in personal appraisals

Click to edit Master title style Measurement Process Guidance

• Based on decades of software measurement best practice

• Best guidance for starting a measurement program

• Compatible with ISO 15939 - Software Measurement Process

• Free guides & licensed training and consulting available

• http://psmsc.com

6 McGarry et al (2002) Practical Software Measurement

Measurement Categories

Category              | Issue                                         | Example measure
Schedule & Progress   | Milestone completion; Actual vs. planned dates | Earned value; Actual vs. planned completions
Resources & Cost      | Personnel effort; Facilities used             | Person-hours; Test equipment & hours
Product Size          | Size of coded product; Size of documentation  | Function Points; Pages of manuals
Product Quality       | Functional quality; Structural quality        | Defects per KLOC; Violations of reliability rules
Process Performance   | Efficiency; Compliance                        | Defect detection efficiency; Audit results
Product Value         | Return on investment; Risk avoidance          | ∆ Increase in target revenue; Cost of system outage
Customer Satisfaction | Account growth; Self-sufficiency              | Repeat business within 1 year; Number of helpdesk calls

McGarry et al (2002), Practical Software Measurement, p. 37


Click to edit Master title style Productivity Analysis Objectives

Improvement

Productivity Analysis Estimation Benchmarking

Managing Vendors

Productivity Analysis Measures

Primary Measures
• Size: instructions, functions, requirements
• Effort: hours, roles, phases

Adjustment Measures
• Quality: functional, structural, behavioral
• Demographics: application, project, organization

Software Productivity

Software Productivity =

Size of software produced

Total effort expended to produce it

Release Productivity =

Size of software developed, deleted, or modified

Total effort expended on the release

Software Size Measures

Instructions: Lines of Code
Most frequently used. Different definitions of a line can cause counts to vary by 10x. Smaller programs often accomplish the same functionality with higher-quality coding.

Requirements-based: Use Case Points, Story Points
Use Case Points have not become widely used and need more development. Story points are subjective to each team and are susceptible to several forms of bias.

Functions: Function Points
Popular in IT. Several counting schemes (IFPUG, NESMA, Mark II, COSMIC, etc.). Manual counting is expensive and subjective; certified counters can differ by 10%. Automated FPs are taking root.

How Many Lines of Code?

#define LOWER 0    /* lower limit of table */
#define UPPER 300  /* upper limit */
#define STEP 20    /* step size */

main()  /* print a Fahrenheit-Celsius conversion table */
{
    int fahr;
    for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
        printf("%4d %6.1f\n", fahr, (5.0/9.0)*(fahr-32));
}

[Histogram: number of votes (up to 12) for line counts ranging from 1 to 10 lines of code]

Park (1992)

Function Point Estimation

Ebert & Dumke (2007), Software Measurement, p. 188

[Scatter plot: Unadjusted Function Points (0-8,000) vs. EIs + EOs (0-1,000); fitted line y = 7.79x + 43.50, R² = .95]

[Diagram: functional view of software as a data store with entries, exits, reads, and writes]

Functional size can be estimated from external inputs and outputs

Upfront functional analysis provides a basis for good estimates

A repository of FP data provides a basis for calibrating estimates

39
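Using the fitted line above (coefficients as reconstructed from the slide, y = 7.79x + 43.50), an early size estimate follows directly from counts of external inputs and outputs. The helper below is a hypothetical sketch of that calibration, not an IFPUG-sanctioned counting procedure.

```python
def estimate_ufp(ei_plus_eo, slope=7.79, intercept=43.50):
    """Early size estimate: unadjusted function points from the count of
    external inputs plus external outputs, via the slide's fitted line."""
    return slope * ei_plus_eo + intercept

# An application spec with roughly 100 external inputs + outputs
est = estimate_ufp(100)
```

A local repository of completed projects would be used to refit `slope` and `intercept`, which is what "calibrating estimates" means here.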

Effort: Weakest Measure

Effort data is unreliable and inconsistent:

After-the-fact estimates: memory lapses, time-splicing, inconsistency

Under-reporting: contract issues, HR issues, impressions

Lack of normalization: roles included, phases included, hours in person-year

Click to edit Master title style How Quality Affects Productivity

Original productivity baseline

Incremental increases in technical debt

Continuing decrease in productivity

Unless you take action

Assumption Productivity is a stable number

Reality Productivity is unstable tending to decline

Click to edit Master title style Carry-forward Rework

Release N+2

Develop N+2 Rework N+2 Rework N+1 Rework N+0

Develop N

Release N

Rework N

Unfixed defects release N

Release N+1

Develop N+1 Rework N+1 Rework N+0

Unfixed defects release N

Unfixed defects release N+1

Example of Quality Impact

Project B (Better, Faster, Cheaper)
- 20 developers, 3 months
- $120k per FTE
- 4 FPs per staff month
- 240 FPs delivered: $2,500/FP cost
- 5 critical violations per FP, $500 per fix
- Cost for 1,200 fixes = $600k
- Total Cost to Own = $1,200k: $5,000/FP of TCO

Project A (Plodders)
- 20 developers, 3 months
- $120k per FTE
- 3 FPs per staff month
- 180 FPs delivered: $3,333/FP cost
- 2 critical violations per FP, $500 per fix
- Cost for 360 fixes = $180k
- Total Cost to Own = $780k: $4,333/FP of TCO

Project B is 25% more productive

However,

Project A is 13.4% more productive in total-cost-of-ownership terms
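The arithmetic behind the two projects can be checked directly. The sketch below simply restates the slide's numbers; the function and parameter names are ours.

```python
def tco_per_fp(devs, months, annual_cost, fp_per_staff_month,
               violations_per_fp, cost_per_fix):
    """Return (FPs delivered, development cost per FP, TCO per FP),
    where TCO adds the deferred cost of fixing critical violations."""
    staff_months = devs * months
    fps = fp_per_staff_month * staff_months
    dev_cost = staff_months * annual_cost / 12      # monthly loaded cost
    fix_cost = violations_per_fp * fps * cost_per_fix
    return fps, dev_cost / fps, (dev_cost + fix_cost) / fps

fp_b, cost_b, tco_b = tco_per_fp(20, 3, 120_000, 4, 5, 500)  # Project B
fp_a, cost_a, tco_a = tco_per_fp(20, 3, 120_000, 3, 2, 500)  # Project A
```

Project B looks 25% more productive on delivery cost ($2,500 vs. $3,333 per FP), but once deferred fix costs are counted, Project A's $4,333/FP beats B's $5,000/FP.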

Quality-Adjusted Productivity

[Diagram: Automated Enhancement Points measure the size of both functional and non-functional code segments; Automated Technical Debt captures corrective effort in future releases for defects injected in this release. Combined with effort & cost, these yield quality-adjusted productivity for estimation, benchmarks, value & ROI, etc.]

Must add future effort to fix bugs into productivity calculations

Best Practices for Analysis

1) Evaluate demographics

2) Segment baselines

3) Beware sampling effects

4) Understand variation

5) Investigate distributions

6) Account for maturity effects

7) Beware external data sets

1 Evaluate Demographics

[Chart: productivity vs. Function Points for routine reports, COBOL apps, web-based apps, custom ERP, and innovative mobile apps]

2 Segment Baselines

Segment                Year  Projects  Productivity
Total Corporate        1981     28        2342
                       1980     21        1939
Telecommunications     1981     14        1811
                       1980     12        1458
Engineering & Defense  1981      8        2965
                       1980      6        2739
Business Applications  1981      6        3054
                       1980      3        1813

Multiple baselines are usually the most valid

3 Beware Sampling Effects

[Chart: lines of code per person-year (0-2,500) across 1980-1982 baselines; a sample with lots of small projects vs. one with lots of large projects shifts the baseline]

4 Understand Variation

Product factors: timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size

Project factors: concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation

Variance explained: project factors 49%, product factors 16%, not explained 35%

R²(product) = .16, R²(project) = .49, R²(model) = .65

Vosburgh J, Curtis B, et al (1984), Proceedings of ICSE 1984

5 Investigate Distributions

[Chart: relative productivity (% of mean, -100 to 250) by software engineering practices usage: low, medium, high]

6 Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7 Caution for External Data Sets

26

[Chart: productivity vs. size, DACS data set vs. ours]

Questions to ask: data definitions, data collection, data validity, data age, demographics

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects

2) Iterative and Agile measurement

3) Team Software Process measurement

33

Measuring Project Progress

[Chart: planned vs. actual code growth (0K-300K), Jan-Jun, with planned completion of coding]
"The measures say we are on schedule."

[Chart: actual code growth continues past the planned completion of coding]
"But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (0-150): total, open, closed]
"And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements affected and person-hours of work or rework, by month]

Month  Added  Changed  Deleted
Feb      2      20       1
Mar      2      11       1
Apr     45      35       3
May     15      61       1
Jun      1      20       4
Jul     15      15       0
Aug     18      05       2

Phase Defect Removal Model

PHASE        Escapes from       Defect injection  Subtotal  Removal        Defect removal  Escapes at phase
             previous (/KSLOC)  (/KSLOC)          (/KSLOC)  effectiveness  (/KSLOC)        exit (/KSLOC)
Rqmts        0                  1.2               1.2        0%            0               1.2
Design       1.2                18.0              19.2      76%            14.6            4.6
Code & UT    4.6                15.4              20.0      71%            14.2            5.8
Integration  5.8                0                 5.8       67%            3.9             1.9
Sys Test     1.9                0                 1.9       58%            1.1             0.8
Operation    0.8

Kan (1995), Metrics and Models in Software Quality Engineering
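The table's mechanics are simple to state: each phase adds its injected defects to the escapes it inherits, then removes a fraction given by its removal effectiveness, and the remainder escapes to the next phase. A sketch using the table's injection and effectiveness figures:

```python
def phase_escapes(entering, injected, removal_effectiveness):
    """Defects escaping a phase: (escapes in + newly injected),
    reduced by the phase's removal effectiveness."""
    subtotal = entering + injected
    removed = subtotal * removal_effectiveness
    return subtotal - removed

# (injected defects/KSLOC, removal effectiveness) per phase, from the table:
# Rqmts, Design, Code & UT, Integration, Sys Test
phases = [(1.2, 0.00), (18.0, 0.76), (15.4, 0.71), (0.0, 0.67), (0.0, 0.58)]

escapes = 0.0
for injected, effectiveness in phases:
    escapes = phase_escapes(escapes, injected, effectiveness)
```

Chaining the five phases reproduces the table's bottom line of roughly 0.8 defects/KSLOC escaping into operation (small differences come from the table's per-phase rounding).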

SCRUM Board

The Scrumboard displays the path to completion for each story by showing task status

Story points are an estimate of days for each story

Agile estimating is by feature (story) rather than by task

'Done' does not always mean all tasks are finished

Kanban methods restrict the work in progress to a limited number of paths

Click to edit Master title style Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating
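Velocity, as described above, is just the sustainable rate at which story points are completed; a rolling average smooths sprint-to-sprint noise before it is used for forecasting. A minimal sketch with an invented sprint history:

```python
def velocity(points_per_sprint, window=3):
    """Rolling average of completed story points over the last `window`
    sprints: a sustainable-pace estimate for forecasting."""
    recent = points_per_sprint[-window:]
    return sum(recent) / len(recent)

completed = [21, 25, 18, 30, 26, 28]   # illustrative sprint history
v = velocity(completed)                # average of the last 3 sprints

def sprints_left(backlog_points):
    """Ceiling division: sprints needed to burn down the remaining backlog."""
    return -(-backlog_points // round(v))
```

As the slide warns, this is only meaningful within one team's context: story points are not comparable across teams, so the historical data must come from the same team whose future is being forecast.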

Analyzing Velocity

[Charts over days 1-20: Team Velocity (story points per hour, 0-5); hours per story point (0-5); item cycle times (0-20); burndown chart of stories, hours, and hrs/story point (0-60)]

Click to edit Master title style Measuring and Managing Flow

Released

In progress

Backlogged

Cumulative Flow Diagram Days to delivery

outliers

Anderson (2010) Kanban

Trends Over Sprints

[Charts: story points delivered per sprint (0-80) and technical debt per sprint (0-4), across sprints 1-12]

Track and correlate multiple measures across sprints

Predictive models

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring

Click to edit Master title style Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
- PSP 0: current process, time recording, defect recording, defect type standards
- PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning
- PSP 1: size estimating, test report
- PSP 1.1: task planning, schedule planning

Personal Quality Management
- PSP 2: code reviews, design reviews
- PSP 2.1: design templates

Cyclic Personal Process
- PSP 3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members

• Well-defined team roles

• Project launch workshops for planning

• Team measurement and tracking

• Post-mortems for improvement

• Team application of lean principles

Humphrey (2000) Introduction to the Team Software Process

Aggregated Personal Data Best

R²s range from .70 to .87

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals (R² = .53)

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

[Chart: defects found by activity - personal reviews 33, compile 33, team reviews 19, unit test 15, test 14]

Fagan (2005)

TSP Benefits

Dramatic reductions in:

- Delivered defects

- Days per KLOC

- Schedule deviation

- Test defects/KLOC

- Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods

2) Maturity-based measurement

3) Improvement program results

61

Click to edit Master title style CMMI Maturity Transitions

Level 5 Optimizing

Innovation management

Level 4 Quantitatively

Managed

Capability management

Level 3 Defined

Process management

Level 2 Managed

Project management

Level 1 Initial

Inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Click to edit Master title style Enhanced Maturity Framework

Level 5 Innovating

Innovation management

Level 4 Optimized

Capability management

Level 3 Standardized

Organizational management

Level 2 Stabilized

Work unit management

Level 1 Initial

Inconsistent management

Click to edit Master title style Maturity Level Transformation

Level 5 Opportunistic improvements

Empowered culture

Level 5 Proactive

improvements Agile culture

Organization

Projects

Individual

Level 4 Process and results

managed statistically Precision culture

Level 3 Organization develops

standard processes Common culture

Level 2 Project managers stabilize projects

Local culture

Level 1 Ad Hoc inconsistent

development practices Relationship culture

Trust

Click to edit Master title style Measurement by Level

Level 5 - Exploratory data: ∆t = N x ROI

Level 4 - Predictive data: X̄ ± (1.96σ / √n)

Level 3 - Process data: 8.3 defects per review

Level 2 - Management data: Effort = α(LOC)^β

Level 1 - Haphazard data: "about a million lines"

Measure-maturity mismatch slows improvement and creates

dysfunction

Average Improvement Results

Improvement benefit        # Orgs  Annual median  Annual range
Productivity growth           4        35%           9 - 67%
Pre-test defect detection     3        22%           6 - 25%
Time to market                2       ↓ 19%         15 - 23%
Field error reports           5       ↓ 39%         10 - 94%
Return on investment          5        5.0:1         4.1 - 8.8:1

Schedule Performance

[Chart: schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled (below 1.0 = behind, above = ahead), by CMM maturity level 1-3]

Air Force study of contractors

Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index = budgeted cost of work performed / actual cost of work performed (below 1.0 = overrun, above = underrun), by CMM maturity level 1-3]

Air Force study of contractors

Flowe & Thordahl (1994)

Crosby's Cost of Quality Model

Project Cost = Performance + Cost of Quality (conformance + nonconformance)

Performance: generating plans and documentation; development (requirements, design, code, integration)

Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits

Prevention: training; policies, procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks

Dion (1993); Crosby (1979), Quality Is Free

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year  Level  Perform  Nonconf  Appraise  Prevent
1988    1      34%      41%      15%       7%
1990    2      55%      18%      15%      12%
1992    3      66%      11%      (combined: 23%)
1994    4      76%       6%      7.5%  (combined: 18%)

Dion (1993), Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline (0-4x), 1988-1994]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline (0-1.2), 1988-1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: budgeted vs. actual cost by quarter, 1988-1994, on a -10% to 50% over/under-run scale]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0-3,000) and % of reuse (0-100%), 1987-1994]

Sakamoto et al (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

89

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality

2. Compliance focus undermines CMMI's primary assumption

3. Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI

4. The assumptions underlying control charts are violated by software data

Six Sigma's Challenge

Process focus -> process improvement -> Six Sigma
Product focus -> product improvement -> Design for Six Sigma

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first come the projects with huge benefits, then large benefits, then good benefits, then okay benefits, then small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program... so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded...testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic...non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability...The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

Jackson, D. (2009); Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality measurements (health factors): Robustness, Performance, Security, Changeability, Transferability.

Detected violations (examples):
- Performance: expensive operation in loop, static vs. pooled connections, complex query on big table, large indices on big table
- Robustness: empty CATCH block, uncontrolled data access, poor memory management, opened resource not closed
- Security: SQL injection, cross-site scripting, buffer overflow, uncontrolled format string
- Transferability: unstructured code, misuse of inheritance, lack of comments, violated naming convention
- Changeability: highly coupled component, duplicated code, index modified in loop, high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, Netweaver, Tibco, Business Objects, plus a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

Modern applications span user-interface, business-logic, and data-storage technologies (e.g., JSP, ASP.NET, APIs, Java, .NET, Web Services, COBOL, EJB, Hibernate, Spring, Struts, PL/SQL, T-SQL, Oracle, SQL Server, DB2, Sybase, IMS, messaging), so quality must be measured at three levels:

Level 1 - Unit (developer level): a single language/technology layer; code style & layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2 - Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3 - System (IT organization level): architectural compliance, data flow, transaction risk; integration quality, risk propagation, application security, resiliency checks, transaction integrity; function points, effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

- Primary cause of operational problems.
- Take 20x as many fixes to correct.
- Architecturally complex defects are only 8% of total application defects (code unit-level violations are the other 92%), yet they consume 52% of total repair effort (vs. 48% for unit-level defects).
- Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation #342 in Component A is only narrowly propagated in the system; it presents low risk. The impact of violation #284 in Component B is very widely propagated throughout the system; it presents very high risk. Remediating violation #284 in Component B will therefore have the greater impact on reducing risk and cost.

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

- Customers care most about the dependability of their transactions.
- The risk of a transaction is determined by the number and severity of violations along its path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.
- Transactions can be ranked by risk to prioritize their remediation.
- The Transaction Risk Index can be further tuned using the operational frequency of each transaction.

Risk-Based Prioritization

Violations are ranked by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
#148        Security          9
#371        Robustness        9
#062        Performance       8
#284        Robustness        7
#016        Changeability     7
#241        Security          5
#173        Transferability   4
#359        Transferability   3
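The ranking idea can be made concrete with a small sketch. This is an illustrative scoring scheme, not CAST's actual algorithm: severity is weighted by how widely a violation's impact propagates, and the field names and sample values are invented.

```python
# Illustrative sketch of risk-based prioritization (not CAST's actual scoring):
# rank violations by a composite of severity and breadth of propagation.
from dataclasses import dataclass

@dataclass
class Violation:
    vid: str
    health_factor: str
    severity: int             # e.g., a 1-9 severity scale
    impacted_components: int  # breadth of propagation in the system

def propagated_risk_index(v: Violation) -> int:
    # Severity alone is not enough: a widely propagated medium-severity
    # flaw can outrank a contained high-severity one.
    return v.severity * v.impacted_components

violations = [
    Violation("148", "Security", 9, 12),
    Violation("342", "Robustness", 7, 2),    # narrowly propagated -> low risk
    Violation("284", "Robustness", 7, 40),   # widely propagated -> high risk
]

ranked = sorted(violations, key=propagated_risk_index, reverse=True)
```

Under this scheme violation #284 rises to the top of the remediation queue even though #148 has a higher raw severity.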

Static Analysis in the US DoD

- Testing finds less than 1/3 of structural defects.
- Static analysis before testing is 85% efficient.
- With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents vs. Technical Quality Index for business-critical applications at a large international investment bank. Annotated annual losses range from over $2,000,000 for the lowest-quality applications down to $50,000 and $5,000 as quality scores rise.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor                  # apps   Robust   Security   Perform   Change   Transfer
Technology (Java-EE)     1606      20        3          16        26        7
Industry                  677       5        5           7         4        6
App type                  510       1        1           1         1        1
Source                    580       1        1           4         -        -
Shore                     540       -        -           -         -        -
Maturity                   72      28       25          15        24       12
Method                    208      10        6          14         6        -
Team size                 186       7        5           8         8        -
# of users                262       4        3           5         4        -

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code carry:

- Principal: the cost of fixing the problems remaining in the code after release that must be remediated.
- Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
- Liability: business costs related to outages, breaches, corrupted data, etc.
- Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
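The principal can be estimated from violation counts, in the spirit of Curtis, Sappidi & Szynkarski (2012). The following is a simplified sketch: the triage percentages, hours per fix, and labor rate below are assumed illustrative parameters, not figures from the paper.

```python
# Simplified sketch of estimating technical-debt principal.
# All parameter values are illustrative assumptions.
def technical_debt_principal(violations_by_severity, pct_to_fix,
                             hours_per_fix, cost_per_hour):
    """Principal = sum over severity bands of
       (# violations * fraction that will be fixed * hours per fix * $/hour)."""
    return sum(
        count * pct_to_fix[sev] * hours_per_fix[sev] * cost_per_hour
        for sev, count in violations_by_severity.items()
    )

principal = technical_debt_principal(
    violations_by_severity={"high": 100, "medium": 500, "low": 2000},
    pct_to_fix={"high": 0.50, "medium": 0.25, "low": 0.10},  # assumed triage policy
    hours_per_fix={"high": 2.0, "medium": 1.0, "low": 0.5},  # assumed effort
    cost_per_hour=75.0,
)
# 100*0.5*2*75 + 500*0.25*1*75 + 2000*0.1*0.5*75 = 24,375
```

Varying the triage policy (which fraction of each severity band is actually remediated) is the main lever that moves the estimate.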

Tech Debt by Quality Factor

Transferability 40%, Changeability 30%, Robustness 18%, Security 7% (the remaining ~5% is Performance).

- 70% of technical debt is in IT cost (Transferability + Changeability).
- 30% of technical debt is in business risk (Robustness, Performance, Security).
- Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.
Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.
Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.
Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.
Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.
Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.
Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.
Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.
Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.
Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.
Dyne, K. (1999). ESSI: Benefits from Continuous Software Improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.
Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.
Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.
Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.
George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.
Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.
Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.
Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).
Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.
Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.
Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.
Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.
Peterman, W. (2000). IEEE Software, July/August issue.
Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.
Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.
Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.
Spinellis, D. (2006). Code Quality. Addison-Wesley.
Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.



Section 1: Measuring Software Size and Productivity

1) Practical Software Measurement
2) Size and functional measures
3) Measuring software productivity
4) Analyzing software productivity

Why Does Measurement Languish?

- Unreliable data
- Too many measures
- Poorly matched to process maturity
- Badly defined measures
- Use in personal appraisals

Measurement Process Guidance

- Based on decades of software measurement best practice
- Best guidance for starting a measurement program
- Compatible with ISO 15939, Software Measurement Process
- Free guides; licensed training and consulting available
- http://psmsc.com

McGarry et al. (2002), Practical Software Measurement

Measurement Categories

Category                Issue                    Example measure
Schedule & Progress     Milestone completion     Actual vs. planned dates
                        Earned value             Actual vs. planned completions
Resources & Cost        Personnel effort         Person-hours
                        Facilities used          Test equipment & hours
Product Size            Size of coded product    Function Points
                        Size of documentation    Pages of manuals
Product Quality         Functional quality       Defects per KLOC
                        Structural quality       Violations of reliability rules
Process Performance     Efficiency               Defect detection efficiency
                        Compliance               Audit results
Product Value           Return on investment     ∆ Increase in target revenue
                        Risk avoidance           Cost of system outage
Customer Satisfaction   Account growth           Repeat business within 1 year
                        Self-sufficiency         Number of helpdesk calls

McGarry et al. (2002), Practical Software Measurement, p. 37

Productivity Analysis Objectives

Productivity analysis serves four objectives: improvement, estimation, benchmarking, and managing vendors.

Productivity Analysis Measures

Primary measures:
- Size: instructions, functions, requirements
- Effort: hours, roles, phases

Adjustment measures:
- Quality: functional, structural, behavioral
- Demographics: application, project, organization

Software Productivity

Release Productivity = Size of software developed, deleted, or modified / Total effort expended on the release

Software Productivity = Size of software produced / Total effort expended to produce it
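The two ratios can be sketched directly. Units are assumed for illustration (function points for size, person-months for effort):

```python
def software_productivity(size_produced: float, total_effort: float) -> float:
    """Size per unit of effort, e.g., function points per person-month."""
    return size_produced / total_effort

def release_productivity(developed: float, deleted: float, modified: float,
                         release_effort: float) -> float:
    """Counts all touched size (developed, deleted, and modified) against
    the total effort expended on the release."""
    return (developed + deleted + modified) / release_effort

assert software_productivity(240, 60) == 4.0   # 4 FP per staff-month
```

Release productivity credits deletion and modification work that a plain produced-size ratio would miss.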

Software Size Measures

Instructions (Lines of Code): Most frequently used. Different definitions of a line can cause counts to vary by 10x. Smaller programs often accomplish the same functionality with higher-quality coding.

Requirements-based (Use Case Points, Story Points): Use Case Points have not become widely used and need more development. Story points are subjective to each team and are susceptible to several forms of bias.

Functions (Function Points): Popular in IT. Several counting schemes (IFPUG, NESMA, Mark II, COSMIC, etc.). Manual counting is expensive and subjective - certified counters can differ by 10%. Automated FPs are taking root.

How Many Lines of Code?

    #define LOWER 0    /* lower limit of table */
    #define UPPER 300  /* upper limit */
    #define STEP  20   /* step size */

    main()  /* print a Fahrenheit-Celsius conversion table */
    {
        int fahr;
        for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
            printf("%4d %6.1f\n", fahr, (5.0/9.0)*(fahr-32));
    }

[Chart: histogram of votes on how many lines of code this program contains; answers ranged from 1 to 10 or more.]

B. Park (1992)
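The disagreement in the votes comes from the counting rule, not the program. A small sketch makes this concrete: two defensible rules, applied to a sample C fragment, give different answers (the rules and the fragment are illustrative, not a standard counting definition):

```python
# Two defensible line-counting rules give different sizes for the same code,
# which is why LOC definitions must be standardized before comparing counts.
def count_physical_lines(source: str) -> int:
    """Non-blank physical lines, braces included."""
    return len([ln for ln in source.splitlines() if ln.strip()])

def count_statements(source: str) -> int:
    """Crude logical-line proxy: semicolon-terminated statements
    plus preprocessor directives."""
    lines = [ln.strip() for ln in source.splitlines()]
    return sum(1 for ln in lines if ln.endswith(";") or ln.startswith("#"))

source = """#define STEP 20
main()
{
    int fahr;
    printf("%4d\\n", fahr);
}
"""
# count_physical_lines(source) -> 6, count_statements(source) -> 3
```

A 2x spread from rule choice alone is mild; the slide notes that in practice definitions can move counts by 10x.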

Function Point Estimation

Functional view of software: a data store plus software with entries, exits, reads, and writes. Functional size can be estimated from external inputs and outputs:

Unadjusted Function Points = 7.79 x (EIs + EOs) + 43.5,  R² = .95

- Upfront functional analysis provides the basis for good estimates.
- A repository of FP data provides the basis for calibrating estimates.

Ebert & Dumke (2007), Software Measurement, p. 188
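Used as an early sizing tool, the regression turns a quick count of external inputs and outputs into an unadjusted function point estimate. A sketch, using the coefficients as reconstructed above (an organization should recalibrate them against its own FP repository):

```python
# Early sizing sketch: estimate unadjusted function points (UFP) from counts
# of external inputs (EI) and external outputs (EO), using the regression
# reported by Ebert & Dumke (2007): UFP = 7.79*(EI + EO) + 43.5, R^2 = .95.
def estimate_ufp(external_inputs: int, external_outputs: int) -> float:
    return 7.79 * (external_inputs + external_outputs) + 43.5

estimate = estimate_ufp(60, 40)   # 7.79 * 100 + 43.5 = 822.5 UFP
```

The high R² in the source data is what justifies estimating from EI/EO counts before a full function point analysis is possible.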

Effort: the Weakest Measure

Effort data is unreliable and inconsistent:

- After-the-fact estimates: memory lapses, time-splicing, inconsistency
- Under-reporting: contract issues, HR issues, impressions
- Lack of normalization: roles included, phases included, hours in a person-year

How Quality Affects Productivity

Assumption: productivity is a stable number. Reality: productivity is unstable, tending to decline. Incremental increases in technical debt cause a continuing decrease in productivity from the original baseline - unless you take action.

Carry-forward Rework

Release N: develop N + rework N, leaving unfixed defects from release N.
Release N+1: develop N+1 + rework N+1 + rework N+0, leaving unfixed defects from releases N and N+1.
Release N+2: develop N+2 + rework N+2 + rework N+1 + rework N+0.

Rework carried forward from earlier releases consumes a growing share of each release's effort.
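The squeeze on new development can be sketched numerically. The capacity and per-release rework figures below are invented for illustration:

```python
# Sketch: carry-forward rework. Each release spends effort on new development
# plus rework for the unfixed defects of every earlier release, so the share
# of capacity left for new work shrinks release by release.
capacity = 100.0            # effort units available per release (assumed)
rework_per_release = 12.0   # ongoing rework each past release adds (assumed)

new_dev = []
for release in range(4):    # releases N .. N+3
    carried_rework = rework_per_release * release
    new_dev.append(capacity - carried_rework)

# new_dev == [100.0, 88.0, 76.0, 64.0]: new development shrinks each release
```

This is the mechanism behind the previous slide's claim that productivity tends to decline as technical debt accumulates.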

Example of Quality Impact

Project A (Plodders):
- 20 developers, 3 months; $120k per FTE
- 3 FPs per staff-month; 180 FPs delivered = $3,333/FP cost
- 2 critical violations per FP at $500 per fix = $180k for 360 fixes
- Total Cost to Own = $780k, or $4,333/FP of TCO

Project B (Better, Faster, Cheaper):
- 20 developers, 3 months; $120k per FTE
- 4 FPs per staff-month; 240 FPs delivered = $2,500/FP cost
- 5 critical violations per FP at $500 per fix = $600k for 1,200 fixes
- Total Cost to Own = $1,200k, or $5,000/FP of TCO

On delivery cost, Project B is 25% more productive. However, on total cost of ownership, Project A is 13.4% more productive.
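The arithmetic behind the example can be reproduced in a few lines (the function and parameter names are mine; the figures are the slide's):

```python
# Reproduces the worked example: total cost to own (TCO) per function point
# once post-release fixes are added to development cost.
def tco_per_fp(devs, months, annual_cost_per_fte, fp_per_staff_month,
               critical_violations_per_fp, cost_per_fix):
    staff_months = devs * months
    dev_cost = staff_months * annual_cost_per_fte / 12
    fps = staff_months * fp_per_staff_month
    fix_cost = fps * critical_violations_per_fp * cost_per_fix
    return (dev_cost + fix_cost) / fps

plodders = tco_per_fp(20, 3, 120_000, 3, 2, 500)   # ~$4,333 per FP
speeders = tco_per_fp(20, 3, 120_000, 4, 5, 500)   # $5,000 per FP
```

Swapping the violation densities shows how quickly the "faster" team's cost advantage inverts once rework is counted.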

Quality-Adjusted Productivity

Corrective effort in future releases for defects injected in this release must be added into productivity calculations. Inputs:

- Automated Enhancement Points: size of both functional and non-functional code segments
- Automated Technical Debt: corrective effort in future releases for defects injected in this release
- Effort & cost

Quality-adjusted productivity then feeds estimation, benchmarks, value & ROI analysis, etc.
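The adjustment itself is simple: fold the projected future rework into the effort denominator. A minimal sketch, with assumed units (function points and person-months):

```python
# Sketch of quality-adjusted productivity: charge this release for the
# future corrective effort its injected defects will cause.
def quality_adjusted_productivity(size_fp: float, release_effort_pm: float,
                                  future_rework_pm: float) -> float:
    """Size delivered per unit of total effort, where total effort includes
    the rework this release's defects will cause in future releases."""
    return size_fp / (release_effort_pm + future_rework_pm)

raw = 240 / 60                                          # 4.0 FP per person-month
adjusted = quality_adjusted_productivity(240, 60, 20)   # 3.0 FP per person-month
```

The gap between the raw and adjusted figures is exactly the productivity that technical debt silently reclaims.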

Best Practices for Analysis

1) Evaluate demographics
2) Segment baselines
3) Beware sampling effects
4) Understand variation
5) Investigate distributions
6) Account for maturity effects
7) Beware external data sets

1. Evaluate Demographics

[Chart: productivity in Function Points varies widely across application types - routine reports, COBOL apps, web-based apps, custom ERP, innovative mobile - so compare productivity only within comparable demographics.]

2. Segment Baselines

Segment                  Year   Projects   Productivity
Total Corporate          1981   28         2342
                         1980   21         1939
Telecommunications       1981   14         1811
                         1980   12         1458
Engineering & Defense    1981   8          2965
                         1980   6          2739
Business Applications    1981   6          3054
                         1980   3          1813

Multiple baselines are usually the most valid.

3. Beware Sampling Effects

[Chart: lines of code per person-year for 1980-1982 baselines, annotated "lots of small projects" vs. "lots of large projects" - the project-size mix in each sample shifts the apparent baseline.]

4. Understand Variation

Project factors explain 49% of the variation in productivity, product factors 16%, and 35% is not explained (R²model = .65).

- Product factors: timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size
- Project factors: concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation

J. Vosburgh, B. Curtis et al. (1984), Proceedings of ICSE 1984

5. Investigate Distributions

[Chart: relative productivity (% of mean, roughly -100% to +250%) at low, medium, and high usage of software engineering practices. The spread within each group is large, so examine distributions, not just means.]

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution for External Data Sets

[Chart: productivity vs. size for an external data set (DACS) and our own data diverge.] Before benchmarking against external data, check data definitions, data collection, data validity, data age, and demographics.

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Measuring Project Progress

[Chart 1: planned rate of code growth vs. actual code growth (0-300K) through the planned completion of coding, Jan-Jun.] "The measures say we are on schedule."

[Chart 2: the same curves, but actual code keeps growing past the planned completion.] "But the code keeps growing. We should have measured the requirements growth."

[Chart 3: defect reports (total, open, closed) over the same period.] "And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

Number of requirements affected, and the person-hours of work or rework they caused, by month:

Month   Added   Changed   Deleted
Feb     2       20        1
Mar     2       11        1
Apr     45      35        3
May     15      61        1
Jun     1       20        4
Jul     15      15        0
Aug     18      05        2

Phase Defect Removal Model (defects per KSLOC)

Phase        Escapes from     Defect      Subtotal   Removal          Defect    Escapes at
             previous phase   injection              effectiveness    removal   phase exit
Rqmts        0                1.2         1.2        0%               0         1.2
Design       1.2              18.0        19.2       76%              14.6      4.6
Code & UT    4.6              15.4        20.0       71%              14.2      5.8
Integration  5.8              0           5.8        67%              3.9       1.9
Sys Test     1.9              0           1.9        58%              1.1       0.8
Operation    0.8

Kan (1995), Metrics and Models in Software Quality Engineering
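The table's arithmetic follows a simple pipeline: each phase adds its injected defects to the escapes it inherits, removes a fraction of the subtotal (its removal effectiveness), and passes the rest downstream. A sketch using the table's rates:

```python
# Phase defect-removal arithmetic. Rates are defects per KSLOC, taken from
# the table above; each phase removes a fraction of (inherited escapes +
# newly injected defects) and passes the remainder to the next phase.
phases = [            # (name, injected, removal_effectiveness)
    ("Rqmts",       1.2,  0.00),
    ("Design",      18.0, 0.76),
    ("Code & UT",   15.4, 0.71),
    ("Integration", 0.0,  0.67),
    ("Sys Test",    0.0,  0.58),
]

escapes = 0.0
for name, injected, effectiveness in phases:
    subtotal = escapes + injected
    removed = subtotal * effectiveness
    escapes = subtotal - removed   # escapes at phase exit

# escapes now approximates the 0.8 defects/KSLOC that reach Operation
```

Raising design or code review effectiveness in this model cuts the defects reaching operation far more cheaply than adding test phases at the end.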

SCRUM Board

- The scrum board displays the path to completion for each story by showing task status.
- Story points are an estimate of days for each story.
- Agile estimating is by feature (story) rather than by task.
- 'Done' does not always mean all tasks are finished.
- Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

- The burndown chart displays story completion by day, indicating actual versus estimated progress.
- Velocity is the rate at which stories are completed; it indicates sustainable progress.
- Velocity results provide one source of historical data for improving estimates.
- Story points without the context of productivity factors are dangerous for estimating.

Analyzing Velocity

[Charts over a 20-day iteration: the burndown of remaining story points; team velocity (story points per hour); hours per story point; and item cycle times for stories and hours.]
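The two basic velocity statistics behind those charts are easy to compute from sprint history. The sprint records below are invented sample data:

```python
# Sketch: velocity and hours-per-story-point from sprint history.
def velocity(points_completed: list[float]) -> float:
    """Average story points completed per sprint."""
    return sum(points_completed) / len(points_completed)

def hours_per_story_point(hours_spent: float, points_completed: float) -> float:
    """Effort intensity: how many hours each story point is costing."""
    return hours_spent / points_completed

past_sprints = [42, 38, 45, 40, 41]    # story points per sprint (sample data)
forecast = velocity(past_sprints)       # ~41.2 points per sprint
```

As the slide cautions, such a forecast only transfers to future work if the team, technology, and story-point calibration stay the same.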

Measuring and Managing Flow

[Cumulative Flow Diagram: bands for backlogged, in-progress, and released work over time; the horizontal gap between bands shows days to delivery, with outliers visible.]

Anderson (2010), Kanban

Trends Over Sprints

[Charts: story points delivered per sprint and technical debt per sprint across 12 sprints.]

Track and correlate multiple measures across sprints to build predictive models.

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects.

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring.

Click to edit Master title style Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
- PSP 0: current process, time recording, defect recording, defect type standards
- PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning
- PSP 1: size estimating, test report
- PSP 1.1: task planning, schedule planning

Personal Quality Management
- PSP 2: code reviews, design reviews
- PSP 2.1: design templates

Cyclic Personal Process
- PSP 3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

- Built from the personal processes of team members
- Well-defined team roles
- Project launch workshops for planning
- Team measurement and tracking
- Post-mortems for improvement
- Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Is Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates (R²s range from .70 to .87), rather than developing a single team estimate averaged over individuals (R² = .53).

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%. Consequently, test changes from defect detection into correctness verification.

Fagan (2005)

TSP Benefits

Dramatic reductions in:
- Delivered defects
- Days per KLOC
- Schedule deviation
- Test defects/KLOC
- Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 - Optimizing: innovation management
Level 4 - Quantitatively Managed: capability management
Level 3 - Defined: process management
Level 2 - Managed: project management
Level 1 - Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5 - Innovating: innovation management
Level 4 - Optimized: capability management
Level 3 - Standardized: organizational management
Level 2 - Stabilized: work unit management
Level 1 - Initial: inconsistent management

Maturity Level Transformation

Level 5: proactive, opportunistic improvements across organization, projects, and individuals - an empowered, agile culture
Level 4: process and results managed statistically - precision culture
Level 3: organization develops standard processes - common culture
Level 2: project managers stabilize projects - local culture
Level 1: ad hoc, inconsistent development practices - relationship culture, built on trust

Measurement by Level

Level 5 - Exploratory data: ∆t = N x ROI
Level 4 - Predictive data: mean ± 1.96σ/√n
Level 3 - Process data: 8.3 defects per review
Level 2 - Management data: Effort = α·LOC^β
Level 1 - Haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.

Average Improvement Results

Improvement benefit         # Orgs   Annual median   Annual range
Productivity growth            4         35%             9-67%
Pre-test defect detection      3         22%             6-25%
Time to market                 2        ↓19%            15-23%
Field error reports            5        ↓39%            10-94%
Return on investment           5        5.0:1        4.1:1 - 8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed / budgeted cost of work scheduled; below 1 = behind, above 1 = ahead) by CMM maturity levels 1-3 in an Air Force study of contractors.]

Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (budgeted cost of work performed / actual cost of work performed; below 1 = overrun, above 1 = underrun) by CMM maturity levels 1-3 in an Air Force study of contractors.]

Flowe & Thordahl (1994)

Crosby's Cost of Quality Model

Project Cost = Performance + Cost of Quality

Performance: generating plans and documentation; development (requirements, design, code, integration)

Cost of Quality = Conformance + Nonconformance

Conformance
• Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits
• Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks

Dion (1993); Crosby (1979), Quality Is Free
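A minimal cost-of-quality rollup in the spirit of Crosby's model; the category costs are invented for illustration:

```python
# Crosby-style cost-of-quality rollup. Category costs (in hours) are
# hypothetical, loosely shaped like the Raytheon 1988 percentages.
costs = {
    "performance":    3400,  # building it right the first time
    "prevention":      700,  # training, methods, tools, root-cause analysis
    "appraisal":      1500,  # reviews, walkthroughs, first-time testing
    "nonconformance": 4100,  # rework: fixing defects, re-reviews, re-tests
}
total = sum(costs.values())
conformance = costs["prevention"] + costs["appraisal"]
cost_of_quality = conformance + costs["nonconformance"]
print(f"cost of quality = {100 * cost_of_quality / total:.0f}% of project cost")
```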

Raytheon's Cost of Quality

• Performance – cost of building it right the first time
• Nonconformance – cost of rework
• Appraisal – cost of testing
• Prevention – cost of preventing nonconformance

Percent of project cost:

Year   Level   Perform   Nonconf   Appraise   Prevent
1988     1       34        41         15         7
1990     2       55        18         15        12
1992     3       66        11          –          –
1994     4       76         6         7.5         –

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988-1994; y-axis 0-4x]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: product cost reduction relative to the 1989 baseline, 1988-1994; y-axis 0-1.2]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: quarterly budgeted vs. actual cost, 1988-1994; y-axis -10% to 50%; cost over-runs shrink toward zero as actuals converge on budget]

Haley (1995)

OMRON's Reuse Gains

[Chart: 1987-1994; program size in steps (0-3,000) and % of reuse (0-100%); the reuse percentage climbs steadily]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it."
– Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in the CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, large benefits, good benefits, okay benefits, small benefits... now what?

Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

Jackson, D. (2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality measurements: Transferability, Changeability, Robustness, Performance, Security

Detected violations (examples):
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/NetWeaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: a multi-technology application – JSP, ASP.NET, APIs, web services, Java, .NET, Struts, Spring, Hibernate, EJB, PL/SQL, T-SQL, COBOL, messaging; databases include Oracle, SQL Server, DB2, Sybase, and IMS; analysis spans architectural compliance, data flow, and transaction risk]

Level 1 – Unit (developer level): code style & layout; expression complexity; code documentation; class or program design; basic coding standards. Single language/technology layer.

Level 2 – Technology (development team level): intra-technology architecture; intra-layer dependencies; inter-program invocation; security vulnerabilities.

Level 3 – System (IT organization level): integration quality; architectural compliance; risk propagation; application security; resiliency checks; transaction integrity; function points; effort estimation; data access control; SDK versioning; calibration across technologies.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Take 20x as many fixes to correct

Architecturally complex defects are 8% of total application defects (code unit-level violations are the other 92%), yet they account for 52% of total repair effort (vs. 48% for all the rest).

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 342 in Component A is only narrowly propagated in the system – it presents low risk. The impact of violation 284 in Component B is very widely propagated throughout the system – it presents very high risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

• Customers care most about the dependability of their transactions.
• The risk of a transaction is determined by the number and severity of violations along its path.
• Transactions can be ranked by risk to prioritize their remediation.
• The Transaction Risk Index can be further tuned using the operational frequency of each transaction.

[Diagram: five transactions traced from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer; transaction 1 poses the greatest risk]
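The ranking idea can be sketched under an assumed scoring rule (sum of violation severities along the path, weighted by operational frequency); this is an illustration, not CAST's actual formula:

```python
# Sketch of a Transaction Risk Index under an assumed scoring rule.
def transaction_risk(severities, frequency=1.0):
    """Sum of violation severities on the path, weighted by run frequency."""
    return frequency * sum(severities)

transactions = {  # (violation severities on path, operational frequency)
    "t1": ([9, 9, 8, 7], 0.50),
    "t2": ([5, 4], 0.30),
    "t3": ([7, 3], 0.20),
}
ranked = sorted(transactions,
                key=lambda t: transaction_risk(*transactions[t]),
                reverse=True)
print(ranked)
```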

Risk-Based Prioritization

Violations are prioritized by severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health factor     Severity
  148       Security             9
  371       Robustness           9
  062       Performance          8
  284       Robustness           7
  016       Changeability        7
  241       Security             5
  173       Transferability      4
  359       Transferability      3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0-180) vs. Technical Quality Index (1.00-4.00) for business-critical applications at a large international investment bank; low-TQI applications show annual losses above $2,000,000, mid-range about $50,000, and high-TQI about $5,000]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores (Robustness, Security, Performance, Changeability, Transferability) accounted for by various factors:

Technology (1,606 apps, e.g. Java EE): 20, 3, 16, 26, 7
Industry (677 apps): 5, 5, 7, 4, 6
App type (510 apps): 1, 1, 1, 1, 1
Source (580 apps): 1, 1, 4
Shore (540 apps): –
Maturity (72 apps): 28, 25, 15, 24, 12
Method (208 apps): 10, 6, 14, 6
Team size (186 apps): 7, 5, 8, 8
Number of users (262 apps): 4, 3, 5, 4

2017 CRASH Report – request from info@castsoftware.com

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release, a component of the cost of ownership. It arises from structural quality problems in production code.

Technical Debt
• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business Risk
• Liability from debt: business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
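The principal can be approximated in the spirit of Curtis et al. (2012): violations remaining at release, times the share you intend to fix, times the effort and labor rate to fix them. Every input below is an assumption for illustration:

```python
# Technical-debt principal sketch; all inputs are illustrative assumptions.
def debt_principal(violations, share_to_fix, hours_per_fix, rate_per_hour):
    return violations * share_to_fix * hours_per_fix * rate_per_hour

high = debt_principal(500, 0.50, 2.5, 70)     # high-severity violations
medium = debt_principal(2000, 0.25, 1.0, 70)  # medium-severity violations
print(f"principal = ${high + medium:,.0f}")
```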

Tech Debt by Quality Factor

Transferability: 40%
Changeability: 30%
Robustness: 18%
Security: 7%

70% of technical debt is in IT cost (Transferability, Changeability).
30% of technical debt is in business risk (Robustness, Performance, Security).
Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.
Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.
Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.
Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.
Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.
Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.
Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.
Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.
Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.
Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.
Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.
Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.
Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.
Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.
George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.
Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.
Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.
Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).
Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.
Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.
Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.
Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.
Peterman, W. (2000). IEEE Software, July/August issue.
Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.
Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.
Sinha, M. (2006). Performance models – enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Prentice-Hall.
Spinellis, D. (2006). Code Quality. Addison-Wesley.
Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

• Best Practices for Software Process and Product Measurement
• Instructor: Dr. Bill Curtis
• Agenda
• Section 1
• Why Does Measurement Languish?
• Measurement Process Guidance
• Measurement Categories
• Productivity Analysis Objectives
• Productivity Analysis Measures
• Software Productivity
• Software Size Measures
• How Many Lines of Code?
• Function Point Estimation
• Effort – Weakest Measure
• How Quality Affects Productivity
• Carry-forward Rework
• Example of Quality Impact
• Quality-Adjusted Productivity
• Best Practices for Analysis
• 1. Evaluate Demographics
• 2. Segment Baselines
• 3. Beware Sampling Effects
• 4. Understand Variation
• 5. Investigate Distributions
• 6. Account for Maturity Effects
• 7. Caution for External Data Sets
• Measuring Project Progress
• Requirements Change Impact
• Phase Defect Removal Model
• SCRUM Board
• Burndown Chart and Velocity
• Analyzing Velocity
• Measuring and Managing Flow
• Trends Over Sprints
• Sources and Measures
• Recommended Books
• Personal Software Process (PSP)
• Team Software Process (TSP)
• Aggregated Personal Data Best
• Defect Detection at Intuit
• TSP Benefits
• Section 3
• CMMI Maturity Transitions
• Enhanced Maturity Framework
• Maturity Level Transformation
• Measurement by Level
• Average Improvement Results
• Schedule Performance
• Cost Performance
• Crosby's Cost of Quality Model
• Raytheon's Cost of Quality
• Raytheon's Productivity
• Raytheon's Cost Reduction
• Raytheon's Cost Predictability
• OMRON's Reuse Gains
• Section 4
• What about the Product?
• Six Sigma's Challenge
• Testing is Not Enough
• CAST's Application Intelligence Platform
• Modern Apps are a Technology Stack
• Architecturally Complex Defects
• Propagated Risk
• Transaction Risk
• Risk-Based Prioritization
• Static Analysis in US DoD
• Reducing Operational Incidents & Costs
• Reducing Operational Losses
• The Technical Debt Metaphor
• Tech Debt by Quality Factor
• Join CISQ: www.it-cisq.org
• Section
• References 1
• References 2
• References 3


Why Does Measurement Languish?

• Unreliable data
• Too many measures
• Poorly matched to process maturity
• Badly defined measures
• Use in personal appraisals

Measurement Process Guidance

• Based on decades of software measurement best practice
• Best guidance for starting a measurement program
• Compatible with ISO 15939 – Software Measurement Process
• Free guides; licensed training and consulting available
• http://psmsc.com

McGarry et al. (2002), Practical Software Measurement

Measurement Categories

Category – Issue: example measure

• Schedule & Progress – Milestone completion: actual vs. planned dates; Earned value: actual vs. planned completions
• Resources & Cost – Personnel effort: person-hours; Facilities used: test equipment & hours
• Product Size – Size of coded product: function points; Size of documentation: pages of manuals
• Product Quality – Functional quality: defects per KLOC; Structural quality: violations of reliability rules
• Process Performance – Efficiency: defect detection efficiency; Compliance: audit results
• Product Value – Return on investment: Δ increase in target revenue; Risk avoidance: cost of system outage
• Customer Satisfaction – Account growth: repeat business within 1 year; Self-sufficiency: number of help-desk calls

McGarry et al. (2002), Practical Software Measurement, p. 37


Productivity Analysis Objectives

• Improvement
• Estimation
• Benchmarking
• Managing vendors

Productivity Analysis Measures

Primary measures:
• Size – instructions, functions, requirements
• Effort – hours, roles, phases

Adjustment measures:
• Quality – functional, structural, behavioral
• Demographics – application, project, organization

Software Productivity

Software Productivity = Size of software produced / Total effort expended to produce it

Release Productivity = Size of software developed, deleted, or modified / Total effort expended on the release

Software Size Measures

Instructions – Lines of Code: Most frequently used. Different definitions of a line can cause counts to vary by 10x. Smaller programs often accomplish the same functionality with higher-quality coding.

Requirements-based – Use Case Points, Story Points: Use Case Points have not become widely used and need more development. Story points are subjective to each team and are susceptible to several forms of bias.

Functions – Function Points: Popular in IT. Several counting schemes (IFPUG, NESMA, Mark II, COSMIC, etc.). Manual counting is expensive and subjective – certified counters can differ by 10%. Automated FPs are taking root.

How Many Lines of Code?

    #include <stdio.h>

    #define LOWER 0     /* lower limit of table */
    #define UPPER 300   /* upper limit */
    #define STEP  20    /* step size */

    main()  /* print a Fahrenheit-Celsius conversion table */
    {
        int fahr;
        for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
            printf("%4d %6.1f\n", fahr, (5.0/9.0) * (fahr - 32));
    }

[Histogram: asked how many lines of code this program contains, respondents' answers ranged from 1 to 10 lines, with no single answer dominating]

Park (1992)

Function Point Estimation

[Scatter plot: unadjusted function points (0-8,000) vs. EIs + EOs (0-1,000); regression R² = .95, y = 7.79x + 43.50]

Functional view of software: entries and exits cross the software boundary, with reads and writes against a data store.

• Functional size can be estimated from external inputs and outputs.
• Upfront functional analysis provides the basis for good estimates.
• A repository of FP data provides the basis for calibrating estimates.

Ebert & Dumke (2007), Software Measurement, p. 188
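If the regression above is taken at face value (coefficients read as 7.79 and 43.50 after restoring the dropped decimal points; treat them as illustrative), an early size estimate is a one-liner:

```python
# Early functional-size estimate from counts of external inputs and outputs.
# Coefficients assumed from the regression shown on the slide.
def estimate_ufp(external_inputs, external_outputs):
    x = external_inputs + external_outputs
    return 7.79 * x + 43.50

print(f"~{estimate_ufp(60, 40):.0f} unadjusted function points")
```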

Effort – the Weakest Measure

Effort data is unreliable and inconsistent:

• After-the-fact estimates: memory lapses, time-splicing, inconsistency
• Under-reporting: contract issues, HR issues, impressions
• Lack of normalization: roles included, phases included, hours in a person-year

How Quality Affects Productivity

Assumption: productivity is a stable number.
Reality: productivity is unstable, tending to decline.

Incremental increases in technical debt cause a continuing decrease in productivity from the original baseline – unless you take action.

Carry-forward Rework

Release N: Develop N + Rework N → unfixed defects from release N
Release N+1: Develop N+1 + Rework N+1 + Rework N+0 → unfixed defects from releases N and N+1
Release N+2: Develop N+2 + Rework N+2 + Rework N+1 + Rework N+0

Each release's unfixed defects add rework to every later release, squeezing the effort available for new development.

Example of Quality Impact

Project A ("Plodders")
• 20 developers, 3 months; $120k per FTE
• 3 FPs per staff-month; 180 FPs delivered → $3,333/FP cost
• 2 critical violations per FP; $500 per fix
• Cost for 360 fixes = $180k
• Total Cost to Own = $780k → $4,333/FP of TCO

Project B ("Better, Faster, Cheaper")
• 20 developers, 3 months; $120k per FTE
• 4 FPs per staff-month; 240 FPs delivered → $2,500/FP cost
• 5 critical violations per FP; $500 per fix
• Cost for 1,200 fixes = $600k
• Total Cost to Own = $1,200k → $5,000/FP of TCO

By cost to deliver, Project B is 25% more productive. By total cost to own, however, Project A is 13.4% more productive.
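The slide's arithmetic can be replayed to show how quality adjustment reverses the ranking:

```python
# Cost per delivered FP, before and after charging each project
# for fixing its critical violations.
def cost_per_fp(devs, months, annual_cost, fp_per_sm, violations_per_fp, fix_cost):
    fps = fp_per_sm * devs * months             # staff-months x FP rate
    build = devs * annual_cost * months / 12    # cost to deliver
    fixes = fps * violations_per_fp * fix_cost  # future cost of the fixes
    return build / fps, (build + fixes) / fps

b_raw, b_tco = cost_per_fp(20, 3, 120_000, 4, 5, 500)  # Project B
a_raw, a_tco = cost_per_fp(20, 3, 120_000, 3, 2, 500)  # Project A
print(f"B: ${b_raw:,.0f}/FP delivered, ${b_tco:,.0f}/FP of TCO")
print(f"A: ${a_raw:,.0f}/FP delivered, ${a_tco:,.0f}/FP of TCO")
```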

Quality-Adjusted Productivity

Inputs:
• Automated Enhancement Points – size of both functional and non-functional code segments
• Effort & cost
• Automated technical debt – corrective effort in future releases for defects injected in this release

Future effort to fix bugs must be added into productivity calculations. Quality-adjusted productivity then feeds estimation, benchmarks, value & ROI, etc.

Best Practices for Analysis

1) Evaluate demographics
2) Segment baselines
3) Beware sampling effects
4) Understand variation
5) Investigate distributions
6) Account for maturity effects
7) Beware external data sets

1. Evaluate Demographics

[Chart: productivity in function points by application type – routine reports, COBOL apps, web-based apps, custom ERP, innovative mobile]

2. Segment Baselines

                        Year   Projects   Productivity
Total Corporate         1981      28         2342
                        1980      21         1939
Telecommunications      1981      14         1811
                        1980      12         1458
Engineering & Defense   1981       8         2965
                        1980       6         2739
Business Applications   1981       6         3054
                        1980       3         1813

Multiple baselines are usually the most valid.

3. Beware Sampling Effects

[Chart: lines of code per person-year (0-2,500) for the 1980, 1981, and 1982 baselines; apparent productivity shifts when one baseline contains lots of small projects and another lots of large projects]

4. Understand Variation

Variance in productivity explained:

• Product factors (R² = .16): timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size
• Project factors (R² = .49): concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation
• Full model R² = .65; 35% of the variation is not explained

Vosburgh, Curtis et al. (1984), Proceedings of ICSE

5. Investigate Distributions

[Chart: relative productivity (% of mean, -100% to +250%) by software engineering practices usage (low, medium, high); the distributions overlap widely within each band]

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution for External Data Sets

Before benchmarking your productivity and size data against an external data set (e.g., DACS), check:

• Data definitions
• Data collection
• Data validity
• Data age
• Demographics

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Measuring Project Progress

[Chart: planned rate of code growth vs. actual code growth (0-300K), Jan-Jun, with the completion of coding marked]

"The measures say we are on schedule."

[Chart: the same plot, with actual code growth continuing past the planned completion of coding]

"But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (0-150), Jan-Jun – total, open, and closed]

"And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements affected per month, Feb-Aug]

Month   Added   Changed   Deleted
Feb       2       20         1
Mar       2       11         1
Apr      45       35         3
May      15       61         1
Jun       1       20         4
Jul      15       15         0
Aug      18        5         2

Each added or changed requirement translates into person-hours of work or rework.

Phase Defect Removal Model

Defects per KSLOC:

PHASE        Escapes from     Defect      Subtotal   Removal         Defects   Escapes at
             previous phase   injection              effectiveness   removed   phase exit
Rqmts              0            1.2          1.2         0%             0         1.2
Design             1.2         18.0         19.2        76%           14.6        4.6
Code & UT          4.6         15.4         20.0        71%           14.2        5.8
Integration        5.8          0            5.8        67%            3.9        1.9
Sys Test           1.9          0            1.9        58%            1.1        0.8
Operation          0.8

Kan (1995), Metrics and Models in Software Quality Engineering
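The table's flow can be reproduced by carrying escapes forward through each phase:

```python
# Phase defect-removal flow: escapes_in + injected, times removal
# effectiveness, gives defects removed; the rest escape to the next phase.
# Figures (per KSLOC) are taken from the table above.
phases = [  # (phase, injected per KSLOC, removal effectiveness)
    ("Rqmts",       1.2, 0.00),
    ("Design",     18.0, 0.76),
    ("Code & UT",  15.4, 0.71),
    ("Integration", 0.0, 0.67),
    ("Sys Test",    0.0, 0.58),
]
escapes = 0.0
for phase, injected, effectiveness in phases:
    subtotal = escapes + injected
    removed = subtotal * effectiveness
    escapes = subtotal - removed
    print(f"{phase:12s} subtotal {subtotal:5.1f}  removed {removed:5.1f}  escapes {escapes:4.1f}")
print(f"escapes into operation: {escapes:.1f}/KSLOC")
```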

SCRUM Board

• The scrum board displays the path to completion for each story by showing task status.
• Story points are an estimate of days for each story.
• Agile estimating is by feature (story) rather than by task.
• "Done" does not always mean all tasks are finished.
• Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

• The burndown chart displays story completion by day, indicating actual versus estimated progress.
• Velocity is the rate at which stories are completed; it indicates sustainable progress.
• Velocity results provide one source of historical data for improving estimates.
• Story points without the context of productivity factors are dangerous for estimating.

Analyzing Velocity

[Charts over a 20-day iteration: team velocity (story points per hour, 0-5), hours per story point (0-5), item cycle times (0-20) for stories and hours, and the burndown chart]
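Velocity and hours-per-story-point are simple aggregates of sprint history; the numbers below are hypothetical:

```python
# Velocity and effort-per-point from sprint history (assumed data).
completed_points = [21, 25, 18, 24, 22]   # story points finished per sprint
hours_worked = [240, 250, 235, 245, 240]  # team hours per sprint

velocity = sum(completed_points) / len(completed_points)
hrs_per_point = [h / p for h, p in zip(hours_worked, completed_points)]
print(f"velocity = {velocity:.1f} points/sprint, "
      f"hours/point {min(hrs_per_point):.1f}-{max(hrs_per_point):.1f}")
```

A stable hours-per-point band is what makes velocity usable for estimation; a widening band signals that story points are drifting.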

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in-progress, and released items over time; the horizontal gap between the bands shows days to delivery, with outliers visible]

Anderson (2010), Kanban
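One common way to read a cumulative flow diagram is through Little's law, which ties work in progress, throughput, and average cycle time together; the board figures below are assumed:

```python
# Little's law: average cycle time = work in progress / throughput.
wip = 12          # items between the "backlogged" and "released" bands
throughput = 3.0  # items released per day
cycle_time = wip / throughput
print(f"average days to delivery = {cycle_time:.0f}")
```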

Trends Over Sprints

[Charts: story points delivered (0-80) and technical debt (0-4) across 12 sprints]

Track and correlate multiple measures across sprints; build predictive models.

Sources and Measures

Measures:
• Stories submitted
• Stories pulled back
• Growth in size of stories
• Stories added mid-sprint
• Stories under review
• Failed tests
• Non-development tasks
• Assists to other projects

Data sources:
• Project tracking
• Code control
• Bug tracking
• Build system
• Test environment
• Deployment tools
• Operational monitoring

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: current process, time recording, defect recording, defect type standards
• PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning
• PSP 1: size estimating, test report
• PSP 1.1: task planning, schedule planning

Personal Quality Management
• PSP 2: code reviews, design reviews
• PSP 2.1: design templates

Cyclic Personal Process
• PSP 3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from the personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data is Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals. Aggregated individual estimates: R² ranges from .70 to .87. Single team-level estimate: R² = .53.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defect detection by activity: personal reviews 33, compile 33, team reviews 19, unit test 15, test 14.

Fagan (2005)

TSP Benefits

Dramatic reductions in:
• Delivered defects
• Days per KLOC
• Schedule deviation
• Test defects/KLOC
• Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5: Optimizing – Innovation management
Level 4: Quantitatively Managed – Capability management
Level 3: Defined – Process management
Level 2: Managed – Project management
Level 1: Initial – Inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5 Innovating – Innovation management
Level 4 Optimized – Capability management
Level 3 Standardized – Organizational management
Level 2 Stabilized – Work unit management
Level 1 Initial – Inconsistent management

Maturity Level Transformation

Level 5: Opportunistic, proactive improvements – Empowered, agile culture
Level 4: Process and results managed statistically – Precision culture
Level 3: Organization develops standard processes – Common culture
Level 2: Project managers stabilize projects – Local culture
Level 1: Ad hoc, inconsistent development practices – Relationship culture

The transformation builds upward on trust: from individual practices, to projects, to the organization.

Measurement by Level

Level 5 – Exploratory data: ∆t = N × ROI
Level 4 – Predictive data: X̄ ± 1.96σ/√n
Level 3 – Process data: 8.3 defects per review
Level 2 – Management data: Effort = α·LOC^β
Level 1 – Haphazard data: "about a million lines"

A measure–maturity mismatch slows improvement and creates dysfunction.
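The Level 4 entry above (a mean with a 95% confidence interval) can be made concrete with a small sketch; the defect counts are illustrative data, not figures from the talk:

```python
import math

def ci95(samples):
    """95% confidence interval for the mean: x̄ ± 1.96·σ/√n."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    half = 1.96 * math.sqrt(var) / math.sqrt(n)
    return mean - half, mean + half

# Defects found per review in recent inspections (illustrative)
defects = [6, 9, 8, 7, 11, 9, 8, 10]
low, high = ci95(defects)
print(f"mean = {sum(defects)/len(defects):.2f}, 95% CI = ({low:.2f}, {high:.2f})")
```

At Level 4 an organization can make this kind of statement about its process; at Level 1 it cannot even produce the sample.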

Average Improvement Results

Improvement benefit          # orgs   Annual median   Annual range
Productivity growth             4         35%            9–67%
Pre-test defect detection       3         22%            6–25%
Time to market                  2        ↓19%           15–23%
Field error reports             5        ↓39%           10–94%
Return on investment            5        5.0:1          4:1–8.8:1

(↓ indicates reduction)  Herbsleb et al. (1994)

Schedule Performance

Schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled (below 1.0 is behind schedule; above 1.0, ahead).

An Air Force study of contractors found the schedule performance index rising with CMM maturity level (levels 1–3).

Flowe & Thordahl (1994)

Cost Performance

Cost performance index = budgeted cost of work performed / actual cost of work performed (below 1.0 is a cost overrun; above 1.0, an underrun).

The same Air Force study of contractors found the cost performance index also improving with CMM maturity level (levels 1–3).

Flowe & Thordahl (1994)
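The two earned-value indices on these slides can be sketched directly; the project snapshot numbers are assumptions for illustration:

```python
def spi(bcwp, bcws):
    """Schedule performance index: earned value over planned value."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost performance index: earned value over actual cost."""
    return bcwp / acwp

# Illustrative project snapshot (not from the Air Force study)
bcwp, bcws, acwp = 400_000, 500_000, 450_000
print(f"SPI = {spi(bcwp, bcws):.2f}")  # below 1.0: behind schedule
print(f"CPI = {cpi(bcwp, acwp):.2f}")  # below 1.0: over budget
```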

Crosby's Cost of Quality Model

Project cost splits into performance and cost of quality; cost of quality splits into conformance (prevention plus appraisal) and nonconformance.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits.

Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Crosby (1979), Quality Is Free; Dion (1993)

Raytheon's Cost of Quality

• Performance – cost of building it right the first time
• Nonconformance – cost of rework
• Appraisal – cost of testing
• Prevention – cost of preventing nonconformance

Percent of project cost:

Year  Level  Performance  Nonconformance  Appraisal  Prevention
1988    1        34%           41%           15%         7%
1990    2        55%           18%           15%        12%
1992    3        66%           11%
1994    4        76%            6%           7.5%

(Appraisal and prevention together accounted for the remaining 23% in 1992 and 18% in 1994.)

Dion (1993); Haley (1995)

Raytheon's Productivity

Chart: productivity growth from 1988 to 1994, relative to the 1989 baseline (scale 0–4×).

Lydon (1995)

Raytheon's Cost Reduction

Chart: unit cost from 1988 to 1994, shown as a reduction relative to the 1989 baseline (scale 0–1.2).

Lydon (1995)

Raytheon's Cost Predictability

Chart: actual versus budgeted cost by quarter, 1988–1994. Early quarters show cost over-runs of up to 40–50%; by 1994 actual cost tracks budget closely.

Haley (1995)

OMRON's Reuse Gains

Chart: program size in steps (0–3,000) and percentage of reuse (0–100%), 1987–1994.

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product? CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it."
– Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for Six Sigma

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first come the projects with huge benefits, then large, good, okay, and small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded…testing cannot deliver the level of confidence required at a reasonable cost." – Jackson (2009)

"The correctness of the code is rarely the weakest link." – Jackson (2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic…non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability…The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." – Spinellis (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data, feeding quality measurements on five health factors: Transferability, Changeability, Robustness, Performance, Security.

Detected violations (examples):
• Expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/NetWeaver, Tibco, Business Objects, plus a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

Modern applications span many technologies (Java, JSP, ASP.NET, .NET, COBOL, EJB, PL/SQL, T-SQL, Hibernate, Spring, Struts, web services, APIs, messaging, and databases such as Oracle, SQL Server, DB2, Sybase, and IMS), so structural quality must be measured at three levels:

Level 1 – Unit (developer level): code style and layout; expression complexity; code documentation; class or program design; basic coding standards.

Level 2 – Technology (development team level): a single language/technology layer; intra-technology architecture; intra-layer dependencies; inter-program invocation; security vulnerabilities.

Level 3 – System (IT organization level): architectural compliance; integration quality; risk propagation via data flow and transaction risk; application security; resiliency checks; transaction integrity; function points and effort estimation; data access control; SDK versioning; calibration across technologies.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are 8% of total application defects but account for 52% of total repair effort (code unit-level violations are the remaining 92% and 48%). They are a primary cause of operational problems and take 20× as many fixes to correct. Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system – it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated – it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path – from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.

Customers care most about the dependability of their transactions. Transactions can be ranked by risk to prioritize their remediation, and the Transaction Risk Index can be further tuned using the operational frequency of each transaction.

Risk-Based Prioritization

Violations ordered by severity within health factor, with flags for the Propagated Risk Index, membership in a high-risk transaction, and frequency of path execution:

Violation  Health factor    Severity
148        Security            9
371        Robustness          9
062        Performance         8
284        Robustness          7
016        Changeability       7
241        Security            5
173        Transferability     4
359        Transferability     3
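A prioritization like the one above can be sketched as a composite score; this is an illustrative scheme, not CAST's actual algorithm, and the propagation counts and hot-path flags are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Violation:
    vid: str
    health_factor: str
    severity: int      # 1 (low) .. 9 (high)
    propagation: int   # number of components impacted (breadth)
    on_hot_path: bool  # lies on a high-risk transaction path

def risk_score(v: Violation) -> float:
    # Severity scaled by breadth of propagation, doubled on hot paths
    score = v.severity * v.propagation
    return score * 2 if v.on_hot_path else score

violations = [
    Violation("148", "Security",   9, 12, True),
    Violation("284", "Robustness", 7, 40, True),
    Violation("342", "Security",   8,  3, False),
]
for v in sorted(violations, key=risk_score, reverse=True):
    print(v.vid, risk_score(v))
```

Note how the widely propagated violation 284 outranks the higher-severity but narrowly propagated ones, matching the argument on the Propagated Risk slide.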

Static Analysis in the US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

Chart: production incidents (0–180) versus Technical Quality Index (100–400) for business-critical applications at a large international investment bank. The lowest-quality applications ran annual losses above $2,000,000; mid-range applications about $50,000; the highest-quality about $5,000.

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor        # apps   Robust  Security  Perform  Change  Transfer
Technology      1606     20%      3%       16%      26%      7%
  (e.g., Java EE)
Industry         677      5%      5%        7%       4%      6%
App type         510      1%      1%        1%       1%      1%
Source           580      1%      1%                         4%
Shore            540
Maturity          72     28%     25%       15%      24%     12%
Method           208     10%      6%       14%       6%
Team size        186      7%      5%        8%       8%
# of users       262      4%      3%        5%       4%

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release – a component of the cost of ownership. Structural quality problems in production code constitute the debt.

Principal – the cost of fixing problems remaining in the code after release that must be remediated.

Interest – continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business risk (liability) – business costs related to outages, breaches, corrupted data, etc.

Opportunity cost – benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
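A minimal sketch of a principal estimate in the spirit of Curtis, Sappidi & Szynkarski (2012): for each severity tier, count violations, then assume a fraction worth fixing, an average repair time, and a blended labor rate. All of the numbers below are assumptions, not figures from the paper:

```python
def debt_principal(counts, fix_fraction, hours_per_fix, rate_per_hour=75.0):
    """Sum over severity tiers: violations × share fixed × hours × rate."""
    total = 0.0
    for sev in counts:
        total += counts[sev] * fix_fraction[sev] * hours_per_fix[sev] * rate_per_hour
    return total

counts        = {"high": 120, "medium": 400, "low": 900}   # violations found
fix_fraction  = {"high": 0.5, "medium": 0.25, "low": 0.1}  # share worth fixing
hours_per_fix = {"high": 2.5, "medium": 1.0,  "low": 0.5}

print(f"Estimated principal: ${debt_principal(counts, fix_fraction, hours_per_fix):,.0f}")
```

The interest, liability, and opportunity-cost components of the metaphor would require operational and business data beyond what a static analyzer reports.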

Tech Debt by Quality Factor

Distribution of technical debt: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%, Performance 5%.

70% of technical debt is IT cost (Transferability, Changeability); 30% is business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.
Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.
Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.
Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.
Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.
Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.
Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.
Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.
Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.
Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.
Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.
Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.
Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.
Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.
George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.
Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.
Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.
Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).
Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.
Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.
Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.
Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.
Peterman, W. (2000). IEEE Software, July/August issue.
Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.
Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.
Sinha, M. (2006). Performance models – enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.
Spinellis, D. (2006). Code Quality. Addison-Wesley.
Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 3


Measurement Process Guidance

• Based on decades of software measurement best practice
• Best guidance for starting a measurement program
• Compatible with ISO 15939 – Software Measurement Process
• Free guides & licensed training and consulting available
• http://psmsc.com

McGarry et al. (2002), Practical Software Measurement

Measurement Categories

Category                Issues                                    Example measures
Schedule & Progress     Milestone completion;                     Earned value;
                        actual vs. planned dates                  actual vs. planned completions
Resources & Cost        Personnel effort; facilities used         Person-hours; test equipment & hours
Product Size            Size of coded product;                    Function points;
                        size of documentation                     pages of manuals
Product Quality         Functional quality; structural quality    Defects per KLOC; violations of reliability rules
Process Performance     Efficiency; compliance                    Defect detection efficiency; audit results
Product Value           Return on investment; risk avoidance      ∆ increase in target revenue; cost of system outage
Customer Satisfaction   Account growth; self-sufficiency          Repeat business within 1 year; number of helpdesk calls

McGarry et al. (2002), Practical Software Measurement, p. 37

Productivity Analysis Objectives

Productivity analysis serves four objectives: improvement, estimation, benchmarking, and managing vendors.

Productivity Analysis Measures

Primary measures:
• Size – instructions, functions, requirements
• Effort – hours, roles, phases

Adjustment measures:
• Quality – functional, structural, behavioral
• Demographics – application, project, organization

Software Productivity

Software Productivity = Size of software produced / Total effort expended to produce it

Release Productivity = Size of software developed, deleted, or modified / Total effort expended on the release

Software Size Measures

Instructions (lines of code): most frequently used. Different definitions of a line can cause counts to vary by 10×. Smaller programs often accomplish the same functionality with higher-quality coding.

Functions (function points): popular in IT. Several counting schemes exist (IFPUG, NESMA, Mark II, COSMIC, etc.). Manual counting is expensive and subjective – certified counters can differ by 10%. Automated FPs are taking root.

Requirements-based (use case points, story points): use case points have not become widely used and need more development. Story points are subjective to each team and are susceptible to several forms of bias.

How Many Lines of Code?

#define LOWER 0    /* lower limit of table */
#define UPPER 300  /* upper limit */
#define STEP  20   /* step size */

main() /* print a Fahrenheit-Celsius conversion table */
{
    int fahr;
    for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
        printf("%4d %6.1f\n", fahr, (5.0/9.0)*(fahr-32));
}

Chart: when software professionals were asked how many lines of code this program contains, answers ranged from 1 to 10, with votes spread across the whole range.

Park (1992)

Function Point Estimation

Functional view of software: a data store with entries, exits, reads, and writes. Functional size can be estimated from external inputs and outputs.

Regressing unadjusted function points on external inputs plus external outputs (EIs + EOs, 0–1000) yields approximately y = 7.79x + 43.50 with R² = .95.

Upfront functional analysis provides the basis for good estimates; a repository of FP data provides the basis for calibrating estimates.

Ebert & Dumke (2007), Software Measurement, p. 188
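The regression above can be used as an early sizing sketch; note that the decimal placement of the coefficients is reconstructed from the slide, so treat them as illustrative:

```python
# Early functional-size estimate from counts of external inputs/outputs,
# using the regression reported by Ebert & Dumke (y = 7.79x + 43.50).
def estimate_ufp(external_inputs, external_outputs):
    x = external_inputs + external_outputs
    return 7.79 * x + 43.50

# Hypothetical system with 120 external inputs and 80 external outputs
print(round(estimate_ufp(120, 80)), "unadjusted function points (approx.)")
```

An organization's own repository of counted FP data should replace these coefficients once enough history exists.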

Effort – the Weakest Measure

Effort data is unreliable and inconsistent:

After-the-fact estimates: memory lapses; time-splicing; inconsistency.
Under-reporting: contract issues; HR issues; impressions.
Lack of normalization: roles included; phases included; hours in a person-year.

How Quality Affects Productivity

Assumption: productivity is a stable number. Reality: productivity is unstable, tending to decline. Starting from the original productivity baseline, incremental increases in technical debt cause a continuing decrease in productivity – unless you take action.

Carry-forward Rework

Release N:   Develop N + Rework N; unfixed defects from release N carry forward.
Release N+1: Develop N+1 + Rework N+1 + Rework N+0; unfixed defects from releases N and N+1 carry forward.
Release N+2: Develop N+2 + Rework N+2 + Rework N+1 + Rework N+0.

Each release spends a growing share of its effort reworking defects carried forward from earlier releases.
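The carry-forward pattern can be sketched numerically; the 15% carry rate and fixed capacity are assumptions for illustration:

```python
# Sketch: how carry-forward rework erodes effective development capacity.
# Assume each release leaves behind rework equal to 15% of its development
# effort, which must be serviced in every later release until retired.
def effective_capacity(releases, carry_rate=0.15, capacity=100.0):
    """Capacity left for new work in each successive release."""
    out = []
    debt = 0.0
    for _ in range(releases):
        develop = capacity - debt      # effort remaining for new work
        debt += develop * carry_rate   # unfixed defects carried forward
        out.append(round(develop, 1))
    return out

print(effective_capacity(4))
```

The steady decline this produces is the "continuing decrease in productivity" of the previous slide.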

Example of Quality Impact

Project A (Plodders)
– 20 developers, 3 months
– $120k per FTE
– 3 FPs per staff month
– 180 FPs delivered
⇒ $3,333/FP cost
– 2 critical violations per FP
– $500 per fix
– Cost for 360 fixes = $180k
– Total Cost to Own = $780k
⇒ $4,333/FP of TCO

Project B (Better, Faster, Cheaper)
– 20 developers, 3 months
– $120k per FTE
– 4 FPs per staff month
– 240 FPs delivered
⇒ $2,500/FP cost
– 5 critical violations per FP
– $500 per fix
– Cost for 1,200 fixes = $600k
– Total Cost to Own = $1,200k
⇒ $5,000/FP of TCO

On delivery cost alone, Project B is 25% more productive. However, on total cost of ownership, Project A is 13.4% more productive.
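The two projects' arithmetic can be checked directly:

```python
# Recomputing the slide's quality-adjusted productivity example:
# total cost of ownership per delivered function point.
def tco_per_fp(devs, months, cost_per_fte_year, fps_per_staff_month,
               critical_violations_per_fp, cost_per_fix):
    staff_months = devs * months
    dev_cost = staff_months * cost_per_fte_year / 12
    fps = staff_months * fps_per_staff_month
    fix_cost = fps * critical_violations_per_fp * cost_per_fix
    return (dev_cost + fix_cost) / fps

a = tco_per_fp(20, 3, 120_000, 3, 2, 500)   # Project A, "Plodders"
b = tco_per_fp(20, 3, 120_000, 4, 5, 500)   # Project B, "Better, Faster, Cheaper"
print(f"Project A: ${a:,.0f}/FP   Project B: ${b:,.0f}/FP")
```

The faster team only looks more productive until the cost of fixing its critical violations is charged back to the release.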

Quality-Adjusted Productivity

Quality-adjusted productivity feeds estimation, benchmarks, value & ROI analysis, etc. It requires:

• Size: Automated Enhancement Points – the size of both functional and non-functional code segments
• Effort & cost: corrective effort in future releases for defects injected in this release
• Quality: Automated Technical Debt

Future effort to fix bugs must be added into productivity calculations.

Best Practices for Analysis

1) Evaluate demographics
2) Segment baselines
3) Beware sampling effects
4) Understand variation
5) Investigate distributions
6) Account for maturity effects
7) Caution with external data sets

1. Evaluate Demographics

Chart: productivity varies widely across application demographics – routine reports, COBOL apps, web-based apps, custom ERP, and innovative mobile apps – so productivity in function points must be interpreted against the application mix.

2. Segment Baselines

Segment                 Year   Projects   Productivity
Total Corporate         1981      28         2342
                        1980      21         1939
Telecommunications      1981      14         1811
                        1980      12         1458
Engineering & Defense   1981       8         2965
                        1980       6         2739
Business Applications   1981       6         3054
                        1980       3         1813

Multiple baselines are usually the most valid.

3. Beware Sampling Effects

Chart: lines of code per person-year across baselines in 1980, 1981, and 1982 (scale 0–2,500). A baseline drawn from lots of small projects looks far more productive than one drawn from lots of large projects, so shifts in project mix can masquerade as productivity change.

4. Understand Variation

Product factors (R² = .16): timing constraints; memory utilization; CPU occupancy; resource constraints; complexity; product size.

Project factors (R² = .49): concurrent development; development computer; requirements definition; requirements churn; programming practices; personnel experience; client experience; client participation.

Together the model explains 65% of the variation (R² = .65); 35% is not explained.

Vosburgh, Curtis et al. (1984), Proceedings of ICSE

5. Investigate Distributions

Chart: relative productivity (% of mean, from -100% to +250%) versus low, medium, and high usage of software engineering practices. Mean productivity rises with practice usage, but the spread within each band is large – distributions, not just means, tell the story.

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution with External Data Sets

Comparing your productivity and size data against an external data set (e.g., DACS) raises questions of data definitions, data collection, data validity, data age, and demographics – check all five before benchmarking.

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Measuring Project Progress

Chart 1: planned rate of code growth versus actual code growth (0–300 KLOC, Jan–Jun), with completion of coding marked. "The measures say we are on schedule."

Chart 2: the same curves extended past the planned completion of coding. "But the code keeps growing. We should have measured the requirements growth."

Chart 3: defect reports – total, open, and closed (0–150, Jan–Jun). "And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

Number of requirements affected, and person-hours of work or rework, by month:

Month   Requirements added   Requirements changed   Requirements deleted
Feb              2                    20                      1
Mar              2                    11                      1
Apr             45                    35                      3
May             15                    61                      1
Jun              1                    20                      4
Jul             15                    15                      0
Aug             18                    05                      2

Phase Defect Removal Model

Defects per KSLOC by phase:

Phase         Escapes from     Defects    Subtotal   Removal         Defects    Escapes at
              previous phase   injected              effectiveness   removed    phase exit
Rqmts              0             1.2         1.2          0%            0          1.2
Design             1.2          18.0        19.2         76%           14.6        4.6
Code & UT          4.6          15.4        20.0         71%           14.2        5.8
Integration        5.8           0           5.8         67%            3.9        1.9
Sys Test           1.9           0           1.9         58%            1.1        0.8
Operation          0.8

Kan (1999), Metrics and Models in Software Quality Engineering
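The table's flow can be reproduced with a short sketch: each phase receives the escapes from the previous phase, adds its injected defects, removes a fraction, and passes the rest on (all figures in defects per KSLOC, taken from the table above):

```python
def defect_flow(phases):
    """phases: (name, defects injected, removal effectiveness) per phase."""
    escapes = 0.0
    rows = []
    for name, injected, effectiveness in phases:
        subtotal = escapes + injected
        removed = subtotal * effectiveness
        escapes = subtotal - removed
        rows.append((name, round(subtotal, 1), round(removed, 1), round(escapes, 1)))
    return rows

phases = [
    ("Rqmts",       1.2,  0.00),
    ("Design",     18.0,  0.76),
    ("Code & UT",  15.4,  0.71),
    ("Integration", 0.0,  0.67),
    ("Sys Test",    0.0,  0.58),
]
for row in defect_flow(phases):
    print(row)
```

Running the model reproduces the table's escape column, ending at 0.8 defects per KSLOC escaping into operation.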

SCRUM Board

• The scrum board displays the path to completion for each story by showing task status.
• Story points are an estimate of days for each story.
• Agile estimating is by feature (story) rather than by task.
• 'Done' does not always mean all tasks are finished.
• Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

• A burndown chart displays story completion by day, indicating actual versus estimated progress.
• Velocity is the rate at which stories are completed; it indicates sustainable progress.
• Velocity results provide one source of historical data for improving estimates.
• Story points without the context of productivity factors are dangerous for estimating.

Analyzing Velocity

Charts over a 20-day iteration: team velocity (story points per hour, 0–5), hours per story point (0–5), item cycle times (0–20), and a burndown chart tracking stories, hours, and hours per story point (0–60).
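The two headline numbers from these charts are straightforward to compute; the daily figures below are illustrative, not taken from the slides:

```python
# Velocity and hours-per-story-point from cumulative daily totals.
story_points_done = [0, 3, 5, 9, 14, 20]   # cumulative story points by day
hours_spent       = [0, 8, 15, 24, 36, 50] # cumulative hours by day

velocity = story_points_done[-1] / len(story_points_done[1:])  # points per day
hours_per_point = hours_spent[-1] / story_points_done[-1]
print(f"velocity = {velocity:.1f} points/day, {hours_per_point:.1f} h/point")
```

Plotting these per day, as on the slide, exposes whether the rate is sustainable or propped up by end-of-sprint heroics.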

Measuring and Managing Flow

A cumulative flow diagram plots backlogged, in-progress, and released items over time; the horizontal gap between the bands is days to delivery, and outliers stand out.

Anderson (2010), Kanban
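One standard way to read that gap (not named on the slide) is Little's law: average time in the system equals work in progress divided by throughput. A minimal sketch with assumed numbers:

```python
# Little's law applied to a cumulative flow diagram:
# average days to delivery ≈ work in progress / items released per day.
def days_to_delivery(wip_items, items_released_per_day):
    return wip_items / items_released_per_day

# Illustrative: 30 items in progress, 2.5 items released per day
print(days_to_delivery(30, 2.5), "days to delivery (average)")
```

This is why Kanban's work-in-progress limits shorten delivery times even when throughput stays constant.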

Trends Over Sprints

Charts: story points delivered per sprint (0–80 across 12 sprints) and technical debt per sprint (0–4.0 across 12 sprints). Track and correlate multiple measures across sprints to build predictive models.

Sources and Measures

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring.

Measures: stories submitted; stories pulled back; growth in size of stories; stories added mid-sprint; stories under review; failed tests; non-development tasks; assists to other projects.

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
PSP 0: Current process, time recording, defect recording, defect type standards
PSP 0.1: Coding standard, size measurement, process improvement proposal

Personal Planning
PSP 1: Size estimating, test report
PSP 1.1: Task planning, schedule planning

Personal Quality Management
PSP 2: Code reviews, design reviews
PSP 2.1: Design templates

Cyclic Personal Process
PSP 3: Cyclic development

Humphrey (1997) Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000) Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

R²s for aggregated individual estimates range from .70 to .87, versus R² = .53 for a single team-level estimate.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defect detection by stage: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%

Fagan (2005)

TSP Benefits

Dramatic reductions in:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006) TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing: innovation management
Level 4 Quantitatively Managed: capability management
Level 3 Defined: process management
Level 2 Managed: project management
Level 1 Initial: inconsistent management

Chrissis, Konrad & Shrum (2005) CMMI

Enhanced Maturity Framework

Level 5 Innovating: innovation management
Level 4 Optimized: capability management
Level 3 Standardized: organizational management
Level 2 Stabilized: work unit management
Level 1 Initial: inconsistent management

Maturity Level Transformation

Level 5: Proactive, opportunistic improvements; Empowered, Agile culture
Level 4: Process and results managed statistically; Precision culture
Level 3: Organization develops standard processes; Common culture
Level 2: Project managers stabilize projects; Local culture
Level 1: Ad hoc, inconsistent development practices; Relationship culture

[Diagram annotations: Organization, Projects, Individual; Trust]

Measurement by Level

Level 5: Exploratory data, e.g. ∆t = N × ROI
Level 4: Predictive data, e.g. X̄ ± (1.96σ / √n)
Level 3: Process data, e.g. 8.3 defects per review
Level 2: Management data, e.g. Effort = α·LOC^β
Level 1: Haphazard data, e.g. "about a million lines"

Measure-maturity mismatch slows improvement and creates dysfunction
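The Level 4 formula above is the standard 95% confidence interval for a process mean. A minimal sketch with invented review data:

```python
import math

# Sketch of the Level 4 "predictive data" formula: a 95% confidence
# interval for a process mean, X-bar ± 1.96·sigma/√n. Data invented.
review_rates = [8.1, 7.6, 9.0, 8.4, 7.9, 8.6, 8.2, 8.9]  # defects per review

n = len(review_rates)
mean = sum(review_rates) / n
sigma = math.sqrt(sum((x - mean) ** 2 for x in review_rates) / (n - 1))
half_width = 1.96 * sigma / math.sqrt(n)

print(f"{mean:.2f} ± {half_width:.2f}")
```

The slide's caution applies: software data often violate the independence and stationarity assumptions behind such intervals, so this is a Level 4 tool, not a universal one.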

Average Improvement Results

Improvement benefit         Orgs   Annual median   Annual range
Productivity growth           4        35%            9% - 67%
Pre-test defect detection     3        22%            6% - 25%
Time to market                2       ↓19%           15% - 23%
Field error reports           5       ↓39%           10% - 94%
Return on investment          5       501%           41% - 881%

Schedule Performance

[Chart: schedule performance index (0.2-1.6; below 1.0 is behind, above is ahead) by CMM maturity level 1-3]

Schedule performance index = Budgeted cost of work performed / Budgeted cost of work scheduled

Air Force study of contractors
Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (0.5-2.5; below 1.0 is overrun, above is underrun) by CMM maturity level 1-3]

Cost performance index = Budgeted cost of work performed / Actual cost of work performed

Air Force study of contractors
Flowe & Thordahl (1994)

Crosby's Cost of Quality Model

Project Cost = Performance + Cost of Quality

Performance: generating plans, documentation; development (requirements, design, code, integration)

Cost of Quality, Conformance
Appraisal: reviews (system, requirements, design, test plan, test scripts), walkthroughs, testing (first time), independent verification and validation (first time), audits
Prevention: training, policies, procedures and methods, tools, planning, quality improvement projects, data gathering and analysis, root cause analysis, quality reporting

Cost of Quality, Nonconformance
Fixing defects, reworking any document, updating source code, re-reviews, re-tests, lab costs for re-tests, patches (internal and external), engineering changes, change control boards, external failures and fines, customer support, help desks

Dion (1993); Crosby (1979) Quality Is Free

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   Level   Perform   Nonconf   Appraise   Prevent
1988     1       34%       41%       15%        7%
1990     2       55%       18%       15%       12%
1992     3       66%       11%        –          –
1994     4       76%        6%       7.5%        –

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988-1994]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988-1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual cost versus budgeted cost, percent over-run/under-run from +50% to -10%, by quarter, 1988-1994]

Haley (1995)

OMRON's Reuse Gains

[Chart: steps produced (0-3,000) and percent reuse (0-100%) per year, 1987-1994]

Sakamoto et al (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption:
"The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality
2. Compliance focus undermines CMMI's primary assumption
3. Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI
4. The assumptions underlying control charts are violated by software data

Six Sigma's Challenge

Process focus: process improvement (Six Sigma)
Product focus: product improvement (Design for 6σ)

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, large benefits, good benefits, okay benefits, small benefits... now what?

Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." Jackson, D. (2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules; application meta-data

Quality Measurements: Transferability, Changeability, Robustness, Performance, Security

Detected Violations (examples):
• Performance: expensive operation in loop, static vs pooled connections, complex query on big table, large indices on big table
• Robustness: empty CATCH block, uncontrolled data access, poor memory management, opened resource not closed
• Security: SQL injection, cross-site scripting, buffer overflow, uncontrolled format string
• Transferability: unstructured code, misuse of inheritance, lack of comments, violated naming convention
• Changeability: highly coupled component, duplicated code, index modified in loop, high cyclomatic complexity

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, Universal Analyzer for other languages

Modern Apps are a Technology Stack

[Diagram: architectural compliance, data flow, and transaction risk across the stack. Technologies include EJB, PL/SQL, Oracle, SQL Server, DB2, T-SQL, Hibernate, Spring, Struts, .NET, COBOL, Sybase, IMS, messaging, Java, JSP, ASP.NET, APIs, web services]

Level 1, Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards

Level 2, Technology (development team level): single language/technology layer; intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities

Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function point and effort estimation, data access control, SDK versioning, calibration across technologies

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct.

They account for 8% of total application defects (code unit-level violations are the other 92%) but 52% of total repair effort (48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 342 in Component A is only narrowly propagated in the system; it presents low risk. The impact of violation 284 in Component B is very widely propagated throughout the system; it presents very high risk.

Remediating violation 284 in Component B will have the greater impact on reducing risk and cost.
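A hedged sketch of this idea (invented violation data; the real index is proprietary): weight each violation's severity by how many components its impact reaches, then rank.

```python
# Sketch of a propagated-risk ranking: severity weighted by how many
# components a violation's impact reaches. IDs and values are invented.
violations = {
    # id: (severity 1-9, components reached by impact propagation)
    284: (7, 120),
    342: (7, 4),
    148: (9, 15),
}

pri = {vid: sev * reach for vid, (sev, reach) in violations.items()}
ranked = sorted(pri, key=pri.get, reverse=True)

print(ranked)  # [284, 148, 342]
```

Violation 284 ranks first despite a lower severity than 148, matching the slide's point that breadth of propagation can outweigh raw severity.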

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation, and the Transaction Risk Index can be further tuned using the operational frequency of each transaction.

[Diagram: five transactions ranked; transaction 1 poses greatest risk]
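A minimal sketch of such a ranking (invented transactions and severities): sum the severities along each path, optionally weighting by operational frequency as the slide suggests.

```python
# Sketch of a transaction-risk ranking: sum violation severities along
# each transaction's path; optionally weight by frequency. Data invented.
transactions = {
    # name: (severities of violations along the path, executions per day)
    "t1": ([9, 7, 7, 5], 1500),
    "t2": ([4, 3], 9000),
    "t3": ([8, 5], 200),
}

def tri(severities, freq, weight_by_freq=False):
    base = sum(severities)
    return base * freq if weight_by_freq else base

ranked = sorted(transactions, key=lambda t: tri(*transactions[t]), reverse=True)
print(ranked)  # ['t1', 't3', 't2']
```

Turning on the frequency weight can reorder the list, which is exactly the tuning the slide describes.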

Risk-Based Prioritization

Violations ranked by health factor, severity, and Propagated Risk Index, weighted by whether they sit in a high-risk transaction and by the frequency of path execution:

Violation   Health factor     Rank
  148       Security            9
  371       Robustness          9
  062       Performance         8
  284       Robustness          7
  016       Changeability       7
  241       Security            5
  173       Transferability     4
  359       Transferability     3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects
• Static analysis before testing is 85% efficient
• With testing and static analysis combined, test schedules are reduced by 50%

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0-180) versus Technical Quality Index for business-critical applications at a large international investment bank; annual losses fall from over $2,000,000 at low TQI to $50,000 and then $5,000 at higher TQI]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors:

Factor        Apps   Robust   Security   Perform   Change   Transfer
Technology    1606     20%       3%        16%       26%       7%
Industry       677      5%       5%         7%        4%       6%
App type       510      1%       1%         1%        1%       1%
Source         580      1%       1%         4%        –        –
Shore          540      –        –          –         –        –
Maturity        72     28%      25%        15%       24%      12%
Method         208     10%       6%        14%        6%       –
Team size      186      7%       5%         8%        8%       –
# of users     262      4%       3%         5%        4%       –

2017 CRASH Report: request from info@castsoftware.com

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Structural quality problems in production code create:
• Principal: cost of fixing problems remaining in the code after release that must be remediated
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e. higher maintenance costs, greater resource usage, etc.
• Liability: business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Today's talk focuses on the principal.

Curtis et al (2012) IEEE Software

Tech Debt by Quality Factor

Transferability 40%
Changeability 30%
Robustness 18%
Security 7%

70% of technical debt is in IT cost (Transferability, Changeability)
30% of technical debt is in business risk (Robustness, Performance, Security)

Health factor proportions are mostly consistent across technologies.

Curtis et al (2012) IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI: Benefits from Continuous Software Improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of the TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of the TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort: Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3


Measurement Categories

Category                 Issue                                          Example measure
Schedule & Progress      Milestone completion; actual vs planned dates  Earned value; actual vs planned completions
Resources & Cost         Personnel effort; facilities used              Person-hours; test equipment & hours
Product Size             Size of coded product; size of documentation   Function Points; pages of manuals
Product Quality          Functional quality; structural quality         Defects per KLOC; violations of reliability rules
Process Performance      Efficiency; compliance                         Defect detection efficiency; audit results
Product Value            Return on investment; risk avoidance           ∆ increase in target revenue; cost of system outage
Customer Satisfaction    Account growth; self-sufficiency               Repeat business within 1 year; number of helpdesk calls

McGarry et al (2002) Practical Software Measurement, p. 37

Productivity Analysis Objectives

Improvement, Estimation, Benchmarking, Managing Vendors

Productivity Analysis Measures

Primary measures:
• Size: instructions, functions, requirements
• Effort: hours, roles, phases

Adjustment measures:
• Quality: functional, structural, behavioral
• Demographics: application, project, organization

Software Productivity

Software Productivity = Size of software produced / Total effort expended to produce it

Release Productivity = Size of software developed, modified, or deleted / Total effort expended on the release
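As a minimal sketch (sizes in function points, effort in person-months, all numbers invented), the two ratios differ only in what counts as size:

```python
# Sketch of the two productivity ratios. A release touches more size than
# it "produces": developed + modified + deleted code all consume effort.
def productivity(size, effort):
    return size / effort

software_productivity = productivity(240, 60)           # FPs produced / effort
release_productivity = productivity(240 + 35 + 12, 60)  # developed + modified + deleted

print(software_productivity)             # 4.0
print(round(release_productivity, 2))    # 4.78
```

Using the narrower "produced" size systematically understates the work in maintenance-heavy releases.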

Software Size Measures

Instructions (Lines of Code): most frequently used. Different definitions of a line can cause counts to vary by 10x. Smaller programs often accomplish the same functionality with higher-quality coding.

Requirements-based (Use Case Points, Story Points): Use Case Points have not become widely used and need more development. Story points are subjective to each team and are susceptible to several forms of bias.

Functions (Function Points): popular in IT. Several counting schemes (IFPUG, NESMA, Mark II, COSMIC, etc.). Manual counting is expensive and subjective; certified counters can differ by 10%. Automated FPs taking root.

How Many Lines of Code?

    #define LOWER 0    /* lower limit of table */
    #define UPPER 300  /* upper limit */
    #define STEP  20   /* step size */

    main()  /* print a Fahrenheit-Celsius conversion table */
    {
        int fahr;
        for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
            printf("%4d %6.1f\n", fahr, (5.0/9.0)*(fahr-32));
    }

[Histogram: number of votes for each lines-of-code count, 1-12; counts of the same program varied widely]

B. Park (1992)

Function Point Estimation

[Chart: Unadjusted Function Points (0-8,000) versus EIs + EOs (0-1,000); regression fit y = 7.79x + 43.50, R² = .95]

Functional view of software: entries and exits between the software and its data stores (reads, writes).

Functional size can be estimated from external inputs and outputs. Upfront functional analysis provides a basis for good estimates. A repository of FP data provides a basis for calibrating estimates.

Ebert & Dumke (2007) Software Measurement, p. 188
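A sketch of using such a fitted line to estimate size early. The coefficients below (7.79 and 43.50) are reconstructed from the garbled figure text, so treat them as illustrative; any real use should refit against a local FP repository.

```python
# Sketch: estimating Unadjusted Function Points from a count of external
# inputs and outputs using a fitted regression line. Coefficients are
# reconstructed from the slide and should be recalibrated locally.
def estimate_ufp(ei_plus_eo, slope=7.79, intercept=43.50):
    return slope * ei_plus_eo + intercept

print(estimate_ufp(400))
```

With R² = .95, roughly 5% of size variation remains unexplained, so the estimate is a planning input, not a count.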

Effort: the Weakest Measure

Effort data is unreliable and inconsistent:

After-the-fact estimates: memory lapses, time-splicing, inconsistency
Under-reporting: contract issues, HR issues, impressions
Lack of normalization: roles included, phases included, hours in a person-year

How Quality Affects Productivity

Assumption: productivity is a stable number.
Reality: productivity is unstable, tending to decline.

Incremental increases in technical debt cause a continuing decrease in productivity from the original baseline, unless you take action.

Carry-forward Rework

Release N: Develop N, Rework N; unfixed defects from release N carry forward
Release N+1: Develop N+1, Rework N+1, Rework N+0; unfixed defects from releases N and N+1 carry forward
Release N+2: Develop N+2, Rework N+2, Rework N+1, Rework N+0

Rework on defects left unfixed in earlier releases consumes a growing share of each new release.

Example of Quality Impact

Project B (Better, Faster, Cheaper):
– 20 developers, 3 months, $120k per FTE
– 4 FPs per staff month; 240 FPs delivered
– $2,500/FP cost
– 5 critical violations per FP, $500 per fix
– Cost for 1,200 fixes = $600k
– Total Cost to Own = $1,200k, or $5,000/FP of TCO

Project A (Plodders):
– 20 developers, 3 months, $120k per FTE
– 3 FPs per staff month; 180 FPs delivered
– $3,333/FP cost
– 2 critical violations per FP, $500 per fix
– Cost for 360 fixes = $180k
– Total Cost to Own = $780k, or $4,333/FP of TCO

Project B is 25% more productive on delivery cost. However, once the cost of fixing critical violations is included, Project A is 13.4% more productive on total cost to own.
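The slide's arithmetic can be reproduced directly; only the figures stated above are used:

```python
# Reproducing the slide's arithmetic: cost per delivered function point
# with and without the downstream cost of fixing critical violations.
def cost_per_fp(devs, months, annual_fte_cost, fps, violations_per_fp, fix_cost):
    dev_cost = devs * months * (annual_fte_cost / 12)   # delivery cost
    fix_total = fps * violations_per_fp * fix_cost      # post-release fixes
    return dev_cost / fps, (dev_cost + fix_total) / fps

b_dev, b_tco = cost_per_fp(20, 3, 120_000, 240, 5, 500)
a_dev, a_tco = cost_per_fp(20, 3, 120_000, 180, 2, 500)

print(b_dev, b_tco)                  # 2500.0 5000.0
print(round(a_dev), round(a_tco))    # 3333 4333
```

The reversal comes entirely from the violation density: quality-adjusted cost per FP flips the ranking of the two teams.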

Quality-Adjusted Productivity

Must add future effort to fix bugs into productivity calculations: corrective effort in future releases for defects injected in this release.

Inputs: Automated Enhancement Points (size of both functional and non-functional code segments), Automated Technical Debt, effort & cost.

Quality-adjusted productivity feeds estimation, benchmarks, value & ROI, etc.

Best Practices for Analysis

1) Evaluate demographics
2) Segment baselines
3) Beware sampling effects
4) Understand variation
5) Investigate distributions
6) Account for maturity effects
7) Beware external data sets

1. Evaluate Demographics

[Chart: productivity in Function Points varies by application demographic: routine reports, COBOL apps, web-based apps, custom ERP, innovative mobile]

2. Segment Baselines

                         Year   Projects   Productivity
Total Corporate          1981      28          2342
                         1980      21          1939
Telecommunications       1981      14          1811
                         1980      12          1458
Engineering & Defense    1981       8          2965
                         1980       6          2739
Business Applications    1981       6          3054
                         1980       3          1813

Multiple baselines are usually the most valid.

3. Beware Sampling Effects

[Chart: lines of code per person-year (0-2,500) for baselines taken in 1980, 1981, and 1982; a baseline dominated by lots of small projects differs from one dominated by lots of large projects]

4. Understand Variation

Product factors (R² = .16): timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size

Project factors (R² = .49): concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation

Full model R² = .65; 35% of the variation is not explained.

J. Vosburgh, B. Curtis et al (1984) Proceedings of ICSE

5. Investigate Distributions

[Chart: relative productivity (% of mean, -100% to +250%) for low, medium, and high usage of software engineering practices]

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution for External Data Sets

[Chart: productivity versus size for an external data set (DACS) and internal data ("Us")]

Check: data definitions, data collection, data validity, data age, demographics

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Measuring Project Progress

[Chart: planned rate of code growth versus actual code growth (0K-300K), Jan-Jun, with the planned completion of coding marked]
"The measures say we are on schedule."

[Chart: the same plot extended; actual code growth continues past the planned completion of coding]
"But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (total, open, closed; 0-150), Jan-Jun]
"And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements affected and person-hours of work or rework from requirements added, changed, and deleted, by month]

Month   Added   Changed   Deleted
Feb       2       20        1
Mar       2       11        1
Apr      45       35        3
May      15       61        1
Jun       1       20        4
Jul      15       15        0
Aug      18       05        2

Phase Defect Removal Model

All defect figures are per KSLOC:

PHASE         Escapes from     Defects    Subtotal   Removal         Defects   Escapes at
              previous phase   injected              effectiveness   removed   phase exit
Rqmts               0            1.2        1.2          0%            0          1.2
Design             1.2          18.0       19.2         76%          14.6         4.6
Code & UT          4.6          15.4       20.0         71%          14.2         5.8
Integration        5.8           0          5.8         67%           3.9         1.9
Sys Test           1.9           0          1.9         58%           1.1         0.8
Operation          0.8

Kan (1995) Metrics and Models in Software Quality Engineering
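The table above can be reproduced with a short loop, assuming (as the figures suggest) that defects escaping a phase are the escapes entering it plus those injected, minus what removal catches; the slide's rounded values hide small differences:

```python
# Reproducing the phase defect-removal cascade: escapes out of a phase =
# (escapes in + injected) × (1 − removal effectiveness). Values per KSLOC.
phases = [
    # (name, defects injected per KSLOC, removal effectiveness)
    ("Rqmts",       1.2,  0.00),
    ("Design",      18.0, 0.76),
    ("Code & UT",   15.4, 0.71),
    ("Integration", 0.0,  0.67),
    ("Sys Test",    0.0,  0.58),
]

escapes = 0.0
for name, injected, removal_eff in phases:
    subtotal = escapes + injected
    escapes = subtotal * (1 - removal_eff)
    print(f"{name}: {escapes:.1f} defects/KSLOC escape")
```

Running the cascade makes the model's leverage obvious: improving design-review effectiveness shrinks every downstream escape count.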

SCRUM Board

Scrum board displays the path to completion for each story by showing task status. Story points are an estimate of days for each story. Agile estimating is by feature (story) rather than by task. 'Done' does not always mean all tasks are finished. Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

• Burndown chart displays story completion by day
• Burndown chart indicates actual versus estimated progress
• Velocity is the rate at which stories are completed
• Velocity indicates sustainable progress
• Velocity results provide one source of historical data for improving estimates
• Story points without the context of productivity factors are dangerous for estimating
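Using velocity as historical data for estimating can be sketched simply; the story-point figures below are made-up illustrations, not data from the deck:

```python
import math

# Forecast remaining sprints from observed velocity (story points per sprint).
# All numbers are illustrative.
completed_per_sprint = [21, 25, 19, 23]      # historical velocity samples
velocity = sum(completed_per_sprint) / len(completed_per_sprint)

backlog_points = 110                          # remaining estimated work
sprints_needed = math.ceil(backlog_points / velocity)
print(velocity, sprints_needed)               # 22.0 points/sprint, 5 sprints
```

The forecast is only as good as the stability of the team and its productivity factors, which is the caution in the last bullet above.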

Team Velocity: Analyzing Velocity

[Charts over days 1–20: story points per hour; hours per story point (Hrs/StryPt); item cycle times; stories, hours, and Hrs/StryPt combined; burndown chart]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in progress, released; days to delivery, with outliers marked]

Anderson (2010), Kanban

Trends Over Sprints

[Charts across sprints 1–12: story points delivered; technical debt]

• Track and correlate multiple measures across sprints
• Predictive models

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: current process, time recording, defect recording, defect type standards
• PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning
• PSP 1: size estimating, test report
• PSP 1.1: task planning, schedule planning

Personal Quality Management
• PSP 2: code reviews, design reviews
• PSP 2.1: design templates

Cyclic Personal Process
• PSP 3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals. R²s for combined individual estimates range from .70 to .87, versus R² = .53 for a single team-level estimate.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defects found by activity: personal reviews 33, compile 33, team reviews 19, unit test 15, test 14.

Fagan (2005)

TSP Benefits

Dramatic reductions in:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5: Optimizing – innovation management
Level 4: Quantitatively Managed – capability management
Level 3: Defined – process management
Level 2: Managed – project management
Level 1: Initial – inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5: Innovating – innovation management
Level 4: Optimized – capability management
Level 3: Standardized – organizational management
Level 2: Stabilized – work unit management
Level 1: Initial – inconsistent management

Maturity Level Transformation

Level 5: Proactive, opportunistic improvements – empowered, agile culture
Level 4: Process and results managed statistically – precision culture
Level 3: Organization develops standard processes – common culture
Level 2: Project managers stabilize projects – local culture
Level 1: Ad hoc, inconsistent development practices – relationship culture

Management focus widens from individuals to projects to the organization, and trust grows, as maturity increases.

Measurement by Level

Level 5 — Exploratory data: Δt = N × ROI
Level 4 — Predictive data: X̄ ± 1.96σ/√n
Level 3 — Process data: 8.3 defects per review
Level 2 — Management data: Effort = α·LOC^β
Level 1 — Haphazard data: "about a million lines"

A measure–maturity mismatch slows improvement and creates dysfunction.
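Reading the Level 4 formula as a 95% confidence interval, X̄ ± 1.96σ/√n, and the Level 2 formula as the power law Effort = α·LOC^β, both are easy to sketch; every numeric value below is illustrative, not calibrated:

```python
import math

# Level 4 predictive data: 95% confidence interval around a process mean.
mean, sigma, n = 8.3, 2.1, 36            # e.g. defects per review (illustrative)
half_width = 1.96 * sigma / math.sqrt(n)
ci = (mean - half_width, mean + half_width)

# Level 2 management data: effort as a power law of size (COCOMO-style).
alpha, beta = 3.2, 1.05                  # illustrative calibration constants
kloc = 50
effort_pm = alpha * kloc ** beta         # person-months
print(ci, round(effort_pm, 1))
```

The point of the slide stands either way: each maturity level can only support measures whose statistical assumptions its process data actually meets.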

Average Improvement Results

Improvement benefit          Orgs   Annual median   Annual range
Productivity growth            4         35%            9–67%
Pre-test defect detection      3         22%            6–25%
Time to market                 2        ↓19%           15–23%
Field error reports            5        ↓39%           10–94%
Return on investment           5        5.0:1         4:1–8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed ÷ budgeted cost of work scheduled; below 1 = behind, above 1 = ahead) by CMM maturity level 1–3]

Air Force study of contractors. Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (budgeted cost of work performed ÷ actual cost of work performed; overrun vs. underrun) by CMM maturity level 1–3]

Air Force study of contractors. Flowe & Thordahl (1994)

Crosby's Cost of Quality Model

Project cost = performance cost + cost of quality (conformance + nonconformance).

Performance: generating plans and documentation; development (requirements, design, code, integration).

Conformance – Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits.

Conformance – Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free

Raytheon's Cost of Quality

• Performance – cost of building it right the first time
• Nonconformance – cost of rework
• Appraisal – cost of testing
• Prevention – cost of preventing nonconformance

Year   Level   Perform   Nonconf   Appraise   Prevent
1988     1       34%       41%       15%         7%
1990     2       55%       18%       15%        12%
1992     3       66%       11%         —          —
1994     4       76%        6%       7.5%         —

(By 1992, appraisal plus prevention totaled 23%; by 1994, 18%.)

Dion (1993); Haley et al. (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988–1994]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988–1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual vs. budgeted cost (percent over-run/under-run, -10% to 50%) by quarter, 1988–1994]

Haley (1995)

OMRON's Reuse Gains

[Chart 1987–1994: program size in steps (0–3,000) and % of reuse (0–100%)]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." – Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality
2. Compliance focus undermines CMMI's primary assumption
3. Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI
4. The assumptions underlying control charts are violated by software data

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for Six Sigma (6σ)

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first huge benefits, then large, good, okay, and finally small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program... so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded...testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic...non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability...The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

Jackson, D. (2009); Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality measurements: transferability, changeability, robustness, performance, security.

Detected violations (examples):
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, Javascript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, Universal Analyzer for other languages

Modern Apps are a Technology Stack

[Diagram: architectural compliance, data flow, and transaction risk across a stack of technologies: Java/J2EE, EJB, JSP, web services, .NET, ASP.NET, COBOL, PL/SQL, T-SQL, Oracle, SQL Server, DB2, Sybase, IMS, Hibernate, Spring, Struts, messaging, APIs]

Level 1 – Unit (developer level): single language/technology layer; code style and layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2 – Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3 – System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function point and effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Require 20x as many fixes to correct
• Architecturally complex defects are 8% of total app defects but 52% of total repair effort; code unit-level violations are the other 92% of defects and 48% of effort

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system; it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated in the system; it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

• Customers care most about the dependability of their transactions
• The risk of a transaction is determined by the number and severity of violations along its path
• Transactions can be ranked by risk to prioritize their remediation
• The Transaction Risk Index can be further tuned using the operational frequency of each transaction

[Diagram: transactions 1–5 entering and exiting in the user interface layer, processed in the business logic layer, and fulfilled in the data storage layer; transaction 1 poses the greatest risk]
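A transaction risk index of this kind can be sketched as a severity sum over the violations on a transaction's path, weighted by how often the transaction runs. The violation IDs, severities, and frequencies below are hypothetical:

```python
# Rank transactions by the severity of violations along their path,
# tuned by operational frequency. All data is hypothetical.
violations = {148: 9, 371: 9, 62: 8, 284: 7}   # violation id -> severity

transactions = {
    # transaction -> (violation ids on its path, daily executions)
    "t1": ([148, 284, 371], 5000),
    "t2": ([62], 12000),
    "t3": ([284, 62], 300),
}

def risk_index(path_violations, frequency):
    return sum(violations[v] for v in path_violations) * frequency

ranked = sorted(transactions,
                key=lambda t: risk_index(*transactions[t]),
                reverse=True)
print(ranked)  # highest-risk transaction first: ['t1', 't2', 't3']
```

Remediation effort then starts at the top of the ranked list, which is the prioritization idea the slide describes.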

Risk-Based Prioritization

Violations are ranked by health factor, severity, propagated risk index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3

Static Analysis in US DoD

• Testing finds less than 13% of structural defects
• Static analysis before testing is 85% efficient
• With testing and static analysis combined, test schedules are reduced by 50%

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents vs. Technical Quality Index (100–400), with annual losses labeled at $2,000,000+, $50,000, and $5,000]

Large international investment bank; business-critical applications.

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors:

Factor        # apps   Robust   Security   Perform   Change   Transfer
Technology     1606      20        3          16        26        7
  (Java-EE)
Industry        677       5        5           7         4        6
App type        510       1        1           1         1        1
Source          580       1        1           4
Shore           540
Maturity         72      28       25          15        24       12
Method          208      10        6          14         6
Team size       186       7        5           8         8
# of users      262       4        3           5         4

2017 CRASH Report; request from info@castsoftware.com

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. It arises from structural quality problems in production code, and carries business risk as well.

• Principal — cost of fixing problems remaining in the code after release that must be remediated
• Interest — continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Liability — business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost — benefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
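The principal can be approximated from violation counts per severity band. The fix fractions, hours per fix, and labor rate below are illustrative assumptions, not the calibration published by Curtis et al.:

```python
# Technical-debt principal ≈ Σ (violations to fix × hours per fix × $/hour),
# computed per severity band. All parameters are illustrative.
bands = {
    # severity: (violation count, fraction judged worth fixing, hours per fix)
    "high":   (120, 0.50, 2.5),
    "medium": (460, 0.25, 1.0),
    "low":    (910, 0.10, 0.5),
}
rate = 75.0  # blended $ per hour

principal = sum(count * frac * hours * rate
                for count, frac, hours in bands.values())
print(round(principal, 2))  # 23287.5
```

Varying the fix fractions and rate is how such an estimate is tuned to a particular organization's remediation practice.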

Tech Debt by Quality Factor

Transferability 40%, changeability 30%, robustness 18%, security 7%.

• 70% of technical debt is in IT cost (transferability, changeability)
• 30% of technical debt is in business risk (robustness, performance, security)

Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

Section: References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.
Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.
Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.
Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.
Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.
Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.
Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.
Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.
Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.
Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.
Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.
Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.
Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.
Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.
George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.
Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.
Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.
Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).
Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.
Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.
Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.
Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.
Peterman, W. (2000). IEEE Software, July/August issue.
Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.
Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.
Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.
Spinellis, D. (2006). Code Quality. Addison-Wesley.
Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort — Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section: References
  • References 1
  • References 2
  • References 3


Month   Requirements Added   Requirements Changed   Requirements Deleted
Feb             2                    20                      1
Mar             2                    11                      1
Apr            45                    35                      3
May            15                    61                      1
Jun             1                    20                      4
Jul            15                    15                      0
Aug            18                    05                      2

Productivity Analysis Objectives

Productivity analysis serves four objectives: estimation, benchmarking, improvement, and managing vendors.

Productivity Analysis Measures

Primary measures:
• Size – instructions, functions, requirements
• Effort – hours, roles, phases

Adjustment measures:
• Quality – functional, structural, behavioral
• Demographics – application, project, organization

Software Productivity

Software Productivity = size of software produced ÷ total effort expended to produce it

Release Productivity = size of software developed, deleted, or modified ÷ total effort expended on the release

Software Size Measures

Instructions (lines of code): most frequently used. Different definitions of a line can cause counts to vary by 10x. Smaller programs often accomplish the same functionality with higher-quality coding.

Functions (function points): popular in IT. Several counting schemes (IFPUG, NESMA, Mark II, COSMIC, etc.). Manual counting is expensive and subjective; certified counters can differ by 10%. Automated FPs taking root.

Requirements-based (use case points, story points): use case points have not become widely used and need more development. Story points are subjective to each team and are susceptible to several forms of bias.

How Many Lines of Code?

    #define LOWER 0    /* lower limit of table */
    #define UPPER 300  /* upper limit */
    #define STEP 20    /* step size */

    main()  /* print a Fahrenheit-Celsius conversion table */
    {
        int fahr;
        for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
            printf("%4d %6.1f\n", fahr, (5.0/9.0)*(fahr-32));
    }

[Histogram: number of votes for each answer; counts ranged from 1 to 12 lines of code]

Park (1992)

Function Point Estimation

[Scatter plot: unadjusted function points vs. EIs + EOs; R² = .95, y = 7.79x + 43.50]

Functional view of software: entries, exits, reads, and writes between users, data stores, and the software.

• Functional size can be estimated from external inputs and outputs
• Upfront functional analysis provides a basis for good estimates
• A repository of FP data provides a basis for calibrating estimates

Ebert & Dumke (2007), Software Measurement, p. 188
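Reading the extracted regression coefficients as 7.79 and 43.50, an early size estimate follows directly from counted inputs and outputs:

```python
# Estimate unadjusted function points from counted external inputs + outputs,
# using the regression reported by Ebert & Dumke: UFP = 7.79·(EIs + EOs) + 43.50.
# (Coefficients as read from the chart; treat them as that data set's calibration.)
def estimate_ufp(eis: int, eos: int) -> float:
    return 7.79 * (eis + eos) + 43.50

print(estimate_ufp(40, 60))   # 100 inputs+outputs gives roughly 822.5 UFP
```

With R² = .95 on the underlying data set, an estimate like this is useful early, but should be recalibrated against a local repository of FP data.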

Effort — Weakest Measure

Effort data is unreliable and inconsistent:
• After-the-fact estimates: memory lapses, time-splicing, inconsistency
• Under-reporting: contract issues, HR issues, impressions
• Lack of normalization: roles included, phases included, hours in a person-year

How Quality Affects Productivity

Assumption: productivity is a stable number. Reality: productivity is unstable, tending to decline. Incremental increases in technical debt drive a continuing decrease in productivity below the original baseline, unless you take action.

Carry-forward Rework

Release N: develop N, rework N. Unfixed defects from release N carry forward.
Release N+1: develop N+1, rework N+1, plus rework N+0 on the unfixed defects from release N.
Release N+2: develop N+2, rework N+2, rework N+1, and rework N+0; unfixed defects from releases N and N+1 continue to carry forward.

Example of Quality Impact

Project A (Plodders): 20 developers, 3 months; $120k per FTE; 3 FPs per staff-month; 180 FPs delivered, so $3,333/FP cost. 2 critical violations per FP at $500 per fix: cost for 360 fixes = $180k. Total cost to own = $780k, or $4,333/FP of TCO.

Project B (Better, Faster, Cheaper): 20 developers, 3 months; $120k per FTE; 4 FPs per staff-month; 240 FPs delivered, so $2,500/FP cost. 5 critical violations per FP at $500 per fix: cost for 1,200 fixes = $600k. Total cost to own = $1,200k, or $5,000/FP of TCO.

Project B is 25% more productive on delivery cost. However, Project A is 13.4% more productive on total cost of ownership.
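The arithmetic in this comparison is easy to reproduce:

```python
# Total cost of ownership per delivered function point, for the two projects above.
def tco_per_fp(fps_delivered, violations_per_fp, cost_per_fix, dev_cost):
    fix_cost = fps_delivered * violations_per_fp * cost_per_fix
    return (dev_cost + fix_cost) / fps_delivered

dev_cost = 20 * (3 / 12) * 120_000     # 20 FTEs for 3 months at $120k/yr = $600k

a = tco_per_fp(180, 2, 500, dev_cost)  # Project A: about $4,333/FP
b = tco_per_fp(240, 5, 500, dev_cost)  # Project B: $5,000/FP

# About 13.3% by this arithmetic; the slide's 13.4% uses the rounded $4,333 figure.
print(round(a), round(b), round((b - a) / b * 100, 1))
```

The point survives the rounding either way: the apparently faster project is the more expensive one once carry-forward fix costs are counted.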

Quality-Adjusted Productivity

You must add future effort to fix bugs into productivity calculations. Quality-adjusted productivity combines size (automated enhancement points, covering both functional and non-functional code segments), effort and cost, and automated technical debt: the corrective effort in future releases for defects injected in this release. The result feeds estimation, benchmarks, value and ROI analyses, etc.

Best Practices for Analysis

1) Evaluate demographics
2) Segment baselines
3) Beware sampling effects
4) Understand variation
5) Investigate distributions
6) Account for maturity effects
7) Caution for external data sets

1. Evaluate Demographics

[Chart: productivity vs. function points for routine reports, COBOL apps, web-based apps, custom ERP, and innovative mobile apps]

2. Segment Baselines

Segment                  Year   Projects   Productivity
Total Corporate          1981      28         2342
                         1980      21         1939
Telecommunications       1981      14         1811
                         1980      12         1458
Engineering & Defense    1981       8         2965
                         1980       6         2739
Business Applications    1981       6         3054
                         1980       3         1813

Multiple baselines are usually the most valid.

3. Beware Sampling Effects

[Chart: lines of code per person-year (0–2,500) across 1980–1982 baselines; a baseline with lots of small projects differs sharply from one with lots of large projects]

4. Understand Variation

Product factors (R² = .16): timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size.

Project factors (R² = .49): concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation.

Full model R² = .65; 35% of the variation is not explained.

Vosburgh, J., Curtis, B., et al. (1984), Proceedings of ICSE

5. Investigate Distributions

[Chart: relative productivity (% of mean, -100% to +250%) at low, medium, and high software engineering practices usage]

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution for External Data Sets

[Chart: productivity vs. size, external (DACS) data set vs. ours]

Before comparing against an external data set, check its data definitions, data collection, data validity, data age, and demographics.

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Click to edit Master title style Measuring Project Progress

300K 250K 200K 150K 100K 50K 0K

Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoThe measures say we are on schedulerdquo

300K

250K

200K

150K

100K

50K

0K Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoBut the code keeps growing We should have measured the requirements growthrdquo

50

100

150

Defect reports

Total

Open

Closed

125

75

25

Jan Feb Mar Apr May Jun

Completion of coding

ldquoAnd all the defect reports remaining to be fixed will keep us far behind schedulerdquo

Click to edit Master title style Requirements Change Impact

Number of requirements

affected

Chart6

Requirements Added
Requirements Changed
Requirements Deleted
Person hours of work or rework
2
20
1
2
11
1
45
35
3
15
61
1
1
20
4
15
15
0
18
05
2

Sheet1

Sheet1

Requirements Added
Requirements Changed
Requirements Deleted
Person hours of work or rework

Sheet2

Sheet3

Phase Defect Removal Model

PHASE         Escapes previous   Defect injection   Subtotal   Removal          Defect removal   Escapes at phase
              phase /KSLOC       /KSLOC             /KSLOC     effectiveness    /KSLOC           exit /KSLOC
Rqmts               0                 1.2              1.2          0%                0                1.2
Design              1.2              18.0             19.2         76%               14.6              4.6
Code & UT           4.6              15.4             20.0         71%               14.2              5.8
Integration         5.8               0                5.8         67%                3.9              1.9
Sys Test            1.9               0                1.9         58%                1.1              0.8
Operation           0.8

Kan (1995), Metrics and Models in Software Quality Engineering
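The removal arithmetic in the table can be replayed in a few lines; a sketch using the per-KSLOC figures from the slide:

```python
# Defect flow per phase (defects/KSLOC): escapes from the previous phase plus
# newly injected defects form the subtotal; removal effectiveness takes its
# share; the remainder escapes into the next phase.
phases = [
    # (name, injected defects/KSLOC, removal effectiveness)
    ("Rqmts",       1.2, 0.00),
    ("Design",     18.0, 0.76),
    ("Code & UT",  15.4, 0.71),
    ("Integration", 0.0, 0.67),
    ("Sys Test",    0.0, 0.58),
]

escapes = 0.0
for name, injected, effectiveness in phases:
    subtotal = escapes + injected
    removed = subtotal * effectiveness
    escapes = subtotal - removed
    print(f"{name:12s} subtotal {subtotal:5.1f}  removed {removed:5.1f}  escapes {escapes:4.1f}")

# Roughly 0.8 defects/KSLOC escape into operation, matching the table.
```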

SCRUM Board

The Scrum board displays the path to completion for each story by showing task status.
Story points are an estimate of days for each story.
Agile estimating is by feature (story) rather than by task.
'Done' does not always mean all tasks are finished.
Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating
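Velocity itself is simple arithmetic over the stories a team completes; a minimal sketch (the story list is made up for illustration):

```python
# (story points, completed?) for each story in a 10-day sprint
stories = [(3, True), (5, True), (2, False), (8, True), (1, True)]

completed_points = sum(points for points, done in stories if done)
sprint_days = 10
velocity = completed_points / sprint_days  # story points per day

print(completed_points, velocity)  # 17 1.7
```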

Analyzing Velocity: Team Velocity

[Charts: story points per hour by day (days 1-20); hours per story point by day (days 1-20); item cycle times; stories, hours, and hours per story point (0-60); burndown chart over 20 days]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in progress, and released items over time; days to delivery shown with outliers]

Anderson (2010), Kanban

Trends Over Sprints

[Charts: story points delivered (0-80) and technical debt (0-4) across sprints 1-12]

Track and correlate multiple measures across sprints.
Predictive models.

Sources and Measures

Measures:
Stories submitted
Stories pulled back
Growth in size of stories
Stories added mid-sprint
Stories under review
Failed tests
Non-development tasks
Assists to other projects

Data sources:
Project tracking
Code control
Bug tracking
Build system
Test environment
Deployment tools
Operational monitoring

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
  PSP 0: current process, time recording, defect recording, defect type standards
  PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning
  PSP 1: size estimating, test report
  PSP 1.1: task planning, schedule planning

Personal Quality Management
  PSP 2: code reviews, design reviews
  PSP 2.1: design templates

Cyclic Personal Process
  PSP 3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

R²s for combined individual estimates range from .70 to .87; for a single team estimate, R² = .53.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Personal reviews: 33. Compile: 33. Team reviews: 19. Unit test: 15. Test: 14.

Fagan (2005)

TSP Benefits

Dramatic reductions:
- Delivered defects
- Days per KLOC
- Schedule deviation
- Test defects/KLOC
- Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5, Optimizing: innovation management
Level 4, Quantitatively Managed: capability management
Level 3, Defined: process management
Level 2, Managed: project management
Level 1, Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5, Innovating: innovation management
Level 4, Optimized: capability management
Level 3, Standardized: organizational management
Level 2, Stabilized: work unit management
Level 1, Initial: inconsistent management

Maturity Level Transformation

Level 5: Proactive, opportunistic improvements; empowered, agile culture
Level 4: Process and results managed statistically; precision culture
Level 3: Organization develops standard processes; common culture
Level 2: Project managers stabilize projects; local culture
Level 1: Ad hoc, inconsistent development practices; relationship culture

The scope of change grows from individual to projects to organization as trust develops.

Measurement by Level

Level 5 (exploratory data): Δt = N × ROI
Level 4 (predictive data): X̄ ± 1.96σ/√n
Level 3 (process data): 8.3 defects per review
Level 2 (management data): Effort = α × LOC^β
Level 1 (haphazard data): "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
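The Level 4 entry is the familiar 95% confidence interval for a mean; a quick sketch with illustrative review data (not from the slides):

```python
import math

def ci95(values):
    """95% confidence interval for the mean: mean ± 1.96 * sigma / sqrt(n)."""
    n = len(values)
    mean = sum(values) / n
    sigma = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    half = 1.96 * sigma / math.sqrt(n)
    return mean - half, mean + half

# e.g., defects found in each of eight peer reviews
low, high = ci95([6, 9, 7, 10, 8, 9, 7, 11])
```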

Average Improvement Results

Improvement benefit          Orgs   Annual median   Annual range
Productivity growth            4        35%            9-67%
Pre-test defect detection      3        22%            6-25%
Time to market                 2       ↓19%           15-23%
Field error reports            5       ↓39%           10-94%
Return on investment           5       5.0:1          4:1-8.8:1

Schedule Performance

[Chart: schedule performance index (0 to 1.6) by CMM maturity level 1-3; index = budgeted cost of work performed / budgeted cost of work scheduled; below 1.0 is behind schedule, above 1.0 is ahead]

Air Force study of contractors. Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (0 to 2.5) by CMM maturity level 1-3; index = budgeted cost of work performed / actual cost of work performed; below 1.0 is an overrun, above 1.0 an underrun]

Air Force study of contractors. Flowe & Thordahl (1994)
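Both indexes are plain earned-value ratios; a sketch with made-up dollar figures:

```python
def schedule_performance_index(bcwp, bcws):
    """SPI: budgeted cost of work performed / budgeted cost of work scheduled."""
    return bcwp / bcws

def cost_performance_index(bcwp, acwp):
    """CPI: budgeted cost of work performed / actual cost of work performed."""
    return bcwp / acwp

spi = schedule_performance_index(900_000, 1_000_000)  # 0.9 -> behind schedule
cpi = cost_performance_index(900_000, 1_200_000)      # 0.75 -> cost overrun
```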

Crosby's Cost of Quality Model

Project cost = performance + cost of quality.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Cost of quality = conformance (appraisal + prevention) + nonconformance.

Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits.

Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   Level   Perform   Nonconf   Appraise + Prevent
1988     1       34%       41%       22% (15% + 7%)
1990     2       55%       18%       27% (15% + 12%)
1992     3       66%       11%       23%
1994     4       76%        6%       18%

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline (0 to 4x), 1988-1994]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline (0 to 1.2), 1988-1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual vs. budgeted cost by quarter, 1988-1994; overruns of up to 50% converge toward zero]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0-3,000) and % of reuse (0-100%), 1987-1994]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. Compliance focus undermines CMMI's primary assumption.
3. Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus: process improvement (Six Sigma).
Product focus: product improvement (Design for 6σ).

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, large benefits, good benefits, okay benefits, small benefits... now what?

Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link." - Jackson, D. (2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." - Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality measurements: transferability, changeability, robustness, performance, security.

Detected violations (examples by health factor):
Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table.
Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed.
Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string.
Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention.
Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity.

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, Javascript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, Universal Analyzer for other languages.

Modern Apps are a Technology Stack

Level 1, Unit: a single language/technology layer. Code style and layout, expression complexity, code documentation, class or program design, basic coding standards. Developer level.

Level 2, Technology: intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities. Development team level.

Level 3, System: integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function points, effort estimation, data access control, SDK versioning, calibration across technologies. IT organization level.

[Diagram: architectural compliance, data flow, and transaction risk traced across a stack of technologies: JSP, ASP.NET, APIs, Java, web services, EJB, Spring, Struts, Hibernate, .NET, COBOL, messaging, and databases (Oracle PL/SQL, SQL Server T-SQL, DB2, Sybase, IMS)]

Architecturally Complex Defects

An architecturally complex defect is a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct. They account for 8% of total application defects but 52% of total repair effort (code unit-level violations account for the other 92% and 48%).

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 342 in Component A is only narrowly propagated in the system; it presents low risk. The impact of violation 284 in Component B is very widely propagated throughout the system; it presents very high risk.

Remediating violation 284 in Component B will have the greater impact on reducing risk and cost.

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path. Transactions can be ranked by risk to prioritize their remediation. The Transaction Risk Index can be further tuned using the operational frequency of each transaction.

[Diagram: five transactions traced from entry/exit in the user interface layer through processing in the business logic layer to fulfillment in the data storage layer; transaction 1 poses the greatest risk]

Risk-Based Prioritization

Violation   Health factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3

Violations are ordered by severity and Propagated Risk Index, weighted by whether they lie in a high-risk transaction and by the frequency of path execution.
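A prioritization like this amounts to sorting violations by severity weighted by propagation and transaction risk; a sketch with invented weights (the slides do not publish a formula):

```python
violations = [
    # (id, health factor, severity, propagated-risk weight, in high-risk transaction?)
    ("148", "Security",    9, 0.6, True),
    ("371", "Robustness",  9, 0.4, False),
    ("284", "Robustness",  7, 0.9, True),
    ("062", "Performance", 8, 0.2, False),
]

def risk_key(violation):
    _id, _factor, severity, propagation, hot = violation
    # double the score for violations lying in a high-risk transaction
    return severity * propagation * (2.0 if hot else 1.0)

ranked = sorted(violations, key=risk_key, reverse=True)
print([v[0] for v in ranked])  # ['284', '148', '371', '062']
```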

Static Analysis in US DoD

Testing finds less than 1/3 of structural defects.
Static analysis before testing is 85% efficient.
With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0-180) vs. Technical Quality Index (100-400) for business-critical applications at a large international investment bank; annual losses range from over $2,000,000 at low TQI down to $50,000 and $5,000 at higher TQI]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor                  # apps   Robust   Security   Perform   Change   Transfer
Technology (Java-EE)     1606      20        3          16        26        7
Industry                  677       5        5           7         4        6
App type                  510       1        1           1         1        1
Source                    580       1        -           -         1        4
Shore                     540       -        -           -         -        -
Maturity                   72      28       25          15        24       12
Method                    208      10        6           -        14        6
Team size                 186       7        5           -         8        8
# of users                262       4        3           -         5        4

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code constitute the debt.

Principal: the cost of fixing problems remaining in the code after release that must be remediated.

Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business risk (liability from the debt):
Liability: business costs related to outages, breaches, corrupted data, etc.
Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
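The principal can be approximated mechanically from violation counts; a back-of-the-envelope sketch in which the fraction fixed, hours per fix, and hourly rate are illustrative assumptions, not CISQ figures:

```python
def debt_principal(violations, fraction_to_fix=0.5, hours_per_fix=1.0, rate=75.0):
    """Principal: cost of fixing the violations that must be remediated."""
    return violations * fraction_to_fix * hours_per_fix * rate

print(debt_principal(10_000))  # 375000.0 (dollars)
```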

Tech Debt by Quality Factor

Transferability: 40%
Changeability: 30%
Robustness: 18%
Security: 7%

70% of technical debt is in IT cost (transferability, changeability); 30% is in business risk (robustness, performance, security).

Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

Section: References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI: Benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright Patterson Air Force Base.

References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice Hall.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2


Productivity Analysis Measures

Primary measures:
Size: instructions, functions, requirements
Effort: hours, roles, phases

Adjustment measures:
Quality: functional, structural, behavioral
Demographics: application, project, organization

Software Productivity

Software Productivity = size of software produced / total effort expended to produce it

Release Productivity = size of software developed, deleted, or modified / total effort expended on the release

Software Size Measures

Instructions: lines of code. Most frequently used. Different definitions of a line can cause counts to vary by 10x. Smaller programs often accomplish the same functionality with higher quality coding.

Requirements-based: Use Case Points, Story Points. Use Case Points have not become widely used and need more development. Story points are subjective to each team and are susceptible to several forms of bias.

Functions: Function Points. Popular in IT. Several counting schemes (IFPUG, NESMA, Mark II, COSMIC, etc.). Manual counting is expensive and subjective: certified counters can differ by 10%. Automated FPs taking root.

How Many Lines of Code?

    #define LOWER 0    /* lower limit of table */
    #define UPPER 300  /* upper limit */
    #define STEP  20   /* step size */

    main() /* print a Fahrenheit-Celsius conversion table */
    {
        int fahr;
        for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
            printf("%4d %6.1f\n", fahr, (5.0/9.0)*(fahr-32));
    }

[Histogram: number of votes for line counts from 1 to 12 for this program]

B. Park (1992)

Function Point Estimation

[Scatter plot: unadjusted function points (0-8,000) vs. EIs + EOs (0-1,000); R² = .95, y = 7.79x + 43.50]

[Functional view of software: entries and exits between the software and its users; reads and writes to a data store]

Functional size can be estimated from external inputs and outputs.
Upfront functional analysis provides a basis for good estimates.
A repository of FP data provides a basis for calibrating estimates.

Ebert & Dumke (2007), Software Measurement, p. 188
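A regression like the one on the slide is an ordinary least-squares fit; a sketch with synthetic, collinear data (Ebert & Dumke report slope 7.79 and intercept 43.50 on real project data):

```python
def ols(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# estimate unadjusted function points from external inputs + outputs
slope, intercept = ols([100, 300, 500, 900], [820, 2380, 3940, 7060])
print(slope, intercept)  # 7.8 40.0
```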

Effort: the Weakest Measure

Effort data is unreliable and inconsistent:

After-the-fact estimates: memory lapses, time-splicing, inconsistency.
Under-reporting: contract issues, HR issues, impressions.
Lack of normalization: roles included, phases included, hours in a person-year.

How Quality Affects Productivity

Assumption: productivity is a stable number.
Reality: productivity is unstable, tending to decline.

[Chart: the original productivity baseline eroded by incremental increases in technical debt, producing a continuing decrease in productivity unless you take action]

Carry-forward Rework

Release N: Develop N + Rework N. Unfixed defects from release N carry forward.
Release N+1: Develop N+1 + Rework N+1 + Rework N+0. Unfixed defects from releases N and N+1 carry forward.
Release N+2: Develop N+2 + Rework N+2 + Rework N+1 + Rework N+0.

Each release's effort is increasingly consumed by rework on defects carried forward from earlier releases.
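The compounding effect can be shown with a toy model in which every prior release leaves behind rework costing a fixed share of a release's effort (all rates are illustrative assumptions):

```python
def capacity_for_new_work(releases, rework_pct=15, carried_releases=3):
    """Percent of each release's effort left for new development when each
    prior release (up to carried_releases of them) carries forward rework
    costing rework_pct of a release's effort."""
    return [100 - rework_pct * min(n, carried_releases) for n in range(releases)]

print(capacity_for_new_work(4))  # [100, 85, 70, 55]
```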

Example of Quality Impact

Project A (Plodders):
- 20 developers, 3 months
- $120k per FTE
- 3 FPs per staff month
- 180 FPs delivered
- $3,333/FP cost
- 2 critical violations per FP
- $500 per fix
- Cost for 360 fixes = $180k
- Total Cost to Own = $780k
- $4,333/FP of TCO

Project B (Better, Faster, Cheaper):
- 20 developers, 3 months
- $120k per FTE
- 4 FPs per staff month
- 240 FPs delivered
- $2,500/FP cost
- 5 critical violations per FP
- $500 per fix
- Cost for 1,200 fixes = $600k
- Total Cost to Own = $1,200k
- $5,000/FP of TCO

On delivery cost, Project B is 25% more productive. However, on total cost to own, Project A is 13.4% more productive.
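The slide's arithmetic can be replayed directly (dollar figures from the slide; the helper name is mine):

```python
def tco_per_fp(devs, months, annual_cost, fp_per_staff_month,
               violations_per_fp, cost_per_fix):
    """Total cost of ownership per function point for one release."""
    staff_months = devs * months
    dev_cost = devs * annual_cost * (months / 12)        # development labor
    fps = fp_per_staff_month * staff_months              # function points delivered
    fix_cost = violations_per_fp * fps * cost_per_fix    # future defect-fix cost
    return (dev_cost + fix_cost) / fps

a = tco_per_fp(20, 3, 120_000, 3, 2, 500)  # Project A: about 4,333 $/FP
b = tco_per_fp(20, 3, 120_000, 4, 5, 500)  # Project B: 5,000 $/FP
```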

Quality-Adjusted Productivity

Must add the future effort to fix bugs into productivity calculations.

Inputs to quality-adjusted productivity:
- Automated Enhancement Points: size of both functional and non-functional code segments
- Effort & cost: corrective effort in future releases for defects injected in this release
- Automated technical debt

Feeds: productivity, estimation, benchmarks, value & ROI, etc.
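One way to read the slide as a formula: charge this release's productivity for the future corrective effort its defects will demand (a sketch, not CAST's published method; the hours-per-future-fix figure is an assumption):

```python
def quality_adjusted_productivity(size_fp, effort_hours, defects_injected,
                                  hours_per_future_fix=4.0):
    """Function points per hour, including future rework created by this release."""
    total_effort = effort_hours + defects_injected * hours_per_future_fix
    return size_fp / total_effort

print(round(quality_adjusted_productivity(240, 9600, 1200), 4))  # 0.0167
```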

Best Practices for Analysis

1) Evaluate demographics
2) Segment baselines
3) Beware sampling effects
4) Understand variation
5) Investigate distributions
6) Account for maturity effects
7) Beware external data sets

1. Evaluate Demographics

[Chart: productivity in function points varies by application demographic: routine reports, COBOL apps, web-based apps, custom ERP, innovative mobile]

2. Segment Baselines

Segment                 Year   Projects   Productivity
Total Corporate         1981      28          2342
                        1980      21          1939
Telecommunications      1981      14          1811
                        1980      12          1458
Engineering & Defense   1981       8          2965
                        1980       6          2739
Business Applications   1981       6          3054
                        1980       3          1813

Multiple baselines are usually the most valid.

3. Beware Sampling Effects

[Chart: lines of code per person-year (0-2,500) across baselines 1980-1982; a baseline with lots of small projects looks more productive than one with lots of large projects]

4. Understand Variation

Product factors (R² = .16): timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size.

Project factors (R² = .49): concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation.

Full model R² = .65: project factors explain 49% of the variation and product factors 16%, leaving 35% unexplained.

J. Vosburgh, B. Curtis, et al. (1984). Proceedings of ICSE 1984.

Click to edit Master title style 5 Investigate Distributions

Low Medium High

Rel

ativ

e p

rod

uct

ivit

y (

of m

ean

)

Software Engineering Practices Usage

-100

-50

0

50

100

150

200

250

Click to edit Master title style 6 Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws

Click to edit Master title style 7 Caution for External Data Sets

26

DACS Us

Productivity

Siz

e Data definitions

Data collection

Data validity

Data age

Demographics

Click to edit Master title style

1) Tracking traditional projects

2) Iterative and Agile measurement

3) Team Software Process measurement

33

Section 2 Measuring to Manage

Software Projects

Measuring Project Progress

[Chart: planned rate of code growth versus actual code growth, 0K to 300K lines, January through June, with the planned completion of coding marked.]

"The measures say we are on schedule."

"But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (total, open, closed), 0 to 150, January through June.]

"And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements affected, by month, with person-hours of work or rework per change.]

Month   Added   Changed   Deleted
Feb     2       20        1
Mar     2       11        1
Apr     45      35        3
May     15      61        1
Jun     1       20        4
Jul     15      15        0
Aug     18      05        2

Phase Defect Removal Model

Defect densities per KSLOC:

Phase         Escapes from     Defects    Subtotal   Removal          Defects    Escapes at
              previous phase   injected              effectiveness    removed    phase exit
Rqmts         0                1.2        1.2        0%               0          1.2
Design        1.2              18.0       19.2       76%              14.6       4.6
Code & UT     4.6              15.4       20.0       71%              14.2       5.8
Integration   5.8              0          5.8        67%              3.9        1.9
Sys Test      1.9              0          1.9        58%              1.1        0.8
Operation     0.8

Kan (1995), Metrics and Models in Software Quality Engineering
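The arithmetic of this phase model can be reproduced in a few lines. A minimal sketch, using the densities from the table (defects per KSLOC) and rounding to one decimal as the table does:

```python
phases = [
    # (phase, defects injected per KSLOC, removal effectiveness)
    ("Rqmts",       1.2,  0.00),
    ("Design",      18.0, 0.76),
    ("Code & UT",   15.4, 0.71),
    ("Integration", 0.0,  0.67),
    ("Sys Test",    0.0,  0.58),
]

def run_model(phases):
    escapes = 0.0
    rows = []
    for name, injected, effectiveness in phases:
        subtotal = escapes + injected           # escapes from prior phase + new injections
        removed = round(subtotal * effectiveness, 1)
        escapes = round(subtotal - removed, 1)  # defects escaping this phase
        rows.append((name, removed, escapes))
    return rows, escapes                        # final escapes reach operation

rows, escaped_to_operation = run_model(phases)  # escaped_to_operation: 0.8 per KSLOC
```

Feeding historical injection and removal rates through such a model is what lets a project predict the defect density escaping to the field before system test even starts.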

SCRUM Board

A scrum board displays the path to completion for each story by showing task status.

Story points are an estimate of days for each story.

Agile estimating is by feature (story) rather than by task.

'Done' does not always mean all tasks are finished.

Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

The burndown chart displays story completion by day, indicating actual versus estimated progress.

Velocity is the rate at which stories are completed; it indicates sustainable progress and provides one source of historical data for improving estimates.

Story points without the context of productivity factors are dangerous for estimating.
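The velocity idea is simple enough to state in code. A sketch with illustrative numbers (not from the slides): average completed story points per sprint, used to forecast how many sprints the remaining backlog needs:

```python
import math

completed_per_sprint = [21, 19, 24, 20]   # historical story points per sprint

def velocity(history):
    # Average story points completed per sprint.
    return sum(history) / len(history)

def sprints_to_finish(backlog_points, history):
    # Round up: a partial sprint still occupies a whole sprint.
    return math.ceil(backlog_points / velocity(history))

v = velocity(completed_per_sprint)            # 21.0 points per sprint
forecast = sprints_to_finish(100, completed_per_sprint)   # 5 sprints
```

This is exactly the kind of forecast that goes wrong when story points are compared across teams or when productivity factors shift mid-project, which is the warning the slide is making.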

Team Velocity: Analyzing Velocity

[Charts over a 20-day period: story points per hour (0 to 5, by day); hours per story point; item cycle times (0 to 20); and a burndown chart (0 to 60) tracking stories, hours, and hours per story point.]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in progress, and released items over time; the horizontal gap shows days to delivery, with outliers visible.]

Anderson (2010), Kanban
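The cumulative flow diagram connects the three flow quantities through Little's Law: average cycle time equals average work in progress divided by average throughput. A tiny sketch with illustrative numbers:

```python
def avg_cycle_time(wip, throughput_per_day):
    # Little's Law: time in system = work in progress / completion rate.
    return wip / throughput_per_day

# 12 items sitting between "in progress" and "released",
# 3 items released per day, on average:
days_to_delivery = avg_cycle_time(12, 3.0)   # 4.0 days
</n```

This is why Kanban's work-in-progress limits directly shorten delivery time: with throughput held steady, cutting WIP cuts cycle time proportionally.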

Trends Over Sprints

[Charts: story points delivered (0 to 80) across 12 sprints, and technical debt (0 to 4) across the same sprints.]

Track and correlate multiple measures across sprints; build predictive models.

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects.

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring.

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
- PSP0: current process, time recording, defect recording, defect type standards
- PSP0.1: coding standard, size measurement, process improvement proposal

Personal Planning
- PSP1: size estimating, test report
- PSP1.1: task planning, schedule planning

Personal Quality Management
- PSP2: code reviews, design reviews
- PSP2.1: design templates

Cyclic Personal Process
- PSP3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)
- Built from personal processes of team members
- Well-defined team roles
- Project launch workshops for planning
- Team measurement and tracking
- Post-mortems for improvement
- Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

R²s range from .70 to .87 for aggregated individual estimates, versus R² = .53 for a single team estimate.

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defects found by: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%.

Fagan & Davis (2005)

TSP Benefits

Dramatic reductions in:
- Delivered defects
- Days per KLOC
- Schedule deviation
- Test defects/KLOC
- Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5, Optimizing: innovation management
Level 4, Quantitatively Managed: capability management
Level 3, Defined: process management
Level 2, Managed: project management
Level 1, Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5, Innovating: innovation management
Level 4, Optimized: capability management
Level 3, Standardized: organizational management
Level 2, Stabilized: work unit management
Level 1, Initial: inconsistent management

Maturity Level Transformation

Level 5: opportunistic, proactive improvements; empowered, agile culture
Level 4: process and results managed statistically; precision culture
Level 3: organization develops standard processes; common culture
Level 2: project managers stabilize projects; local culture
Level 1: ad hoc, inconsistent development practices; relationship culture

The transformation spans the organization, projects, and individuals, and builds trust.

Measurement by Level

Level 5, exploratory data: Δt = N × ROI
Level 4, predictive data: X̄ ± 1.96σ/√n
Level 3, process data: 8.3 defects per review
Level 2, management data: Effort = α·LOC^β
Level 1, haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
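The Level 4 entry is the familiar 95% confidence interval for a process mean, x̄ ± 1.96σ/√n. A small helper, assuming σ is known and using illustrative values:

```python
import math

def confidence_interval_95(xbar, sigma, n):
    # 95% CI for a mean with known sigma: xbar +/- 1.96 * sigma / sqrt(n).
    half_width = 1.96 * sigma / math.sqrt(n)
    return (xbar - half_width, xbar + half_width)

# E.g., mean of 8.3 defects per review, sigma 2.0, over 16 reviews:
low, high = confidence_interval_95(xbar=8.3, sigma=2.0, n=16)
# half-width = 1.96 * 2.0 / 4 = 0.98, so roughly (7.32, 9.28)
```

A Level 2 organization handed this formula has no stable process for it to describe, which is the measure-maturity mismatch the slide warns about.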

Average Improvement Results

Improvement                 # Orgs   Annual median   Annual range
Productivity growth         4        35%             9-67%
Pre-test defect detection   3        22%             6-25%
Time to market              2        ↓19%            15-23%
Field error reports         5        ↓39%            10-94%
Return on investment        5        5.0:1           4.1:1-8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed ÷ budgeted cost of work scheduled; behind versus ahead) by CMM maturity level 1-3.]

Air Force study of contractors. Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (budgeted cost of work performed ÷ actual cost of work performed; overrun versus underrun) by CMM maturity level 1-3.]

Air Force study of contractors. Flowe & Thordahl (1994)

Crosby's Cost of Quality Model

Project cost = performance cost + cost of quality.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Cost of quality, conformance side:
- Prevention: training; policies, procedures, and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.
- Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits.

Cost of quality, nonconformance side: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free

Raytheon's Cost of Quality
- Performance: cost of building it right the first time
- Nonconformance: cost of rework
- Appraisal: cost of testing
- Prevention: cost of preventing nonconformance

Year   Level   Performance   Nonconformance   Appraisal   Prevention
1988   1       34%           41%              15%         7%
1990   2       55%           18%              15%         12%
1992   3       66%           11%
1994   4       76%           6%               7.5%

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline (scale 0 to 4), 1988-1994.]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline (scale 0 to 1.2), 1988-1994.]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual versus budgeted cost (over-run/under-run, -10% to 50%), by quarter from 1988 through 1994.]

Haley (1995)

OMRON's Reuse Gains

[Chart: steps (0 to 3,000) and % of reuse (0 to 100%), 1987-1994.]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." (Watts Humphrey)

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. Compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus: process improvement, Six Sigma. Product focus: product improvement, Design for 6σ. Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first huge benefits, then large, good, okay, and small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

Jackson, D. (2009); Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality measurements: transferability, changeability, robustness, performance, security.

Detected violations (examples):
- Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
- Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
- Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
- Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
- Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: architectural compliance and data-flow/transaction risk across a stack of technologies: JSP, ASP.NET, APIs, Java, web services, .NET, COBOL, Struts, Spring, Hibernate, EJB, PL/SQL, T-SQL, messaging, and databases (Oracle, SQL Server, DB2, Sybase, IMS).]

Level 1, Unit (developer level): single language/technology layer. Code style and layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2, Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function points, effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

An architecturally complex defect is a structural flaw involving interactions among multiple components that reside in different application layers.

They are the primary cause of operational problems and take up to 20x as many fixes to correct. Architecturally complex defects account for 8% of total application defects (code unit-level violations account for the other 92%), yet 52% of total repair effort (versus 48%).

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system; it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated in the system; it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
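One plausible reading of such an index is severity weighted by the number of components the violation's impact reaches. A hypothetical sketch (the reach counts and the severity of violation 342 are invented for illustration; the severity 7 for violation 284 matches the prioritization table that follows):

```python
violations = {
    # violation id: (severity 1-9, components reached by impact propagation)
    284: (7, 120),   # widely propagated -> high propagated risk
    342: (8, 3),     # narrowly propagated -> low risk despite higher severity
}

def propagated_risk_index(violations):
    # Rank violation ids by severity * breadth of impact, riskiest first.
    scores = {vid: sev * reach for vid, (sev, reach) in violations.items()}
    return sorted(scores, key=scores.get, reverse=True)

ranking = propagated_risk_index(violations)   # [284, 342]
```

The point of the weighting is exactly the slide's: a moderate-severity violation with system-wide reach outranks a severe but isolated one.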

Transaction Risk

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path: from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path. Transactions can be ranked by risk to prioritize their remediation, and the index can be further tuned using the operational frequency of each transaction.

[In the example diagram, transaction 1 poses the greatest risk.]

Risk-Based Prioritization

Rank violations by health factor, severity, propagated risk index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
148         Security          9
371         Robustness        9
062         Performance       8
284         Robustness        7
016         Changeability     7
241         Security          5
173         Transferability   4
359         Transferability   3

Static Analysis in US DoD

- Testing finds less than 1/3 of structural defects
- Static analysis before testing is 85% efficient
- With testing and static analysis combined, test schedules are reduced by 50%

John Keane (2013)

Reducing Operational Incidents and Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0 to 180) versus Technical Quality Index (100 to 400) for business-critical applications at a large international investment bank. Annual losses fall from over $2,000,000 at low quality scores to $50,000 and then $5,000 as the index rises.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor                  # apps   Robust   Security   Perform   Change   Transfer
Technology (e.g. Java EE)  1606     20       3          16        26       7
Industry                677      5        5          7         4        6
App type                510      1        1          1         1        1
Source                  580      1        1          4
Shore                   540
Maturity                72       28       25         15        24       12
Method                  208      10       6          14        6
Team size               186      7        5          8         8
# of users              262      4        3          5         4

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code constitute the debt.

- Principal: the cost of fixing problems remaining in the code after release that must be remediated.
- Interest: continuing IT costs attributable to the violations causing technical debt, i.e. higher maintenance costs, greater resource usage, etc.
- Liability (business risk): business costs related to outages, breaches, corrupted data, etc.
- Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
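A principal estimate of this kind multiplies the violations worth fixing by the cost of fixing each. A sketch with illustrative parameters, loosely following the approach of Curtis et al. (2012); the densities, fix percentages, hours, and hourly rate below are assumptions, not published figures:

```python
# Assumed inputs per severity band (all illustrative):
violations_per_kloc = {"high": 1.0, "medium": 4.0, "low": 10.0}
fraction_to_fix     = {"high": 0.5, "medium": 0.25, "low": 0.1}
hours_per_fix       = {"high": 2.5, "medium": 1.0, "low": 0.5}

def debt_principal(kloc, cost_per_hour=75.0):
    # Principal = sum over severities of
    #   (violations present) * (share worth fixing) * (hours per fix) * (rate).
    total_hours = sum(
        kloc * violations_per_kloc[s] * fraction_to_fix[s] * hours_per_fix[s]
        for s in violations_per_kloc
    )
    return total_hours * cost_per_hour

principal = debt_principal(100)   # 275 hours of fixes at $75/hr = $20,625
```

Changing the fraction-to-fix assumptions is how an organization tunes the estimate to its own remediation policy.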

Tech Debt by Quality Factor

Transferability 40%, changeability 30%, robustness 18%, security 7%.

70% of technical debt is in IT cost (transferability, changeability); 30% is in business risk (robustness, performance, security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI: Benefits from Continuous Software Improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3



Software Productivity

Software Productivity = Size of software produced ÷ Total effort expended to produce it

Release Productivity = Size of software developed, deleted, or modified ÷ Total effort expended on the release
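The two ratios can be written as functions; size here is in function points (any consistent size measure works) and effort in staff-months:

```python
def software_productivity(size_produced, total_effort):
    # Size of software produced / total effort expended to produce it.
    return size_produced / total_effort

def release_productivity(size_developed, size_deleted_or_modified, total_effort):
    # Work spent deleting or modifying existing software also counts
    # as output of the release, not just newly developed size.
    return (size_developed + size_deleted_or_modified) / total_effort
```

For example, 240 function points delivered over 60 staff-months gives a software productivity of 4 FP per staff-month, matching the Project B figures later in this section.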

Software Size Measures

Instructions (lines of code): most frequently used. Different definitions of a line can cause counts to vary by 10x. Smaller programs often accomplish the same functionality with higher-quality coding.

Requirements-based (use case points, story points): use case points have not become widely used and need more development. Story points are subjective to each team and are susceptible to several forms of bias.

Functions (function points): popular in IT. Several counting schemes exist (IFPUG, NESMA, Mark II, COSMIC, etc.). Manual counting is expensive and subjective; certified counters can differ by 10%. Automated FPs are taking root.

How Many Lines of Code?

#define LOWER 0    /* lower limit of table */
#define UPPER 300  /* upper limit */
#define STEP  20   /* step size */

main()  /* print a Fahrenheit-Celsius conversion table */
{
    int fahr;
    for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
        printf("%4d %6.1f\n", fahr, (5.0/9.0)*(fahr-32));
}

[Histogram: number of votes for each answer to how many lines of code this program contains, with answers ranging from about 1 to 12 lines.]

B. Park (1992)

Function Point Estimation

[Scatter plot: unadjusted function points (0 to 8,000) versus external inputs plus external outputs (0 to 1,000), with regression line y = 7.79x + 43.50, R² = .95.]

Functional view of software: entries, exits, reads, and writes between the software and its data store.

- Functional size can be estimated from external inputs and outputs
- Upfront functional analysis provides a basis for good estimates
- A repository of FP data provides a basis for calibrating estimates

Ebert & Dumke (2007), Software Measurement, p. 188
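The regression reported by Ebert & Dumke turns a quick count of external inputs and outputs into a size estimate. A sketch; the coefficients are reconstructed here from the slide's garbled text as UFP ≈ 7.79·(EIs + EOs) + 43.50, so treat them as approximate:

```python
def estimate_ufp(eis_plus_eos):
    # Unadjusted function points from external inputs + outputs,
    # per the linear fit (R^2 = .95) on the slide's scatter plot.
    return 7.79 * eis_plus_eos + 43.50

size = estimate_ufp(100)   # roughly 822 unadjusted function points
```

In practice an organization would refit these coefficients against its own repository of counted applications rather than reuse published ones.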

Effort: the Weakest Measure

Effort data is unreliable and inconsistent:
- After-the-fact estimates: memory lapses, time-splicing, inconsistency
- Under-reporting: contract issues, HR issues, impressions
- Lack of normalization: roles included, phases included, hours in a person-year

How Quality Affects Productivity

Assumption: productivity is a stable number. Reality: productivity is unstable, tending to decline. Starting from the original productivity baseline, incremental increases in technical debt produce a continuing decrease in productivity, unless you take action.

Carry-forward Rework

Release N: develop N, rework N. Unfixed defects from release N carry forward.

Release N+1: develop N+1, rework N+1, rework N+0. Unfixed defects from releases N and N+1 carry forward.

Release N+2: develop N+2, rework N+2, rework N+1, rework N+0.

Each release's development capacity shrinks as rework carried forward from earlier releases accumulates.
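The squeeze on development capacity can be sketched with a toy model. Assume, purely for illustration, that every release leaves behind unfixed defects costing 15% of a release's effort in each later release:

```python
def development_capacity(releases, carry_per_release=0.15):
    # Fraction of effort left for new development in each release,
    # if every prior release adds `carry_per_release` of carried rework.
    capacity = []
    for n in range(releases):
        rework = min(1.0, n * carry_per_release)  # rework owed to releases 0..n-1
        capacity.append(round(1.0 - rework, 2))
    return capacity

development_capacity(4)   # [1.0, 0.85, 0.7, 0.55]
```

Under these assumptions nearly half the team is doing rework by the fourth release, which is the mechanism behind the declining-productivity curve on the previous slide.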

Example of Quality Impact

Project A ("Plodders"):
- 20 developers, 3 months, $120k per FTE
- 3 FPs per staff month, 180 FPs delivered: $3,333/FP cost
- 2 critical violations per FP at $500 per fix: 360 fixes = $180k
- Total cost to own = $780k, i.e. $4,333/FP of TCO

Project B ("Better, Faster, Cheaper"):
- 20 developers, 3 months, $120k per FTE
- 4 FPs per staff month, 240 FPs delivered: $2,500/FP cost
- 5 critical violations per FP at $500 per fix: 1,200 fixes = $600k
- Total cost to own = $1,200k, i.e. $5,000/FP of TCO

On delivery cost, Project B looks 25% more productive. However, on total cost of ownership, Project A is 13.4% more productive.
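The slide's arithmetic can be checked in a few lines, reproducing both totals:

```python
def tco_per_fp(devs, months, annual_cost_per_fte, fp_per_staff_month,
               critical_violations_per_fp, cost_per_fix):
    # Total cost of ownership per function point: development cost
    # plus the future cost of fixing critical violations, per FP delivered.
    staff_months = devs * months
    fps = fp_per_staff_month * staff_months
    dev_cost = devs * annual_cost_per_fte * (months / 12)
    fix_cost = fps * critical_violations_per_fp * cost_per_fix
    return (dev_cost + fix_cost) / fps

b = tco_per_fp(20, 3, 120_000, 4, 5, 500)   # Project B: $5,000 per FP
a = tco_per_fp(20, 3, 120_000, 3, 2, 500)   # Project A: about $4,333 per FP
```

Omitting `fix_cost` from the numerator reproduces the naive $2,500 vs. $3,333 comparison that makes Project B look better; including it is exactly the quality adjustment the next slide formalizes.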

Quality-Adjusted Productivity

Automated Enhancement Points measure the size of both functional and non-functional code segments. Automated Technical Debt measures the corrective effort required in future releases for defects injected in this release. You must add that future effort to fix bugs into productivity calculations.

Quality-adjusted productivity then feeds estimation, benchmarks, value and ROI analysis, and effort and cost analysis.

Best Practices for Analysis

1) Evaluate demographics
2) Segment baselines
3) Beware sampling effects
4) Understand variation
5) Investigate distributions
6) Account for maturity effects
7) Beware external data sets

1. Evaluate Demographics

[Chart: productivity against function points by application type: routine reports, COBOL apps, web-based apps, custom ERP, innovative mobile.]

2. Segment Baselines

Segment              Year   Projects   Productivity
Total Corporate      1981   28         2342
                     1980   21         1939
Telecommunications   1981   14         1811
                     1980   12         1458
Engineering &        1981   8          2965
  Defense            1980   6          2739
Business             1981   6          3054
  Applications       1980   3          1813

Multiple baselines are usually the most valid.

Click to edit Master title style 3 Beware Sampling Effects

22

0

500

1000

1500

2000

2500

1980 1981 1982

Line

s of

cod

epe

rson

-yea

rs

Baselines

Lots of small

projects Lots of large

projects

Click to edit Master title style 4 Understand Variation

23

Product factors bull timing constraints bull memory utilization bull CPU occupancy bull resource constraints bull complexity bull product size

Project factors bull concurrent development bull development computer bull requirements definition bull requirements churn bull programming practices bull personnel experience bull client experience bull client participation

Project 49

Product 16

Not explained

35

R2product = 16

R2project = 49

R2

model = 65

J Vosburgh B Curtis et al (1984) Proceedings of ICSE 1994

Click to edit Master title style 5 Investigate Distributions

Low Medium High

Rel

ativ

e p

rod

uct

ivit

y (

of m

ean

)

Software Engineering Practices Usage

-100

-50

0

50

100

150

200

250

Click to edit Master title style 6 Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws

Click to edit Master title style 7 Caution for External Data Sets

26

DACS Us

Productivity

Siz

e Data definitions

Data collection

Data validity

Data age

Demographics

Click to edit Master title style

1) Tracking traditional projects

2) Iterative and Agile measurement

3) Team Software Process measurement

33

Section 2 Measuring to Manage

Software Projects

Measuring Project Progress

[Chart: planned rate of code growth vs. actual code growth, Jan-Jun, 0K-300K lines, with completion of coding marked.] "The measures say we are on schedule."

[Chart: the same plot, with actual code growth continuing past the planned completion of coding.] "But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (total, open, closed), Jan-Jun, 0-150, against the completion of coding.] "And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

Number of requirements affected, and the person-hours of work or rework they drive, by month:

Month   Added   Changed   Deleted
Feb      2        2.0        1
Mar      2        1.1        1
Apr      4.5      3.5        3
May      1.5      6.1        1
Jun      1        2.0        4
Jul      1.5      1.5        0
Aug      1.8      0.5        2

Phase Defect Removal Model

Phase         Escapes from      Defects injected   Subtotal/   Removal         Defects removed/   Escapes at
              previous phase/   /KSLOC             KSLOC       effectiveness   KSLOC              phase exit/KSLOC
              KSLOC
Rqmts              0                 1.2              1.2           0%               0                 1.2
Design             1.2              18.0             19.2          76%              14.6               4.6
Code & UT          4.6              15.4             20.0          71%              14.2               5.8
Integration        5.8               0                5.8          67%               3.9               1.9
Sys Test           1.9               0                1.9          58%               1.1               0.8
Operation          0.8

Kan (1995). Metrics and Models in Software Quality Engineering.
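The removal model in this table reduces to a small calculation: each phase adds its injected defects to the escapes from the previous phase, removes a fraction of the subtotal, and passes the rest on. A sketch using the slide's per-KSLOC figures:

```python
# Phase-based defect removal model (after Kan). All rates are defects
# per KSLOC; (injected, removal effectiveness) pairs follow the table.
phases = [
    ("Requirements", 1.2, 0.00),
    ("Design",       18.0, 0.76),
    ("Code & UT",    15.4, 0.71),
    ("Integration",   0.0, 0.67),
    ("System Test",   0.0, 0.58),
]

def escapes_per_ksloc(phases):
    """Return (name, subtotal, removed, escapes) per phase."""
    escaped = 0.0
    trail = []
    for name, injected, effectiveness in phases:
        subtotal = escaped + injected          # escapes in + new injections
        removed = subtotal * effectiveness     # caught in this phase
        escaped = subtotal - removed           # flows into the next phase
        trail.append((name, round(subtotal, 2), round(removed, 2),
                      round(escaped, 2)))
    return trail

for name, subtotal, removed, escaped in escapes_per_ksloc(phases):
    print(f"{name:12s} subtotal={subtotal:5.2f} removed={removed:5.2f} "
          f"escapes={escaped:5.2f}")
```

Running this reproduces the table, ending with about 0.8 defects per KSLOC escaping into operation.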

Scrum Board

A scrum board displays the path to completion for each story by showing task status.
Story points are an estimate of days for each story.
Agile estimating is by feature (story) rather than by task.
'Done' does not always mean all tasks are finished.
Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating
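As a concrete sketch of the two measures named above, with hypothetical sprint data (not from the slides):

```python
# Hypothetical sprint history: story points completed and hours logged.
story_points_done = [3, 5, 2, 8, 5]
hours_logged = [24.0, 35.0, 18.0, 56.0, 40.0]

def velocity(points_per_sprint):
    """Mean story points completed per sprint: the team's velocity."""
    return sum(points_per_sprint) / len(points_per_sprint)

def hours_per_story_point(hours, points):
    """Hours of effort behind each completed story point, per sprint."""
    return [h / p for h, p in zip(hours, points)]

print(velocity(story_points_done))                      # 4.6
print(hours_per_story_point(hours_logged, story_points_done))
```

Hours per story point is the "productivity factor" context the slide warns about: a velocity number alone says nothing about the effort behind it.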

Analyzing Velocity

[Charts over a 20-day iteration: story points completed per hour; hours per story point (Hrs/StryPt); item cycle times; stories, hours, and hours per story point together; and a burndown chart (0-60 over days 1-20).]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in-progress, and released items over time; the horizontal gap between bands shows days to delivery, with outliers marked.]

Anderson (2010). Kanban.
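A cumulative flow diagram is just per-day counts of work items in each state. A minimal sketch (the states and item histories below are hypothetical):

```python
from collections import Counter

# Each entry is one day: the state of items 0..4 on that day.
daily_states = [
    ["backlog", "backlog", "backlog", "backlog", "backlog"],
    ["in_progress", "backlog", "backlog", "backlog", "backlog"],
    ["released", "in_progress", "in_progress", "backlog", "backlog"],
    ["released", "released", "in_progress", "in_progress", "backlog"],
]

def flow_counts(day):
    """State counts for one day: one vertical slice of the CFD."""
    return Counter(day)

for i, day in enumerate(daily_states):
    c = flow_counts(day)
    print(i, c["backlog"], c["in_progress"], c["released"])
```

Stacking these slices day by day yields the banded diagram; a widening in-progress band signals rising work in progress.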

Trends Over Sprints

[Charts: story points delivered (0-80) across 12 sprints, and technical debt (0-4) across the same 12 sprints.]

Track and correlate multiple measures across sprints. Build predictive models.

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects.

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring.

Recommended Books

Personal Software Process (PSP)

PSP 0 (Baseline Personal Process): current process, time recording, defect recording, defect type standards.
PSP 0.1: coding standard, size measurement, process improvement proposal.
PSP 1 (Personal Planning): size estimating, test report.
PSP 1.1: task planning, schedule planning.
PSP 2 (Personal Quality Management): code reviews, design reviews.
PSP 2.1: design templates.
PSP 3 (Cyclic Personal Process): cyclic development.

Humphrey (1997). Introduction to the Personal Software Process.

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000). Introduction to the Team Software Process.

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

R²s for aggregated individual estimates range from .70 to .87, versus R² = .53 for team-level estimates.
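The finding above suggests a bottom-up estimating sketch: sum each member's own task estimates instead of averaging one team-level guess. The names and hours below are hypothetical:

```python
# Hours each team member estimates for their own tasks (assumed data).
individual_estimates = {
    "ana":   [4.0, 6.5, 3.0],
    "ben":   [8.0, 2.5],
    "carol": [5.0, 5.0, 4.0],
}

def aggregated_estimate(per_person):
    """Bottom-up total: each person estimates own tasks, then sum."""
    return sum(sum(tasks) for tasks in per_person.values())

print(aggregated_estimate(individual_estimates))  # 38.0
```

Each estimator knows their own tasks and their own pace, which is why the aggregate tracks actuals more closely than one averaged team figure.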

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%.

Consequently, test changes from defect detection into correctness verification.

Fagan (2005).

TSP Benefits

Dramatic reductions in:
- Delivered defects
- Days per KLOC
- Schedule deviation
- Test defects per KLOC
- Variation in results

Humphrey (2006). TSP: Leading a Development Team.

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 (Optimizing): innovation management
Level 4 (Quantitatively Managed): capability management
Level 3 (Defined): process management
Level 2 (Managed): project management
Level 1 (Initial): inconsistent management

Chrissis, Konrad & Shrum (2005). CMMI.

Enhanced Maturity Framework

Level 5 (Innovating): innovation management
Level 4 (Optimized): capability management
Level 3 (Standardized): organizational management
Level 2 (Stabilized): work unit management
Level 1 (Initial): inconsistent management

Maturity Level Transformation

Level 5: opportunistic, proactive improvements; empowered, agile culture
Level 4: process and results managed statistically; precision culture
Level 3: organization develops standard processes; common culture
Level 2: project managers stabilize projects; local culture
Level 1: ad hoc, inconsistent development practices; relationship culture built on trust

[Diagram: the transformation spans the individual, project, and organization levels.]

Measurement by Level

Level 5: exploratory data (Δt = N × ROI)
Level 4: predictive data (X̄ ± 1.96σ/√n)
Level 3: process data (8.3 defects per review)
Level 2: management data (Effort = α·LOC^β)
Level 1: haphazard data ("about a million lines")

A measure-maturity mismatch slows improvement and creates dysfunction.
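The Level-2 "management data" entry is a power-law effort model. A sketch with illustrative, uncalibrated coefficients (a and b below are placeholders; they must be fit to local historical data):

```python
def estimated_effort(kloc, a=2.5, b=1.05):
    """Effort = a * Size^b (person-months, COCOMO-style form).
    a and b are illustrative only; fit them to your own history."""
    return a * kloc ** b

# With b > 1, effort grows faster than size: a diseconomy of scale.
print(round(estimated_effort(10.0), 1))
print(round(estimated_effort(100.0), 1))
```

The exponent is the interesting parameter: b > 1 means doubling size more than doubles effort, which is the usual empirical finding for large systems.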

Average Improvement Results

Improvement benefit          Orgs   Annual median   Annual range
Productivity growth            4        35%            9-67%
Pre-test defect detection      3        22%            6-25%
Time to market                 2      down 19%        15-23%
Field error reports            5      down 39%        10-94%
Return on investment           5       5.0:1          4.1-8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed / budgeted cost of work scheduled; behind vs. ahead) by CMM maturity level 1-3.]

Air Force study of contractors. Flowe & Thordahl (1994).

Cost Performance

[Chart: cost performance index (budgeted cost of work performed / actual cost of work performed; overrun vs. underrun) by CMM maturity level 1-3.]

Air Force study of contractors. Flowe & Thordahl (1994).
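The two Air Force charts rest on standard earned-value indices; a minimal sketch:

```python
def spi(bcwp, bcws):
    """Schedule performance index: budgeted cost of work performed
    over budgeted cost of work scheduled. > 1.0 means ahead."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost performance index: budgeted cost of work performed
    over actual cost of work performed. > 1.0 means under budget."""
    return bcwp / acwp

print(spi(90.0, 100.0))  # 0.9  -> behind schedule
print(cpi(90.0, 120.0))  # 0.75 -> over budget
```

The study's point is that both indices cluster closer to 1.0 as maturity level rises.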

Crosby's Cost of Quality Model

Project cost splits into performance and cost of quality; cost of quality splits into conformance (prevention plus appraisal) and nonconformance.

Performance: generating plans and documentation; development (requirements, design, code, integration).
Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits.
Prevention: training; policies, procedures, and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.
Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979). Quality Is Free.
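Crosby's categories lend themselves to a simple roll-up. The dollar figures below are hypothetical:

```python
# Project cost broken into Crosby's categories ($k, assumed values).
costs = {
    "performance":    340.0,  # building it right the first time
    "prevention":      70.0,  # training, tools, process work
    "appraisal":      150.0,  # reviews, first-time testing, audits
    "nonconformance": 410.0,  # rework: fixes, re-reviews, re-tests
}

def cost_of_quality(costs):
    """Everything spent beyond building it: prevention + appraisal + rework."""
    return costs["prevention"] + costs["appraisal"] + costs["nonconformance"]

def coq_fraction(costs):
    """Cost of quality as a share of total project cost."""
    return cost_of_quality(costs) / sum(costs.values())

print(round(coq_fraction(costs), 2))
```

Tracking this fraction release over release is how the Raytheon data on the next slide was produced: improvement shows up as nonconformance shrinking while performance grows.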

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   CMM level   Performance   Nonconformance   Appraisal   Prevention
1988       1           34%            41%            15%          7%
1990       2           55%            18%            15%         12%
1992       3           66%            11%
1994       4           76%             6%            7.5%

Dion (1993); Haley (1995).

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline (scale 0-4x), 1988-1994.]

Lydon (1995).

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline (scale 0-1.2), 1988-1994.]

Lydon (1995).

Raytheon's Cost Predictability

[Chart: actual cost vs. budgeted cost deviation (-10% to +50%, over-run vs. under-run) by quarter, 1988-1994.]

Haley (1995).

OMRON's Reuse Gains

[Chart: program size in steps (0-3000) and percentage of reuse (0-100%), 1987-1994.]

Sakamoto et al. (1996).

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." (Watts Humphrey)

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. Compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus: process improvement (Six Sigma). Product focus: product improvement (Design for 6σ). Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, then large, good, okay, and small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

Jackson, D. (2009); Spinellis, D. (2006).

CAST's Application Intelligence Platform

Application analysis: evaluation of 1200+ coding and architectural rules plus application meta-data, via language parsers for Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, NetWeaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Quality measurements: transferability, changeability, robustness, performance, security.

Detected violations (examples): expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table; empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed; SQL injection; cross-site scripting; buffer overflow; uncontrolled format string; unstructured code; misuse of inheritance; lack of comments; violated naming convention; highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity.

Modern Apps are a Technology Stack

Level 1, Unit (developer level): a single language/technology layer. Code style and layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2, Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities (Java, JSP, ASP.NET, APIs).

Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function points, effort estimation, data access control, SDK versioning, calibration across technologies.

[Diagram: data flow and transaction risk across the stack (Struts, Spring, Hibernate, EJB, .NET, web services, messaging, COBOL, PL/SQL, T-SQL, Oracle, SQL Server, DB2, Sybase, IMS), with architectural compliance spanning all layers.]

Architecturally Complex Defects

An architecturally complex defect is a structural flaw involving interactions among multiple components that reside in different application layers. Such defects are the primary cause of operational problems and take 20x as many fixes to correct.

Architecturally complex defects are 8% of total application defects but 52% of total repair effort; code unit-level violations are the remaining 92% of defects and 48% of effort.

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 342 in Component A is only narrowly propagated in the system: it presents low risk. The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
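One way to sketch such a ranking (CAST's actual index weighting is proprietary; the severity-times-reach form below is an assumption, and the data is hypothetical):

```python
# (violation id, severity 1-9, components reached by impact propagation)
violations = [
    ("342", 6, 3),    # severe but narrowly propagated
    ("284", 6, 41),   # equally severe, widely propagated
    ("101", 9, 5),
]

def propagated_risk_index(severity, reach):
    """Assumed form: severity weighted by breadth of impact."""
    return severity * reach

ranked = sorted(violations,
                key=lambda v: propagated_risk_index(v[1], v[2]),
                reverse=True)
print([v[0] for v in ranked])  # violation 284 ranks first
```

However the weighting is chosen, the ordering principle is the slide's point: breadth of propagation, not severity alone, drives remediation priority.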

Transaction Risk

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path: from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation.

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path. The index can be further tuned using the operational frequency of each transaction.

[Diagram: five transactions traced through the layers; transaction 1 poses the greatest risk.]
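A sketch of that ranking, including the optional frequency tuning (transaction names, severities, and run counts are hypothetical):

```python
# Violations along each transaction's path, plus how often it runs.
transactions = {
    "checkout":   {"severities": [9, 7, 5], "daily_runs": 12000},
    "view_cart":  {"severities": [3],       "daily_runs": 50000},
    "admin_sync": {"severities": [8, 8],    "daily_runs": 40},
}

def transaction_risk_index(t, use_frequency=False):
    """Sum of violation severities along the path; optionally scaled
    by operational frequency (the slide's 'further tuned' step)."""
    base = sum(t["severities"])
    return base * t["daily_runs"] if use_frequency else base

ranked = sorted(transactions,
                key=lambda k: transaction_risk_index(transactions[k]),
                reverse=True)
print(ranked)
```

Frequency scaling can reorder the list: a mildly flawed but heavily used path may outrank a badly flawed path that almost never runs.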

Risk-Based Prioritization

Violations are prioritized by health factor, severity, Propagated Risk Index, membership in a high-risk transaction, and frequency of path execution. For example:

Violation   Health factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3

Static Analysis in US DoD

- Testing finds less than 1/3 of structural defects.
- Static analysis before testing is 85% efficient.
- With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013).

Reducing Operational Incidents and Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0-180) vs. Technical Quality Index (100-400) for business-critical applications at a large international investment bank, with annual losses of $2,000,000+, $50,000, and $5,000 marked at decreasing incident levels.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request a copy from info@castsoftware.com). Scores: Robustness, Security, Performance, Changeability, Transferability.

Technology (1606 apps; e.g., Java-EE): 20, 3, 16, 26, 7
Industry (677 apps): 5, 5, 7, 4, 6
App type (510 apps): 1, 1, 1, 1, 1
Source (580 apps): 1, 1, 4
Shore (540 apps)
Maturity (72 apps): 28, 25, 15, 24, 12
Method (208 apps): 10, 6, 14, 6
Team size (186 apps): 7, 5, 8, 8
Number of users (262 apps): 4, 3, 5, 4

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code constitute the debt, which creates both continuing IT cost and business risk.

- Principal: the cost of fixing problems remaining in the code after release that must be remediated.
- Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
- Liability: business costs related to outages, breaches, corrupted data, etc.
- Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012). IEEE Software.
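The principal reduces to counting what is left in the code and pricing the fixes. A sketch in the spirit of the Curtis et al. estimate; the violation counts and unit costs below are assumptions, not the paper's figures:

```python
def technical_debt_principal(violations_by_severity, fix_cost_by_severity):
    """Principal = sum over severities of (violation count x unit fix cost)."""
    return sum(count * fix_cost_by_severity[sev]
               for sev, count in violations_by_severity.items())

violations = {"high": 120, "medium": 480, "low": 2100}     # assumed counts
fix_cost = {"high": 500.0, "medium": 200.0, "low": 60.0}   # assumed $/fix

print(technical_debt_principal(violations, fix_cost))
```

In practice the counts come from static analysis of the released code, and organizations often choose to price only high-severity violations into the principal.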

Tech Debt by Quality Factor

Transferability 40%, changeability 30%, robustness 18%, security 7% (the remainder is performance).

70% of technical debt is in IT cost (transferability, changeability); 30% is in business risk (robustness, performance, security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012). IEEE Software.

Join CISQ: www.it-cisq.org

Section: References

References

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.
Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.
Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20(9), 43-57.
Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.
Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.
Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.
Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.
Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29(6), 34-42.
Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10(4), 28-35.
Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.
Dyne, K. (1999). ESSI: Benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.
Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.
Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.
Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.
Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.
George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.
Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.
Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.
Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52(4).
Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.
Keeni, B. (2000). IEEE Software, 16(4).
Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.
Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.
Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.
Peterman, W. (2000). IEEE Software, July/August issue.
Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.
Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.
Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.
Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Spinellis, D. (2006). Code Quality. Addison-Wesley.
Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort: the Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3



Software Size Measures

Instructions (Lines of Code): Most frequently used. Different definitions of a line can cause counts to vary by 10x. Smaller programs often accomplish the same functionality with higher-quality coding.

Requirements-based (Use Case Points, Story Points): Use Case Points have not become widely used and need more development. Story points are subjective to each team and are susceptible to several forms of bias.

Functions (Function Points): Popular in IT. Several counting schemes exist (IFPUG, NESMA, Mark II, COSMIC, etc.). Manual counting is expensive and subjective; certified counters can differ by 10%. Automated FPs are taking root.

How Many Lines of Code?

    #define LOWER 0    /* lower limit of table */
    #define UPPER 300  /* upper limit */
    #define STEP  20   /* step size */

    main() /* print a Fahrenheit-Celsius conversion table */
    {
        int fahr;
        for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
            printf("%4d %6.1f\n", fahr, (5.0/9.0)*(fahr-32));
    }

[Histogram: number of votes vs. number of lines of code counted; asked to count this program, respondents' answers spread across roughly 1 to 12 lines.]

B. Park (1992).

Function Point Estimation

Functional view of software: entries, exits, reads, and writes between the software and its data store. Functional size can be estimated from external inputs and outputs.

[Chart: Unadjusted Function Points (0-8000) vs. EIs + EOs (0-1000); fitted line UFP = 7.79(EIs + EOs) + 43.5, R² = .95.]

Upfront functional analysis provides the basis for good estimates. A repository of FP data provides the basis for calibrating estimates.

Ebert & Dumke (2007). Software Measurement, p. 188.
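Read as code, the fitted line gives a quick early-size estimate. The coefficients below are as reconstructed from the slide's regression; treat them as Ebert & Dumke's sample fit, not universal constants:

```python
def estimated_ufp(external_inputs, external_outputs,
                  slope=7.79, intercept=43.5):
    """Unadjusted Function Points estimated from counts of external
    inputs and outputs, per the slide's fitted regression line."""
    return slope * (external_inputs + external_outputs) + intercept

# E.g., a system with 30 external inputs and 20 external outputs:
print(estimated_ufp(30, 20))
```

A shop with its own FP repository would refit the slope and intercept to local data before using the estimate for planning.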

Effort: the Weakest Measure

Effort data is unreliable and inconsistent.

After-the-fact estimates: memory lapses, time-splicing, inconsistency.
Under-reporting: contract issues, HR issues, impressions.
Lack of normalization: roles included, phases included, hours in a person-year.

How Quality Affects Productivity

Assumption: productivity is a stable number. Reality: productivity is unstable, tending to decline.

[Chart: from the original productivity baseline, incremental increases in technical debt drive a continuing decrease in productivity, unless you take action.]

Carry-forward Rework

Release N: develop N, rework N. Unfixed defects from release N carry forward.
Release N+1: develop N+1, rework N+1, plus rework N+0. Unfixed defects from releases N and N+1 carry forward.
Release N+2: develop N+2, rework N+2, rework N+1, plus rework N+0.

Each release's capacity is increasingly consumed by rework carried forward from earlier releases.

Example of Quality Impact

Project B ("Better, Faster, Cheaper"):
- 20 developers, 3 months, $120k per FTE
- 4 FPs per staff-month, 240 FPs delivered: $2,500/FP cost
- 5 critical violations per FP at $500 per fix: $600k for 1,200 fixes
- Total cost to own = $1,200k, or $5,000/FP of TCO

Project A ("Plodders"):
- 20 developers, 3 months, $120k per FTE
- 3 FPs per staff-month, 180 FPs delivered: $3,333/FP cost
- 2 critical violations per FP at $500 per fix: $180k for 360 fixes
- Total cost to own = $780k, or $4,333/FP of TCO

Project B is 25% more productive on delivery cost per FP. However, Project A is 13.4% more productive on total cost to own per FP.

Quality-Adjusted Productivity

Future effort to fix bugs must be added into productivity calculations. Quality-adjusted productivity combines effort and cost; the size of both functional and non-functional code segments (automated enhancement points); corrective effort in future releases for defects injected in this release; and automated technical debt. It feeds productivity analysis, estimation, benchmarks, value and ROI, etc.

Best Practices for Analysis

1) Evaluate demographics
2) Segment baselines
3) Beware sampling effects
4) Understand variation
5) Investigate distributions
6) Account for maturity effects
7) Caution for external data sets

1. Evaluate Demographics

[Chart: productivity vs. function points, segmented by application type: routine reports, COBOL apps, web-based apps, innovative mobile, custom ERP.]

Click to edit Master title style 2 Segment Baselines

Year Projects Productivity Total Corporate 1981 28 2342 1980 21 1939 Telecommunications 1981 14 1811 1980 12 1458 Engineering amp Defense 1981 8 2965 1980 6 2739 Business Applications 1981 6 3054 1980 3 1813

Multiple baselines are usually the most valid

Click to edit Master title style 3 Beware Sampling Effects

22

0

500

1000

1500

2000

2500

1980 1981 1982

Line

s of

cod

epe

rson

-yea

rs

Baselines

Lots of small

projects Lots of large

projects

Click to edit Master title style 4 Understand Variation

23

Product factors bull timing constraints bull memory utilization bull CPU occupancy bull resource constraints bull complexity bull product size

Project factors bull concurrent development bull development computer bull requirements definition bull requirements churn bull programming practices bull personnel experience bull client experience bull client participation

Project 49

Product 16

Not explained

35

R2product = 16

R2project = 49

R2

model = 65

J Vosburgh B Curtis et al (1984) Proceedings of ICSE 1994

Click to edit Master title style 5 Investigate Distributions

Low Medium High

Rel

ativ

e p

rod

uct

ivit

y (

of m

ean

)

Software Engineering Practices Usage

-100

-50

0

50

100

150

200

250

Click to edit Master title style 6 Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws

Click to edit Master title style 7 Caution for External Data Sets

26

DACS Us

Productivity

Siz

e Data definitions

Data collection

Data validity

Data age

Demographics

Click to edit Master title style

1) Tracking traditional projects

2) Iterative and Agile measurement

3) Team Software Process measurement

33

Section 2 Measuring to Manage

Software Projects

Measuring Project Progress

[Chart: planned rate of code growth vs. actual code growth (0K-300K lines), January through June, with a completion-of-coding milestone]

"The measures say we are on schedule."

[The same chart later in the project: actual code growth keeps rising past the planned completion of coding]

"But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (total, open, closed; 0-150), January through June, with the completion-of-coding milestone]

"And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements affected each month (added, changed, deleted), plotted against person-hours of work or rework]

Month   Added   Changed   Deleted
Feb      2        2.0        1
Mar      2        1.1        1
Apr      4.5      3.5        3
May      1.5      6.1        1
Jun      1        2.0        4
Jul      1.5      1.5        0
Aug      1.8      0.5        2

Phase Defect Removal Model (defects/KSLOC)

Phase         Escapes from     Injected   Subtotal   Removal         Removed   Escapes at
              previous phase                         effectiveness             phase exit
Rqmts              0              1.2        1.2          0%            0          1.2
Design             1.2           18.0       19.2         76%           14.6        4.6
Code & UT          4.6           15.4       20.0         71%           14.2        5.8
Integration        5.8            0          5.8         67%            3.9        1.9
Sys Test           1.9            0          1.9         58%            1.1        0.8
Operation          0.8

Kan (1995), Metrics and Models in Software Quality Engineering.
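The table's arithmetic can be sketched as a small cascade. The injection rates and removal effectiveness come from the table; the function name is illustrative:

```python
# Per-phase defect injection (defects/KSLOC) and removal effectiveness,
# taken from the phase defect removal table above.
phases = [
    ("Requirements", 1.2,  0.00),
    ("Design",       18.0, 0.76),
    ("Code & UT",    15.4, 0.71),
    ("Integration",  0.0,  0.67),
    ("System Test",  0.0,  0.58),
]

def defect_cascade(phases, escapes_in=0.0):
    """Propagate defect escapes phase by phase."""
    rows = []
    for name, injected, effectiveness in phases:
        subtotal = escapes_in + injected
        removed = round(subtotal * effectiveness, 1)
        escapes_out = round(subtotal - removed, 1)
        rows.append((name, escapes_in, injected, subtotal, removed, escapes_out))
        escapes_in = escapes_out   # escapes feed the next phase
    return rows

for row in defect_cascade(phases):
    print(row)
```

Running the cascade reproduces the table, ending with 0.8 defects/KSLOC escaping into operation.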

SCRUM Board

The Scrum board displays the path to completion for each story by showing task status. Story points are an estimate of days for each story. Agile estimating is by feature (story) rather than by task. 'Done' does not always mean all tasks are finished. Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating
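As a sketch, velocity and a naive completion forecast from historical data; the sprint figures are hypothetical:

```python
# Hypothetical history: story points completed in the last five sprints.
completed_points = [18, 22, 19, 24, 21]

# Velocity: the rate at which stories are completed.
velocity = sum(completed_points) / len(completed_points)

# Naive forecast: sprints needed to burn down the remaining backlog,
# using ceiling division so a partial sprint counts as a whole one.
backlog_points = 120
sprints_remaining = -(-backlog_points // int(velocity))

print(velocity, sprints_remaining)
```

This is exactly the kind of estimate the slide warns about: without the context of productivity factors, a raw velocity number can mislead.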

Analyzing Velocity

[Charts over a 20-day sprint: team velocity in story points per hour (0-5); hours per story point (0-5); item cycle times (0-20); and a burndown chart (0-60) tracking stories, hours, and hours per story point by day]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in-progress, and released items over time; the gap between the bands shows days to delivery, with outliers marked]

Anderson (2010), Kanban.
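The bookkeeping behind a cumulative flow diagram can be sketched with hypothetical daily counts; by Little's Law, average days to delivery is roughly average work in progress divided by average throughput:

```python
# Hypothetical cumulative counts per day (items started and released).
started  = [5, 9, 14, 18, 22, 26, 30]
released = [0, 2,  5,  9, 14, 19, 24]

# Work in progress: the vertical gap between the two bands.
wip = [s - r for s, r in zip(started, released)]

# Little's Law estimate: avg days to delivery ~ avg WIP / avg throughput.
throughput = released[-1] / len(released)           # items per day
avg_days_to_delivery = (sum(wip) / len(wip)) / throughput

print(wip)
print(round(avg_days_to_delivery, 3))
```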

Trends Over Sprints

[Charts: story points delivered (0-80) and technical debt (0-4) across 12 sprints]

Track and correlate multiple measures across sprints to build predictive models.

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects.

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring.

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: current process, time recording, defect recording, defect type standards
• PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning
• PSP 1: size estimating, test report
• PSP 1.1: task planning, schedule planning

Personal Quality Management
• PSP 2: code reviews, design reviews
• PSP 2.1: design templates

Cyclic Personal Process
• PSP 3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process.

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process.

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates (R²s range from .70 to .87), rather than developing a single team estimate averaged over individuals (R² = .53).
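The aggregation idea can be sketched with hypothetical task estimates (in hours): each member estimates their own tasks, and the project estimate is the sum of personal estimates rather than a team-level average:

```python
# Hypothetical personal task estimates, in hours; names are illustrative.
personal_estimates = {
    "ana":  [12, 8, 20],
    "bo":   [16, 10],
    "chen": [6, 14, 9],
}

# Bottom-up estimate: sum each member's own estimates, then combine.
project_estimate = sum(sum(tasks) for tasks in personal_estimates.values())
print(project_estimate)
```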

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defects detected by: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%.

Fagan & Davis (2005).

TSP Benefits

Dramatic reductions in:
• Delivered defects
• Days per KLOC
• Schedule deviation
• Test defects/KLOC
• Variation in results

Humphrey (2006), TSP: Leading a Development Team.

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5, Optimizing: innovation management
Level 4, Quantitatively Managed: capability management
Level 3, Defined: process management
Level 2, Managed: project management
Level 1, Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI.

Enhanced Maturity Framework

Level 5, Innovating: innovation management
Level 4, Optimized: capability management
Level 3, Standardized: organizational management
Level 2, Stabilized: work unit management
Level 1, Initial: inconsistent management

Maturity Level Transformation

Level 5: Proactive, opportunistic improvements; an empowered, agile culture
Level 4: Process and results managed statistically; a precision culture
Level 3: Organization develops standard processes; a common culture
Level 2: Project managers stabilize projects; a local culture
Level 1: Ad hoc, inconsistent development practices; a relationship culture

[Diagram annotations: scope widens from individual to projects to organization; trust grows with maturity]

Measurement by Level

Level 5 - Exploratory data: Δt = N × ROI
Level 4 - Predictive data: X̄ ± (1.96σ / √n)
Level 3 - Process data: 8.3 defects per review
Level 2 - Management data: Effort = α × LOC^β
Level 1 - Haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
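The Level 2 and Level 4 formulas above can be sketched directly; the α and β values here are placeholders, not calibrated coefficients:

```python
import math

# Level 2 management data: Effort = alpha * Size^beta.
# alpha and beta below are illustrative placeholders only.
def effort_person_months(kloc, alpha=2.8, beta=1.1):
    return alpha * kloc ** beta

# Level 4 predictive data: mean +/- 1.96 * sigma / sqrt(n).
def confidence_interval_95(mean, sigma, n):
    half_width = 1.96 * sigma / math.sqrt(n)
    return (mean - half_width, mean + half_width)

print(round(effort_person_months(100), 1))
print(confidence_interval_95(8.3, 2.0, 25))
```

Using the Level 3 figure of 8.3 defects per review with σ = 2.0 over 25 reviews gives an interval of roughly 7.5 to 9.1.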

Average Improvement Results

Improvement benefit          Orgs   Annual median   Annual range
Productivity growth            4         35%          9 - 67%
Pre-test defect detection      3         22%          6 - 25%
Time to market                 2        ↓19%         15 - 23%
Field error reports            5        ↓39%         10 - 94%
Return on investment           5        5.0:1        4.1 - 8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed ÷ budgeted cost of work scheduled, 0-1.6) by CMM maturity level 1-3; below 1 is behind schedule, above 1 is ahead]

Air Force study of contractors. Flowe & Thordahl (1994).

Cost Performance

[Chart: cost performance index (budgeted cost of work performed ÷ actual cost of work performed, 0-2.5) by CMM maturity level 1-3; below 1 is an overrun, above 1 an underrun]

Air Force study of contractors. Flowe & Thordahl (1994).
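The two indices above are simple earned-value ratios; a sketch with hypothetical figures (in $K):

```python
# Earned-value inputs (hypothetical figures, $K).
bcwp = 800.0   # budgeted cost of work performed (earned value)
bcws = 1000.0  # budgeted cost of work scheduled
acwp = 950.0   # actual cost of work performed

spi = bcwp / bcws   # schedule performance index: < 1 means behind schedule
cpi = bcwp / acwp   # cost performance index: < 1 means cost overrun

print(spi, round(cpi, 3))
```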

Crosby's Cost of Quality Model

Project cost splits into performance and cost of quality; cost of quality splits into conformance (prevention and appraisal) and nonconformance.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free.
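The model's structure amounts to a rollup over those categories; the split below is illustrative (loosely echoing Raytheon's 1990 figures, which appear on the next slide):

```python
# Illustrative cost-of-quality split, as percent of total project cost.
split = {"performance": 55, "nonconformance": 18, "appraisal": 15, "prevention": 12}

# Cost of quality is everything except performance (building the product).
cost_of_quality = sum(v for k, v in split.items() if k != "performance")

# Conformance costs are spent preventing or catching defects early;
# nonconformance is the cost of rework.
conformance = split["appraisal"] + split["prevention"]
nonconformance = split["nonconformance"]

print(cost_of_quality, conformance, nonconformance)
```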

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   Level   Performance   Nonconformance   Appraisal + Prevention
1988     1         34%             41%             15% + 7%
1990     2         55%             18%             15% + 12%
1992     3         66%             11%             23%
1994     4         76%              6%             18%

Dion (1993); Haley (1995).

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline (0-4x), 1988-1994]

Lydon (1995).

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline (0-1.2), 1988-1994]

Lydon (1995).

Raytheon's Cost Predictability

[Chart: actual vs. budgeted cost by quarter, 1988-1994; percent over-run/under-run ranging from +50% to -10%]

Haley (1995).

OMRON's Reuse Gains

[Chart: program size in steps (0-3000) and percentage of reuse (0-100%), 1987-1994]

Sakamoto et al. (1996).

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus: process improvement (Six Sigma). Product focus: product improvement (Design for 6σ). Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, then large, good, okay, and small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded…testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." - Jackson (2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic…non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability…The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." - Spinellis (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality measurements: transferability, changeability, robustness, performance, security.

Detected violations (examples):
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: a multi-technology application spanning JSP, ASP.NET, and APIs; Java and web services; Hibernate, Spring, Struts, and .NET frameworks; EJB, PL/SQL, and T-SQL; Oracle, SQL Server, DB2, Sybase, and IMS databases; COBOL; and messaging. Annotations mark architectural compliance, data flow, and transaction risk]

Level 1 - Unit (developer level): code style and layout, expression complexity, code documentation, class or program design, basic coding standards. A single language/technology layer.

Level 2 - Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3 - System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function points, effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

An architecturally complex defect is a structural flaw involving interactions among multiple components that reside in different application layers.

They are the primary cause of operational problems and take 20x as many fixes to correct. Architecturally complex defects are 8% of total application defects (code unit-level violations are the other 92%), yet they account for 52% of total repair effort (48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.
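The disproportion reduces to arithmetic: the average repair cost of an architecturally complex defect relative to a unit-level violation follows from the two shares on the slide:

```python
# Shares from the slide: architecturally complex defects are 8% of
# defects but consume 52% of repair effort.
share_defects = 0.08
share_effort = 0.52

# Average repair effort per defect in each class, as a ratio.
complex_cost = share_effort / share_defects          # complex defects
unit_cost = (1 - share_effort) / (1 - share_defects) # unit-level violations
relative_cost = complex_cost / unit_cost

print(round(relative_cost, 1))
```

On these figures, the average architecturally complex defect costs roughly 12.5 times as much to repair as the average unit-level violation.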

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system; it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated; it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
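A minimal sketch of such a ranking, scoring each violation by severity times the number of components its impact reaches (a hypothetical scoring, not CAST's actual formula):

```python
# Hypothetical violations: severity plus breadth of propagated impact.
violations = {
    "342/A": {"severity": 7, "components_impacted": 2},   # narrow impact
    "284/B": {"severity": 7, "components_impacted": 40},  # wide impact
}

def propagated_risk(v):
    """Severity weighted by the breadth of impact in the system."""
    return v["severity"] * v["components_impacted"]

ranked = sorted(violations, key=lambda k: propagated_risk(violations[k]),
                reverse=True)
print(ranked)
```

With equal severities, the widely propagated violation 284 ranks first for remediation.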

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path. Transactions can be ranked by risk to prioritize their remediation, and the Transaction Risk Index can be further tuned using the operational frequency of each transaction.

[Diagram: five transactions run from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer; transaction 1 poses the greatest risk]
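A sketch of the ranking idea: sum the severities of violations along each transaction's path, optionally scaled by how often the transaction runs. The transactions, severities, and scoring are hypothetical:

```python
# Hypothetical transactions, the severities of violations on their paths,
# and daily execution counts.
transactions = {
    "checkout":  {"violation_severities": [9, 8, 7], "daily_runs": 5000},
    "view_cart": {"violation_severities": [4, 3],    "daily_runs": 4000},
}

def transaction_risk(t, weight_by_frequency=True):
    """Sum of path violation severities, tuned by operational frequency."""
    risk = sum(t["violation_severities"])
    return risk * t["daily_runs"] if weight_by_frequency else risk

ranked = sorted(transactions,
                key=lambda name: transaction_risk(transactions[name]),
                reverse=True)
print(ranked)
```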

Risk-Based Prioritization

Violations ranked by severity within health factor, weighted by Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health factor      Severity
   148      Security              9
   371      Robustness            9
   062      Performance           8
   284      Robustness            7
   016      Changeability         7
   241      Security              5
   173      Transferability       4
   359      Transferability       3

Static Analysis in US DoD

Testing finds less than 1/3 of structural defects. Static analysis before testing is 85% efficient. With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013).

Reducing Operational Incidents & Costs

In a study of structural quality measures and maintenance effort across 20 customers of a large global system integrator, a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0-180) vs. Technical Quality Index (100-400); annotated annual losses fall from over $2,000,000 at low quality scores to $50,000 and then $5,000 at higher scores]

Large international investment bank; business-critical applications.

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors. Where shown, values are for Robustness, Security, Performance, Changeability, and Transferability, in that order:

Technology (1606 apps; e.g., Java-EE): 20, 3, 16, 26, 7
Industry (677): 5, 5, 7, 4, 6
App type (510): 1, 1, 1, 1, 1
Source (580): 1, 1, 4 (other factors not significant)
Shore (540): not significant
Maturity (72): 28, 25, 15, 24, 12
Method (208): 10, 6, 14, 6 (one factor not significant)
Team size (186): 7, 5, 8, 8 (one factor not significant)
Number of users (262): 4, 3, 5, 4 (one factor not significant)

2017 CRASH Report; request from info@castsoftware.com.

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code are the source of the debt.

Principal: the cost of fixing problems remaining in the code after release that must be remediated.

Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business risk (liability from debt, and opportunity cost):
Liability: business costs related to outages, breaches, corrupted data, etc.
Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software.
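A hedged sketch of a principal estimate in the spirit of Curtis et al. (2012): violations kept for remediation, times the hours and labor rate to fix them. All parameter values are illustrative defaults, not figures from the paper:

```python
# Illustrative technical-debt principal estimate.
def debt_principal(violations, pct_to_fix=0.5, hours_per_fix=1.0,
                   rate_per_hour=75.0):
    """Principal = violations to remediate * hours per fix * labor rate.

    pct_to_fix models the share of detected violations the organization
    actually intends to remediate (an assumed parameter).
    """
    return violations * pct_to_fix * hours_per_fix * rate_per_hour

print(debt_principal(10_000))
```

With 10,000 detected violations and these defaults, the estimated principal is $375,000.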

Tech Debt by Quality Factor

Transferability 40%, changeability 30%, robustness 18%, security 7%, performance 5%.

70% of technical debt is in IT cost (transferability, changeability); 30% is in business risk (robustness, performance, security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software.

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development Using Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort: Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3



Click to edit Master title style 1 Evaluate Demographics

Routine reports

COBOL apps

Function Points

Prod

uctiv

ity

Web-based apps

Innovative mobile

Custom ERP

Click to edit Master title style 2 Segment Baselines

Year Projects Productivity Total Corporate 1981 28 2342 1980 21 1939 Telecommunications 1981 14 1811 1980 12 1458 Engineering amp Defense 1981 8 2965 1980 6 2739 Business Applications 1981 6 3054 1980 3 1813

Multiple baselines are usually the most valid

Click to edit Master title style 3 Beware Sampling Effects

22

0

500

1000

1500

2000

2500

1980 1981 1982

Line

s of

cod

epe

rson

-yea

rs

Baselines

Lots of small

projects Lots of large

projects

Click to edit Master title style 4 Understand Variation

23

Product factors bull timing constraints bull memory utilization bull CPU occupancy bull resource constraints bull complexity bull product size

Project factors bull concurrent development bull development computer bull requirements definition bull requirements churn bull programming practices bull personnel experience bull client experience bull client participation

Project 49

Product 16

Not explained

35

R2product = 16

R2project = 49

R2

model = 65

J Vosburgh B Curtis et al (1984) Proceedings of ICSE 1994

Click to edit Master title style 5 Investigate Distributions

Low Medium High

Rel

ativ

e p

rod

uct

ivit

y (

of m

ean

)

Software Engineering Practices Usage

-100

-50

0

50

100

150

200

250

Click to edit Master title style 6 Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws

Click to edit Master title style 7 Caution for External Data Sets

26

DACS Us

Productivity

Siz

e Data definitions

Data collection

Data validity

Data age

Demographics

Click to edit Master title style

1) Tracking traditional projects

2) Iterative and Agile measurement

3) Team Software Process measurement

33

Section 2 Measuring to Manage

Software Projects

Click to edit Master title style Measuring Project Progress

300K 250K 200K 150K 100K 50K 0K

Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoThe measures say we are on schedulerdquo

300K

250K

200K

150K

100K

50K

0K Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoBut the code keeps growing We should have measured the requirements growthrdquo

50

100

150

Defect reports

Total

Open

Closed

125

75

25

Jan Feb Mar Apr May Jun

Completion of coding

ldquoAnd all the defect reports remaining to be fixed will keep us far behind schedulerdquo

Requirements Change Impact

Number of requirements affected, and person-hours of work or rework, by month:

Month   Added   Changed   Deleted
Feb       2       20         1
Mar       2       11         1
Apr      45       35         3
May      15       61         1
Jun       1       20         4
Jul      15       15         0
Aug      18       05         2

Phase Defect Removal Model

PHASE         Escapes from    Defect       Subtotal   Removal         Defect     Escapes at
              previous phase  injection    /KSLOC     effectiveness   removal    phase exit
              /KSLOC          /KSLOC                                  /KSLOC     /KSLOC
Rqmts              0            1.2          1.2          0%             0          1.2
Design             1.2         18.0         19.2         76%           14.6         4.6
Code & UT          4.6         15.4         20.0         71%           14.2         5.8
Integration        5.8          0            5.8         67%            3.9         1.9
Sys Test           1.9          0            1.9         58%            1.1         0.8
Operation          0.8

Kan (1995), Metrics and Models in Software Quality Engineering
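The table's arithmetic is easy to mechanize: escapes from one phase plus injected defects give the phase subtotal, removal effectiveness takes its share, and the remainder escapes to the next phase. A minimal Python sketch of that chaining, with the per-KSLOC rates taken from the table above:

```python
# Sketch of the phase defect-removal arithmetic from the table above.
# Rates are defects per KSLOC; effectiveness is the fraction of defects
# present in a phase that its reviews and tests remove.
phases = [
    # (name, injected defects/KSLOC, removal effectiveness)
    ("Rqmts",       1.2, 0.00),
    ("Design",     18.0, 0.76),
    ("Code & UT",  15.4, 0.71),
    ("Integration", 0.0, 0.67),
    ("Sys Test",    0.0, 0.58),
]

escapes = 0.0  # defects/KSLOC escaping into the first phase
for name, injected, effectiveness in phases:
    subtotal = escapes + injected
    removed = subtotal * effectiveness
    escapes = subtotal - removed
    print(f"{name:12s} subtotal={subtotal:5.1f}  removed={removed:5.1f}  escapes={escapes:4.1f}")

print(f"Escapes into operation: {escapes:.1f} defects/KSLOC")
```

Running it reproduces the table's rightmost column (1.2, 4.6, 5.8, 1.9, 0.8 after rounding).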

SCRUM Board

A Scrumboard displays the path to completion for each story by showing task status.

Story points are an estimate of days for each story.

Agile estimating is by feature (story) rather than by task.

'Done' does not always mean all tasks are finished.

Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating
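The bookkeeping behind velocity is simple. A sketch with hypothetical sprint data (the story-point and hour figures below are illustrative, not from the slides):

```python
# Hypothetical sprint history: (story points completed, hours worked).
# Velocity = story points completed per sprint; also track hours per point.
sprints = [(21, 160), (24, 168), (18, 152), (27, 160)]

velocities = [points for points, _ in sprints]
hours_per_point = [hours / points for points, hours in sprints]

avg_velocity = sum(velocities) / len(velocities)
print(f"average velocity: {avg_velocity:.1f} story points/sprint")
print("hours per story point, by sprint: "
      + ", ".join(f"{h:.1f}" for h in hours_per_point))

# Historical velocity gives a first-cut forecast for remaining work:
backlog_points = 90  # assumed remaining backlog
print(f"sprints to finish backlog: {backlog_points / avg_velocity:.1f}")
```

Even with such history, the slide's caution stands: velocity forecasts are only as good as the productivity factors behind them.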

Team Velocity

Analyzing Velocity

(Charts over a 20-day period: story points per hour, 0-5; hours per story point, 0-5; item cycle times, 0-20; burndown chart, 0-60, tracking stories, hours, and hours per story point by day.)

Measuring and Managing Flow

(Cumulative flow diagram: backlogged, in-progress, and released items over time; days to delivery shown, with outliers marked.)

Anderson (2010), Kanban
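One standard way to reason about a cumulative flow diagram is Little's Law: average days to delivery equal work in progress divided by throughput. The law is not stated on the slide but is central to the Kanban analysis Anderson describes; the numbers below are illustrative:

```python
# Little's Law sketch: lead time = WIP / throughput.
# Numbers are illustrative, not taken from the slide.
wip_items = 30            # average items "in progress" on the flow diagram
throughput_per_day = 3.0  # items moving to "released" per day

lead_time_days = wip_items / throughput_per_day
print(f"expected days to delivery: {lead_time_days:.1f}")

# Capping WIP (the Kanban move) shortens lead time at the same throughput:
print(f"with WIP capped at 12: {12 / throughput_per_day:.1f} days")
```

This is why restricting work in progress, as on the earlier SCRUM board slide, directly shortens delivery times.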

Trends Over Sprints

(Charts across 12 sprints: story points delivered, 0-80; technical debt, 0-4.)

Track and correlate multiple measures across sprints.

Predictive models.

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects.

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring.

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: Current process, time recording, defect recording, defect type standards
• PSP 0.1: Coding standard, size measurement, process improvement proposal

Personal Planning
• PSP 1: Size estimating, test report
• PSP 1.1: Task planning, schedule planning

Personal Quality Management
• PSP 2: Code reviews, design reviews
• PSP 2.1: Design templates

Cyclic Personal Process
• PSP 3: Cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

Combined individual estimates: R²s range from .70 to .87
Single team estimate: R² = .53

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

(Chart of defect detection by activity: personal reviews 33, compile 33, team reviews 19, unit test 15, test 14.)

Fagan (2005)

TSP Benefits

Dramatic reductions in:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5: Optimizing – innovation management
Level 4: Quantitatively Managed – capability management
Level 3: Defined – process management
Level 2: Managed – project management
Level 1: Initial – inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5: Innovating – innovation management
Level 4: Optimized – capability management
Level 3: Standardized – organizational management
Level 2: Stabilized – work unit management
Level 1: Initial – inconsistent management

Maturity Level Transformation

Level 5: Opportunistic improvements – empowered culture
Level 5: Proactive improvements – agile culture
Level 4: Process and results managed statistically – precision culture
Level 3: Organization develops standard processes – common culture
Level 2: Project managers stabilize projects – local culture
Level 1: Ad hoc, inconsistent development practices – relationship culture

(Diagram labels: organization, projects, individual; trust.)

Measurement by Level

Level 5 — Exploratory data: Δt = N × ROI
Level 4 — Predictive data: X̄ ± (1.96σ / √n)
Level 3 — Process data: 8.3 defects per review
Level 2 — Management data: Effort = α·LOC^β
Level 1 — Haphazard data: "about a million lines"

Measure-maturity mismatch slows improvement and creates dysfunction.
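The Level 4 entry is the familiar 95% confidence interval for a mean. A quick sketch with illustrative numbers (the sample values are assumptions, not data from the deck):

```python
import math

# 95% confidence interval for a mean: x_bar +/- 1.96 * sigma / sqrt(n),
# the Level 4 "predictive data" formula above. Sample values are illustrative.
x_bar, sigma, n = 8.3, 2.0, 25   # e.g., mean defects found per review

half_width = 1.96 * sigma / math.sqrt(n)
lo, hi = x_bar - half_width, x_bar + half_width
print(f"95% CI: {lo:.2f} .. {hi:.2f}")
```

An organization at Level 4 can make statements of exactly this form about its process data; at Level 2 the regression form Effort = α·LOC^β is the typical tool.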

Average Improvement Results

Improvement benefit          # Orgs   Annual median   Annual range
Productivity growth             4         35%            9-67%
Pre-test defect detection       3         22%            6-25%
Time to market                  2        ↓19%           15-23%
Field error reports             5        ↓39%           10-94%
Return on investment            5        5.0:1         4:1-8.8:1

Schedule Performance

(Chart: schedule performance index, 0-1.6, by CMM maturity level 1-3.)

Schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled (below 1 = behind, above 1 = ahead).

Air Force study of contractors. Flowe & Thordahl (1994)

Cost Performance

(Chart: cost performance index, 0-2.5, by CMM maturity level 1-3.)

Cost performance index = budgeted cost of work performed / actual cost of work performed (below 1 = overrun, above 1 = underrun).

Air Force study of contractors. Flowe & Thordahl (1994)

Crosby's Cost of Quality Model

Project Cost = Performance + Cost of Quality

Performance: generating plans and documentation; development (requirements, design, code, integration).

Cost of Quality – Conformance:
• Prevention: training; policies, procedures, and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting
• Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits

Cost of Quality – Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free

Raytheon's Cost of Quality

• Performance – cost of building it right the first time
• Nonconformance – cost of rework
• Appraisal – cost of testing
• Prevention – cost of preventing nonconformance

Year   Level   Perform   Nonconf   Appraise   Prevent
1988     1       34%       41%       15%        7%
1990     2       55%       18%       15%       12%
1992     3       66%       11%
1994     4       76%        6%       7.5%

Dion (1993), Haley (1995)

Raytheon's Productivity

(Chart: productivity growth relative to the 1989 baseline, roughly 1x to 4x, 1988-1994.)

Lydon (1995)

Raytheon's Cost Reduction

(Chart: cost reduction relative to the 1989 baseline, 0 to 1.2, 1988-1994.)

Lydon (1995)

Raytheon's Cost Predictability

(Chart: actual cost vs. budgeted cost by quarter, 1988-1994; over-run/under-run ranging from -10% to 50%.)

Haley (1995)

OMRON's Reuse Gains

(Chart, 1987-1994: steps, 0 to 3,000, and % of reuse, 0 to 100%.)

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." – Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. Compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, large benefits, good benefits, okay benefits, small benefits... now what?

Ultimately we run out of projects with significant enough benefits to continue the program... so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link." – Jackson, D. (2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." – Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules; application meta-data.

Quality measurements: transferability, changeability, robustness, performance, security.

Detected violations (examples):
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, Universal Analyzer for other languages.

Modern Apps are a Technology Stack

(Diagram: data flow and transaction risk across a stack of technologies — JSP, ASP.NET, APIs, Java, Web Services, .NET, Struts, Spring, Hibernate, EJB, PL/SQL, T-SQL, COBOL, messaging; databases Oracle, SQL Server, DB2, Sybase, IMS — with architectural compliance spanning the layers.)

Level 1 – Unit (single language/technology layer): code style and layout, expression complexity, code documentation, class or program design, basic coding standards. Developer level.

Level 2 – Technology: intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities. Development team level.

Level 3 – System: integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function points, effort estimation, data access control, SDK versioning, calibration across technologies. IT organization level.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

These defects are the primary cause of operational problems and take 20x as many fixes to correct.

Architecturally complex defects are 8% of total application defects but 52% of total repair effort; code unit-level violations are the remaining 92% of defects and 48% of repair effort.

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 342 in Component A is only narrowly propagated in the system – it presents low risk.

The impact of violation 284 in Component B is very widely propagated throughout the system – it presents very high risk.

Remediating violation 284 in Component B will have the greater impact on reducing risk and cost.

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path. Transactions can be ranked by risk to prioritize their remediation. The Transaction Risk Index can be further tuned using the operational frequency of each transaction.

(Diagram: transactions 1-5 running from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer; transaction 1 poses the greatest risk.)

Risk-Based Prioritization

Ranking criteria: health factor, severity, Propagated Risk Index, presence in a high-risk transaction, frequency of path execution.

Violation   Health Factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3
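A sketch of how such a composite ranking might be computed. The scoring scheme, weights, and data below are assumptions for illustration, not CAST's actual algorithm:

```python
# Rank violations by a composite risk score combining severity,
# propagated-risk breadth, presence on a high-risk transaction, and
# path execution frequency. Weights and data are illustrative.
violations = [
    # (id, health factor, severity 1-9, propagation 0-1, on hot path, executions/day)
    ("148", "Security",        9, 0.80, True,  5000),
    ("371", "Robustness",      9, 0.30, False,  800),
    ("062", "Performance",     8, 0.60, True,  5000),
    ("284", "Robustness",      7, 0.90, False,  200),
    ("173", "Transferability", 4, 0.20, False,  100),
]

def risk_score(v):
    _id, _factor, severity, propagation, hot_path, execs = v
    score = severity * (1 + propagation)   # severity scaled by propagation breadth
    if hot_path:
        score *= 1.5                       # boost violations on high-risk transactions
    return score * execs                   # weight by how often the path runs

for v in sorted(violations, key=risk_score, reverse=True):
    print(f"{v[0]}  {v[1]:<15s}  score={risk_score(v):,.0f}")
```

Whatever the exact weights, the point of the slide holds: combining the criteria puts widely propagated, frequently executed violations at the top of the remediation queue.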

Static Analysis in US DoD

Testing finds less than 1/3 of structural defects. Static analysis before testing is 85% efficient. With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

(Chart: production incidents, 0-180, against Technical Quality Index, 100-400, for business-critical applications at a large international investment bank. Applications with low scores showed $2,000,000+ annual losses; mid-range scores, $50,000; high scores, $5,000.)

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor                   # apps   Robust   Security   Perform   Change   Transfer
Technology (e.g. Java-EE)  1606     20%        3%        16%       26%       7%
Industry                    677      5%        5%         7%        4%       6%
App type                    510      1%        1%         1%        1%       1%
Maturity                     72     28%       25%        15%       24%      12%

Other factors (values across health factors): Source, 580 apps: 1, 1, 4; Shore, 540 apps; Method, 208 apps: 10, 6, 14, 6; Team size, 186 apps: 7, 5, 8, 8; # of users, 262 apps: 4, 3, 5, 4.

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release – a component of the cost of ownership. It arises from structural quality problems in production code.

Principal – cost of fixing problems remaining in the code after release that must be remediated.

Interest – continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business risk (liability from debt, opportunity cost):
• Liability – business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost – benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
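Curtis et al. (2012) estimate the principal from counts of violations, the fraction expected to be fixed, and the cost of a fix. A simplified sketch of that style of calculation; the counts, fix rates, hours, and labor rate below are illustrative assumptions, not CAST's calibration:

```python
# Simplified technical-debt-principal estimate in the style of
# Curtis, Sappidi & Szynkarski (2012): sum over severity bands of
#   violations * fraction-you-will-actually-fix * hours-per-fix * $/hour.
# All numbers below are illustrative assumptions.
RATE = 75.0  # assumed blended $/hour

bands = [
    # (severity, violation count, fraction to fix, hours per fix)
    ("high",   120, 0.50, 2.5),
    ("medium", 400, 0.25, 1.0),
    ("low",    900, 0.10, 0.5),
]

principal = sum(count * frac * hrs * RATE for _, count, frac, hrs in bands)
print(f"estimated technical debt principal: ${principal:,.0f}")
```

The structure, not the constants, is the point: principal scales with the violations you actually intend to remediate, which is why risk-based prioritization of violations matters.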

Tech Debt by Quality Factor

Transferability 40%, changeability 30%, robustness 18%, security 7%.

70% of technical debt is in IT cost (transferability, changeability); 30% of technical debt is in business risk (robustness, performance, security).

Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models – enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice Hall.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort — Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3




Click to edit Master title style 4 Understand Variation

23

Product factors bull timing constraints bull memory utilization bull CPU occupancy bull resource constraints bull complexity bull product size

Project factors bull concurrent development bull development computer bull requirements definition bull requirements churn bull programming practices bull personnel experience bull client experience bull client participation

Project 49

Product 16

Not explained

35

R2product = 16

R2project = 49

R2

model = 65

J Vosburgh B Curtis et al (1984) Proceedings of ICSE 1994

Click to edit Master title style 5 Investigate Distributions

Low Medium High

Rel

ativ

e p

rod

uct

ivit

y (

of m

ean

)

Software Engineering Practices Usage

-100

-50

0

50

100

150

200

250

Click to edit Master title style 6 Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws

Click to edit Master title style 7 Caution for External Data Sets

26

DACS Us

Productivity

Siz

e Data definitions

Data collection

Data validity

Data age

Demographics

Click to edit Master title style

1) Tracking traditional projects

2) Iterative and Agile measurement

3) Team Software Process measurement

33

Section 2 Measuring to Manage

Software Projects

Click to edit Master title style Measuring Project Progress

300K 250K 200K 150K 100K 50K 0K

Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoThe measures say we are on schedulerdquo

300K

250K

200K

150K

100K

50K

0K Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoBut the code keeps growing We should have measured the requirements growthrdquo

50

100

150

Defect reports

Total

Open

Closed

125

75

25

Jan Feb Mar Apr May Jun

Completion of coding

ldquoAnd all the defect reports remaining to be fixed will keep us far behind schedulerdquo

Click to edit Master title style Requirements Change Impact

Number of requirements

affected

Chart6

Requirements Added
Requirements Changed
Requirements Deleted
Person hours of work or rework
2
20
1
2
11
1
45
35
3
15
61
1
1
20
4
15
15
0
18
05
2

Sheet1

Sheet1

Requirements Added
Requirements Changed
Requirements Deleted
Person hours of work or rework

Sheet2

Sheet3

Click to edit Master title style Phase Defect Removal Model

PHASE

Escapes Previous Phase KSLOC

Defect Injection KSLOC

Subtotal KSLOC

Removal Effective-ness

Defect Removal KSLOC

Escapes at phase exit KSLOC

Rqmts 0 12 12 0 0 12

Design 12 180 192 76 146 46

Code amp UT 46 154 200 71 142 58

Inte-gration 58 0 58 67 39 19

Sys Test 19 0 19 58 11 08

Opera-tion 08

Kan (1999) Metrics and Model in Software Quality Engineering

SCRUM Board

• The Scrum board displays the path to completion for each story by showing task status.
• Story points are an estimate of days for each story.
• Agile estimating is by feature (story) rather than by task.
• 'Done' does not always mean all tasks are finished.
• Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

• A burndown chart displays story completion by day.
• The burndown chart indicates actual versus estimated progress.
• Velocity is the rate at which stories are completed.
• Velocity indicates sustainable progress.
• Velocity results provide one source of historical data for improving estimates.
• Story points without the context of productivity factors are dangerous for estimating.
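As a concrete illustration of velocity-based forecasting (all numbers below are hypothetical, not from the slides):

```python
import math

# Velocity and burndown arithmetic (illustrative numbers).
completed = [18, 22, 20, 24]   # story points finished in each sprint so far
release_scope = 120            # story points planned for the release

velocity = sum(completed) / len(completed)   # average points per sprint

# Burndown: points remaining at the end of each sprint.
remaining = [release_scope]
for done in completed:
    remaining.append(remaining[-1] - done)

# Naive forecast: sprints still needed at the historical velocity.
sprints_left = math.ceil(remaining[-1] / velocity)
print(f"velocity={velocity}, remaining={remaining[-1]}, "
      f"forecast={sprints_left} more sprints")
```

As the slide warns, such a forecast is only as good as the stability of the productivity factors behind the velocity.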

Team Velocity: Analyzing Velocity

[Charts over a 20-day period: story points per hour; hours per story point (Hrs/StryPt); item cycle times for stories, hours, and hours per story point; and a burndown chart]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in progress, and released items over time; the horizontal gap between the bands shows days to delivery, with outliers marked]

Anderson (2010), Kanban
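A cumulative flow diagram is just daily counts per state; from them one can approximate average days to delivery with Little's law (an addition of ours, not stated on the slide; the daily counts are hypothetical):

```python
# Reading a cumulative flow diagram numerically (illustrative daily counts).
# Each tuple is the count of items backlogged, in progress, and released
# at the end of a day.
days = [
    (30, 4, 0),
    (28, 5, 1),
    (26, 5, 3),
    (24, 6, 5),
    (22, 6, 8),
]

avg_wip = sum(in_prog for _, in_prog, _ in days) / len(days)
throughput = days[-1][2] / (len(days) - 1)   # items released per day
# Little's law: average days to delivery ~ average WIP / throughput.
days_to_delivery = avg_wip / throughput
print(f"WIP={avg_wip:.1f}, throughput={throughput:.1f}/day, "
      f"days to delivery ~ {days_to_delivery:.1f}")
```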

Trends Over Sprints

[Charts: story points delivered (0 to 80) and technical debt (0 to 4) across 12 sprints]

• Track and correlate multiple measures across sprints
• Predictive models

Sources and Measures

Measures:
• Stories submitted
• Stories pulled back
• Growth in size of stories
• Stories added mid-sprint
• Stories under review
• Failed tests
• Non-development tasks
• Assists to other projects

Data sources:
• Project tracking
• Code control
• Bug tracking
• Build system
• Test environment
• Deployment tools
• Operational monitoring

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP0: current process, time recording, defect recording, defect type standards
• PSP0.1: coding standard, size measurement, process improvement proposal

Personal Planning
• PSP1: size estimating, test report
• PSP1.1: task planning, schedule planning

Personal Quality Management
• PSP2: code reviews, design reviews
• PSP2.1: design templates

Cyclic Personal Process
• PSP3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals. R²s for the aggregated individual estimates range from .70 to .87, versus R² = .53 for the single averaged team estimate.
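The slide's point can be illustrated with a toy calculation (developer names, histories, and hours are hypothetical): calibrating each developer's estimates against that developer's own history and then summing gives a different forecast than applying one team-average correction to everyone.

```python
# Two ways to turn task estimates into a plan (hypothetical data, in hours).
history = {                          # developer -> [(estimated, actual), ...]
    "dev_a": [(10, 11), (20, 22)],   # slight underestimator
    "dev_b": [(10, 16), (20, 31)],   # chronic underestimator
}
next_release = {"dev_a": 40, "dev_b": 20}   # estimated hours of new work

def aggregated_forecast():
    """Calibrate each developer against their own history, then sum."""
    total = 0.0
    for dev, estimate in next_release.items():
        ests, acts = zip(*history[dev])
        total += estimate * (sum(acts) / sum(ests))
    return total

def averaged_forecast():
    """Apply a single team-wide correction factor to everyone."""
    ests = [e for pairs in history.values() for e, _ in pairs]
    acts = [a for pairs in history.values() for _, a in pairs]
    return sum(next_release.values()) * (sum(acts) / sum(ests))

print(aggregated_forecast(), averaged_forecast())
```

The aggregated forecast reflects who is actually doing the work; the averaged one hides each person's stable bias.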

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Where defects are found: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%.

Fagan & Davis (2005)

TSP Benefits

Dramatic reductions in:
• Delivered defects
• Days per KLOC
• Schedule deviation
• Test defects/KLOC
• Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing: innovation management
Level 4 Quantitatively Managed: capability management
Level 3 Defined: process management
Level 2 Managed: project management
Level 1 Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5 Innovating: innovation management
Level 4 Optimized: capability management
Level 3 Standardized: organizational management
Level 2 Stabilized: work unit management
Level 1 Initial: inconsistent management

Maturity Level Transformation

Level 5: Proactive, opportunistic improvements (empowered, agile culture)
Level 4: Process and results managed statistically (precision culture)
Level 3: Organization develops standard processes (common culture)
Level 2: Project managers stabilize projects (local culture)
Level 1: Ad hoc, inconsistent development practices (relationship culture)

[Diagram labels: Individual, Projects, Organization; Trust]

Measurement by Level

Level 5: Exploratory data (Δt = N × ROI)
Level 4: Predictive data (x̄ ± 1.96σ/√n)
Level 3: Process data (8.3 defects per review)
Level 2: Management data (Effort = α·LOC^β)
Level 1: Haphazard data ("about a million lines")

A measure-maturity mismatch slows improvement and creates dysfunction.
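The Level 2 and Level 4 example measures can be computed directly; the calibration constants and review samples below are hypothetical:

```python
import math

# Level 2, management data: a power-law effort model, Effort = a * Size**b.
a, b = 3.2, 1.05                 # hypothetical calibration constants
size_ksloc = 50
effort_pm = a * size_ksloc ** b  # person-months

# Level 4, predictive data: 95% confidence interval, mean +/- 1.96*sd/sqrt(n).
samples = [8.1, 7.4, 9.0, 8.6, 7.9, 8.3]   # e.g., defects found per review
n = len(samples)
mean = sum(samples) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))
half = 1.96 * sd / math.sqrt(n)
print(f"effort ~ {effort_pm:.1f} PM; defects/review = {mean:.2f} +/- {half:.2f}")
```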

Average Improvement Results

Improvement benefit       | Orgs | Annual median | Annual range
Productivity growth       | 4    | 35%           | 9-67%
Pre-test defect detection | 3    | 22%           | 6-25%
Time to market            | 2    | ↓19%          | 15-23%
Field error reports       | 5    | ↓39%          | 10-94%
Return on investment      | 5    | 5.0:1         | 4.1:1-8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed ÷ budgeted cost of work scheduled; below 1.0 is behind schedule, above 1.0 is ahead), scale 0 to 1.6, by CMM maturity levels 1 to 3]

Air Force study of contractors: Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (budgeted cost of work performed ÷ actual cost of work performed; below 1.0 is an overrun, above 1.0 an underrun), scale 0 to 2.5, by CMM maturity levels 1 to 3]

Air Force study of contractors: Flowe & Thordahl (1994)
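The two indexes in these charts are simple earned-value ratios; a minimal sketch with made-up monthly figures:

```python
# Earned-value indexes behind the two charts (illustrative figures, $).
bcws = 400_000   # budgeted cost of work scheduled (the plan)
bcwp = 360_000   # budgeted cost of work performed (what got done, at budget)
acwp = 450_000   # actual cost of work performed (what it really cost)

spi = bcwp / bcws   # schedule performance index: <1 behind, >1 ahead
cpi = bcwp / acwp   # cost performance index: <1 overrun, >1 underrun
print(f"SPI={spi:.2f}, CPI={cpi:.2f}")
```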

Crosby's Cost of Quality Model

Project Cost = Performance + Cost of Quality

Performance: generating plans and documentation; development (requirements, design, code, integration)

Cost of Quality = Conformance + Nonconformance

Conformance
• Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits
• Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks

Dion (1993); Crosby (1979), Quality Is Free
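Rolling project cost lines up into Crosby's categories is straightforward bookkeeping; a sketch with hypothetical cost lines (in $K):

```python
# Hypothetical project cost lines grouped into Crosby's categories ($K).
costs = {
    "performance":    {"development": 900, "plans & documentation": 100},
    "appraisal":      {"reviews": 120, "first-time testing": 180},
    "prevention":     {"training": 40, "tools": 60, "root cause analysis": 25},
    "nonconformance": {"fixing defects": 210, "re-tests": 90, "patches": 45},
}

total = sum(sum(lines.values()) for lines in costs.values())
conformance = sum(costs["appraisal"].values()) + sum(costs["prevention"].values())
nonconformance = sum(costs["nonconformance"].values())
cost_of_quality = conformance + nonconformance
print(f"cost of quality = {cost_of_quality} of {total} "
      f"({100 * cost_of_quality / total:.0f}% of project cost)")
```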

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Percent of project cost by year:

Year | Level | Performance | Nonconformance | Appraisal | Prevention
1988 | 1     | 34          | 41             | 15        | 7
1990 | 2     | 55          | 18             | 15        | 12
1992 | 3     | 66          | 11             | 23 (appraisal + prevention combined)
1994 | 4     | 76          | 6              | 18 (combined; appraisal 7.5)

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, scale 0 to 4, 1988 through 1994]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, scale 0 to 1.2, 1988 through 1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual cost vs. budgeted cost by quarter, 1988 through 1994, showing the over-run/under-run band narrowing from +50% toward -10%]

Haley (1995)

OMRON's Reuse Gains

[Chart, 1987 through 1994: program size in steps (0 to 3,000) and percent of reuse (0 to 100%)]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." (Watts Humphrey)

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

• Process focus: process improvement via Six Sigma
• Product focus: product improvement via Design for 6σ

A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, then large, good, okay, and small benefits... now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." (Jackson, D., 2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." (Spinellis, D., 2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality measurements and example detected violations:
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages

Modern Apps are a Technology Stack

[Diagram: a transaction's data flow crosses a stack of technologies (JSP, ASP.NET, APIs, Java, Web Services, Struts, Spring, Hibernate, .NET, EJB, PL/SQL, T-SQL, COBOL, messaging) and databases (Oracle, SQL Server, DB2, Sybase, IMS), assessed for architectural compliance and transaction risk]

Level 1: Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards

Level 2: Technology (development team level), single language/technology layer: intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities

Level 3: System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function points and effort estimation, data access control, SDK versioning, calibration across technologies

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Require 20x as many fixes to correct

[Pie charts: architecturally complex defects are 8% of total application defects, versus 92% for code unit-level violations, yet account for 52% of total repair effort, versus 48% for unit-level violations]

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 342 in Component A is only narrowly propagated in the system; it presents low risk. The impact of violation 284 in Component B is very widely propagated throughout the system; it presents very high risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
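A minimal sketch of the idea (the dependency graph, violation IDs, and severities here are hypothetical, and CAST's actual index is more elaborate): score each violation by its severity times the number of components its host component can transitively impact.

```python
from collections import deque

# component -> components that depend on (are impacted by) it
impacts = {
    "A": ["UI1"],
    "B": ["SvcX", "SvcY"],
    "SvcX": ["UI1", "UI2"],
    "SvcY": ["UI2", "UI3"],
    "UI1": [], "UI2": [], "UI3": [],
}

def reach(component):
    """Count all components transitively impacted by `component`."""
    seen, queue = set(), deque([component])
    while queue:
        for dependent in impacts[queue.popleft()]:
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return len(seen)

violations = [("v342", "A", 7), ("v284", "B", 7)]  # (id, component, severity)
ranked = sorted(violations, key=lambda v: v[2] * reach(v[1]), reverse=True)
print([v[0] for v in ranked])  # v284 propagates more widely, so it ranks first
```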

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

• Customers care most about the dependability of their transactions.
• The risk of a transaction is determined by the number and severity of violations along its path: entry/exit in the user interface layer, processing in the business logic layer, and fulfillment in the data storage layer.
• Transactions can be ranked by risk to prioritize their remediation.
• The Transaction Risk Index can be further tuned using the operational frequency of each transaction.

[Diagram: five transaction paths through the layers; transaction 1 poses the greatest risk]
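The ranking reduces to summing violation severities along each path, optionally weighted by operational frequency; a sketch with hypothetical components and transactions:

```python
# component -> severities of the violations found in it (hypothetical data)
violations_by_component = {
    "UI": [3], "Logic": [9, 5], "DAO": [7], "DB": [],
}
transactions = {
    "checkout": ["UI", "Logic", "DAO", "DB"],
    "browse":   ["UI", "DB"],
}

def risk(path, frequency=1.0):
    """Sum violation severities along a path; tune by operational frequency."""
    base = sum(sum(violations_by_component[c]) for c in path)
    return base * frequency

ranked = sorted(transactions, key=lambda t: risk(transactions[t]), reverse=True)
print(ranked)  # the checkout path carries the higher risk
```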

Risk-Based Prioritization

Violations are ranked by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation | Health Factor   | Severity
148       | Security        | 9
371       | Robustness      | 9
062       | Performance     | 8
284       | Robustness      | 7
016       | Changeability   | 7
241       | Security        | 5
173       | Transferability | 4
359       | Transferability | 3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter plot: production incidents (0 to 180) vs. Technical Quality Index (100 to 400) for business-critical applications at a large international investment bank; annual losses fall from $2,000,000+ at low TQI to $50,000 and then $5,000 at higher TQI]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (columns: Robustness, Security, Performance, Changeability, Transferability):

Technology (e.g., Java-EE): 1606 apps; 20, 3, 16, 26, 7
Industry: 677 apps; 5, 5, 7, 4, 6
App type: 510 apps; 1, 1, 1, 1, 1
Source: 580 apps; 1, 1, 4
Shore: 540 apps
Maturity: 72 apps; 28, 25, 15, 24, 12
Method: 208 apps; 10, 6, 14, 6
Team size: 186 apps; 7, 5, 8, 8
Number of users: 262 apps; 4, 3, 5, 4

2017 CRASH Report: request from info@castsoftware.com

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code constitute the debt.

• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Business risk liability from the debt:
  - Liability: business costs related to outages, breaches, corrupted data, etc.
  - Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
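One way to sketch the principal calculation, in the spirit of the Curtis et al. (2012) approach (the violation counts, fix fractions, hours, and hourly rate below are illustrative assumptions, not the paper's calibrated values):

```python
# Technical-debt principal: violations worth fixing x effort x labor rate.
violations = {"high": 120, "medium": 450, "low": 900}        # counts by severity
fix_fraction = {"high": 0.50, "medium": 0.25, "low": 0.10}   # share worth fixing
hours_to_fix = 1.0   # assumed average effort per violation
rate = 75            # assumed $ per hour

principal = sum(
    count * fix_fraction[sev] * hours_to_fix * rate
    for sev, count in violations.items()
)
print(f"technical-debt principal ~ ${principal:,.0f}")
```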

Tech Debt by Quality Factor

[Pie chart: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%]

• 70% of technical debt is in IT cost (transferability, changeability).
• 30% of technical debt is in business risk (robustness, performance, security).
• Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.



Click to edit Master title style 7 Caution for External Data Sets

26

DACS Us

Productivity

Siz

e Data definitions

Data collection

Data validity

Data age

Demographics

Click to edit Master title style

1) Tracking traditional projects

2) Iterative and Agile measurement

3) Team Software Process measurement

33

Section 2 Measuring to Manage

Software Projects

Click to edit Master title style Measuring Project Progress

300K 250K 200K 150K 100K 50K 0K

Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoThe measures say we are on schedulerdquo

300K

250K

200K

150K

100K

50K

0K Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoBut the code keeps growing We should have measured the requirements growthrdquo

50

100

150

Defect reports

Total

Open

Closed

125

75

25

Jan Feb Mar Apr May Jun

Completion of coding

ldquoAnd all the defect reports remaining to be fixed will keep us far behind schedulerdquo

Click to edit Master title style Requirements Change Impact

Number of requirements

affected

Chart6

Requirements Added
Requirements Changed
Requirements Deleted
Person hours of work or rework
2
20
1
2
11
1
45
35
3
15
61
1
1
20
4
15
15
0
18
05
2

Sheet1

Sheet1

Requirements Added
Requirements Changed
Requirements Deleted
Person hours of work or rework

Sheet2

Sheet3

Click to edit Master title style Phase Defect Removal Model

PHASE

Escapes Previous Phase KSLOC

Defect Injection KSLOC

Subtotal KSLOC

Removal Effective-ness

Defect Removal KSLOC

Escapes at phase exit KSLOC

Rqmts 0 12 12 0 0 12

Design 12 180 192 76 146 46

Code amp UT 46 154 200 71 142 58

Inte-gration 58 0 58 67 39 19

Sys Test 19 0 19 58 11 08

Opera-tion 08

Kan (1999) Metrics and Model in Software Quality Engineering

Click to edit Master title style SCRUM Board Scrumboard displays

the path to completion for each story by showing task status

Story points are an estimate of days for each story

Agile estimating is by feature (story) rather than by task

lsquoDonersquo does not always mean all tasks are finished

Kanban methods restrict the work in progress to a limited number of paths

Click to edit Master title style Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating

Click to edit Master title style Team Velocity

0

1

2

3

4

5

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20Days

Story Points per Hour

0

1

2

3

4

5

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

HrsStryPt

Analyzing Velocity

Item Cycle Times

02468

101214161820

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Stories Hours HrsStryPt

0

10

20

30

40

50

60

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Burndown Chart

Days

Click to edit Master title style Measuring and Managing Flow

Released

In progress

Backlogged

Cumulative Flow Diagram Days to delivery

outliers

Anderson (2010) Kanban

Trends Over Sprints
[Charts: story points delivered per sprint and technical debt per sprint, tracked over 12 sprints]
• Track and correlate multiple measures across sprints
• Predictive models
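Correlating measures across sprints, as the slide suggests, can start as simply as a Pearson correlation between two series. A sketch with invented sprint data, where a strongly negative r would suggest rising technical debt is eroding delivery:

```python
# Pearson correlation between two per-sprint measures, computed by hand.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

points_delivered = [70, 66, 60, 55, 52, 45, 40, 38]      # per sprint
technical_debt   = [1.0, 1.3, 1.8, 2.0, 2.4, 2.9, 3.3, 3.6]
print(round(pearson(points_delivered, technical_debt), 3))
```

With enough sprints of history, the same series feed the predictive models the slide mentions (e.g., a simple regression of points delivered on debt).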

Sources and Measures

Measures:
• Stories submitted
• Stories pulled back
• Growth in size of stories
• Stories added mid-sprint
• Stories under review
• Failed tests
• Non-development tasks
• Assists to other projects

Data sources:
• Project tracking
• Code control
• Bug tracking
• Build system
• Test environment
• Deployment tools
• Operational monitoring

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: current process, time recording, defect recording, defect type standards
• PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning
• PSP 1: size estimating, test report
• PSP 1.1: task planning, schedule planning

Personal Quality Management
• PSP 2: code reviews, design reviews
• PSP 2.1: design templates

Cyclic Personal Process
• PSP 3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)
• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Are Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates (R² values range from .70 to .87), rather than developing a single team estimate averaged over individuals (R² = .53).

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development; consequently, test changes from defect detection into correctness verification.

Defects found by activity: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%.

Fagan & Davis (2005)

TSP Benefits

Dramatic reductions in:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs
1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions
• Level 5, Optimizing: innovation management
• Level 4, Quantitatively Managed: capability management
• Level 3, Defined: process management
• Level 2, Managed: project management
• Level 1, Initial: inconsistent management

Chrissis, Konrad & Shrum (2011), CMMI

Enhanced Maturity Framework
• Level 5, Innovating: innovation management
• Level 4, Optimized: capability management
• Level 3, Standardized: organizational management
• Level 2, Stabilized: work unit management
• Level 1, Initial: inconsistent management

Maturity Level Transformation
• Level 1: ad hoc, inconsistent development practices (relationship culture)
• Level 2: project managers stabilize projects (local culture)
• Level 3: organization develops standard processes (common culture)
• Level 4: process and results managed statistically (precision culture)
• Level 5: proactive, opportunistic improvements (empowered, agile culture)
[Diagram: the transformation builds on trust and moves from individuals, to projects, to the organization]

Measurement by Level

Level 5 — Exploratory data: Δt = N × ROI
Level 4 — Predictive data: X̄ ± (1.96σ / √n)
Level 3 — Process data: 8.3 defects per review
Level 2 — Management data: Effort = α(LOC)^β
Level 1 — Haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
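The Level 2 and Level 4 formulas above can be made concrete. A sketch with invented coefficients and review data: a power-law effort model Effort = α·LOC^β, and a 95% interval X̄ ± 1.96σ/√n around a process measure such as defects per review.

```python
import math

# Level 2 management data: power-law effort model (a, b are illustrative,
# not calibrated coefficients from any real baseline).
def effort(kloc, a=3.2, b=1.05):
    return a * kloc ** b

# Level 4 predictive data: 95% interval around a process measure.
samples = [7.8, 8.1, 8.6, 8.3, 8.7, 8.9, 7.9, 8.4]   # defects per review
n = len(samples)
mean = sum(samples) / n
sigma = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))
half_width = 1.96 * sigma / math.sqrt(n)
print(round(effort(100), 1), round(mean, 2), round(half_width, 2))
```

Note the diseconomy of scale baked into b > 1: doubling size more than doubles estimated effort, which is the usual finding in effort models like COCOMO.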

Average Improvement Results

Improvement benefit         Orgs   Annual median   Annual range
Productivity growth           4        35%            9–67%
Pre-test defect detection     3        22%            6–25%
Time to market                2       ↓19%           15–23%
Field error reports           5       ↓39%           10–94%
Return on investment          5       5.0:1          4.1:1–8.8:1

Schedule Performance
[Chart: schedule performance index (budgeted cost of work performed ÷ budgeted cost of work scheduled) by CMM maturity level 1–3; values below 1 are behind schedule, above 1 ahead]
Air Force study of contractors: Flowe & Thordahl (1994)

Cost Performance
[Chart: cost performance index (budgeted cost of work performed ÷ actual cost of work performed) by CMM maturity level 1–3; values below 1 are overrun, above 1 underrun]
Air Force study of contractors: Flowe & Thordahl (1994)
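Both indices are standard earned-value ratios; a minimal sketch with invented dollar figures:

```python
# Earned-value indices from the two slides above.
def spi(bcwp, bcws):
    """Schedule performance index: <1 behind schedule, >1 ahead."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost performance index: <1 overrun, >1 underrun."""
    return bcwp / acwp

bcwp = 800_000    # budgeted cost of work performed (earned value)
bcws = 1_000_000  # budgeted cost of work scheduled
acwp = 950_000    # actual cost of work performed
print(round(spi(bcwp, bcws), 2), round(cpi(bcwp, acwp), 2))
```

In this example the project is both behind schedule (SPI = 0.80) and over budget (CPI < 1), the pattern the study found concentrated at lower maturity levels.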

Crosby's Cost of Quality Model

Project cost = performance cost + cost of quality.
• Performance: generating plans and documentation; development (requirements, design, code, integration)
• Conformance, appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits
• Conformance, prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting
• Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks

Dion (1993); Crosby (1979), Quality Is Free
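Cost-of-quality accounting reduces to tagging effort with a category and summing. A sketch with invented entries (the category names follow Crosby's model; the dollar figures are not Raytheon's):

```python
from collections import defaultdict

# Each entry is (cost-of-quality category, cost in $k); data is illustrative.
entries = [
    ("performance",    340), ("performance",    120),
    ("appraisal",       90), ("prevention",      40),
    ("nonconformance", 210), ("nonconformance",  70),
]

totals = defaultdict(float)
for category, cost in entries:
    totals[category] += cost

total = sum(totals.values())
for category, cost in sorted(totals.items()):
    print(f"{category:15s} {cost:6.0f}  {100 * cost / total:5.1f}%")
```

Tracking the nonconformance share over time is exactly how the Raytheon results on the following slides were produced: rework shrinks as maturity rises.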

Raytheon's Cost of Quality
• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Percent of project cost:
Year   Level   Perform.   Nonconf.   Appraise   Prevent
1988     1       34         41         15          7
1990     2       55         18         15         12
1992     3       66         11          –          –
1994     4       76          6         7.5         –

Dion (1993); Haley (1995)

Raytheon's Productivity
[Chart: productivity growth relative to the 1989 baseline, 1988–1994]
Lydon (1995)

Raytheon's Cost Reduction
[Chart: cost reduction relative to the 1989 baseline, 1988–1994]
Lydon (1995)

Raytheon's Cost Predictability
[Chart: quarterly deviation of actual from budgeted cost (−10% to +50%), 1988–1994; early over-runs converge toward zero as maturity increases]
Haley (1995)

OMRON's Reuse Gains
[Chart: program size in steps (0–3,000) and percent reuse (0–100%), 1987–1994]
Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt
1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." (Watts Humphrey)

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus → process improvement → Six Sigma.
Product focus → product improvement → Design for 6σ.
Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first the huge ones, then large, good, okay, and finally small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded…testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." (Jackson, 2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic…non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability…The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." (Spinellis, 2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application metadata, via language parsers for Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/NetWeaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Quality measurements, with example detected violations:
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Modern Apps are a Technology Stack

[Diagram: a transaction flows through JSP/ASP.NET/API front ends, Java and web-services layers, frameworks (EJB, Hibernate, Spring, Struts, .NET), COBOL and messaging back ends, and databases (Oracle, SQL Server, DB2, Sybase, IMS, PL/SQL, T-SQL); data flow and transaction risk cross layers, with architectural compliance spanning the stack]

• Level 1, Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards
• Level 2, Technology (development team level; single language/technology layer): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities
• Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function points and effort estimation, data access control, SDK versioning, calibration across technologies

Architecturally Complex Defects

An architecturally complex defect is a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Require roughly 20x as many fixes to correct
• Only 8% of total application defects (code unit-level violations are the other 92%), yet 52% of total repair effort (48% for unit-level violations)
• Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

• The impact of violation #342 in Component A is only narrowly propagated in the system; it presents low risk.
• The impact of violation #284 in Component B is very widely propagated throughout the system; it presents very high risk.
• Remediating violation #284 in Component B will therefore have the greater impact on reducing risk and cost.
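A propagated-risk ranking can be sketched as severity weighted by how far a violation's component reaches in the dependency graph. The graph, severities, and scoring formula below are invented for illustration; the deck does not give CAST's actual index computation.

```python
# Toy propagated-risk ranking: risk = severity * reachable components.
def reachable(graph, start):
    """Depth-first search: all components reachable from `start`."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

calls = {"B": ["C", "D"], "C": ["E", "F"], "D": ["F"], "A": ["E"]}
violations = [("342", "A", 7), ("284", "B", 7)]   # (id, component, severity)

ranked = sorted(
    ((sev * len(reachable(calls, comp)), vid) for vid, comp, sev in violations),
    reverse=True,
)
print(ranked)
```

Although both violations have the same severity here, the one in the widely-connected component ranks far higher, which is the slide's point.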

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.

• Customers care most about the dependability of their transactions.
• The risk of a transaction is determined by the number and severity of violations along its path.
• Transactions can be ranked by risk to prioritize their remediation.
• The Transaction Risk Index can be further tuned using the operational frequency of each transaction.
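The definition above translates directly into code. A sketch with invented components and paths: sum the violation severities along each transaction's path, optionally weighted by operational frequency.

```python
# Toy transaction risk index: severity along the path, weighted by frequency.
def transaction_risk(path_components, violations, frequency=1.0):
    return frequency * sum(sev for comp, sev in violations if comp in path_components)

violations = [("ui", 9), ("logic", 8), ("logic", 5), ("db", 7), ("cache", 4)]
transactions = {
    "t1": ({"ui", "logic", "db"}, 0.6),   # (components on path, run frequency)
    "t2": ({"ui", "cache"},       0.4),
}

ranked = sorted(
    transactions,
    key=lambda t: transaction_risk(*transactions[t][:1], violations, transactions[t][1]),
    reverse=True,
)
print(ranked)   # highest-risk transaction first
```

Setting `frequency=1.0` everywhere gives the untuned ranking; plugging in operational frequencies gives the tuned index the slide describes.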

Risk-Based Prioritization

Violations are prioritized by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health factor     Severity
  #148      Security             9
  #371      Robustness           9
  #062      Performance          8
  #284      Robustness           7
  #016      Changeability        7
  #241      Security             5
  #173      Transferability      4
  #359      Transferability      3

Static Analysis in US DoD
• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents and Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter plot: production incidents versus Technical Quality Index (1.00–4.00) for business-critical applications at a large international investment bank; low-scoring applications show annual losses above $2,000,000, mid-range applications roughly $50,000, and high-scoring applications roughly $5,000]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com). Values per health factor: Robustness / Security / Performance / Changeability / Transferability; blank cells are blank in the source.

• Technology (1,606 apps): 20 / 3 / 16 / 26 / 7 (largest technology group: Java-EE)
• Industry (677): 5 / 5 / 7 / 4 / 6
• App type (510): 1 / 1 / 1 / 1 / 1
• Source (580): 1, 1, 4 (remaining cells blank)
• Shore (540): all cells blank
• Maturity (72): 28 / 25 / 15 / 24 / 12
• Method (208): 10, 6, 14, 6 (one cell blank)
• Team size (186): 7, 5, 8, 8 (one cell blank)
• Number of users (262): 4, 3, 5, 4 (one cell blank)

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. The debt consists of structural quality problems in production code.

• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Liability (business risk): business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
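The principal, as defined above, is just remaining violations times the cost to remediate each, summed by health factor. A sketch with invented counts and unit costs (not the coefficients from Curtis et al.):

```python
# Toy technical-debt principal: violations remaining at release * fix cost.
fix_cost_usd = {      # assumed average remediation cost per violation
    "security": 500, "robustness": 400,
    "changeability": 250, "transferability": 150,
}
remaining = {         # violations remaining in the released code
    "security": 120, "robustness": 300,
    "changeability": 900, "transferability": 1400,
}

principal = sum(remaining[f] * fix_cost_usd[f] for f in remaining)
print(f"technical debt principal: ${principal:,}")
```

Real estimates also discount by the fraction of violations an organization actually intends to fix; that fraction would multiply each term here.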

Tech Debt by Quality Factor

Transferability 40%, changeability 30%, robustness 18%, security 7%.
• 70% of technical debt is in IT cost (transferability, changeability).
• 30% of technical debt is in business risk (robustness, performance, security).
• Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1
• Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.
• Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations Through Measurement. Heidelberg: Springer.
• Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.
• Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.
• Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.
• Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.
• Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.
• Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.
• Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.
• Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.
• Dyne, K. (1999). ESSI: Benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.
• Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2
• Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
• Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.
• Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
• Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.
• Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.
• George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
• Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
• Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.
• Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.
• Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.
• Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).
• Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.
• Keeni, B. (2000). IEEE Software, 16 (4).

References 3
• Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.
• Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
• McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.
• Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.
• Peterman, W. (2000). IEEE Software, July/August issue.
• Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.
• Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.
• Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
• Schwaber, K. & Beedle, M. (2002). Agile Software Development Using Scrum.
• Spinellis, D. (2006). Code Quality. Addison-Wesley.
• Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
• Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

Click to edit Master title style Phase Defect Removal Model

PHASE

Escapes Previous Phase KSLOC

Defect Injection KSLOC

Subtotal KSLOC

Removal Effective-ness

Defect Removal KSLOC

Escapes at phase exit KSLOC

Rqmts 0 12 12 0 0 12

Design 12 180 192 76 146 46

Code amp UT 46 154 200 71 142 58

Inte-gration 58 0 58 67 39 19

Sys Test 19 0 19 58 11 08

Opera-tion 08

Kan (1999) Metrics and Model in Software Quality Engineering

Click to edit Master title style SCRUM Board Scrumboard displays

the path to completion for each story by showing task status

Story points are an estimate of days for each story

Agile estimating is by feature (story) rather than by task

lsquoDonersquo does not always mean all tasks are finished

Kanban methods restrict the work in progress to a limited number of paths

Click to edit Master title style Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating

Click to edit Master title style Team Velocity

0

1

2

3

4

5

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20Days

Story Points per Hour

0

1

2

3

4

5

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

HrsStryPt

Analyzing Velocity

Item Cycle Times

02468

101214161820

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Stories Hours HrsStryPt

0

10

20

30

40

50

60

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Burndown Chart

Days

Click to edit Master title style Measuring and Managing Flow

Released

In progress

Backlogged

Cumulative Flow Diagram Days to delivery

outliers

Anderson (2010) Kanban

Click to edit Master title style

0

10

20

30

40

50

60

70

80

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Sto

ry p

oin

ts d

eliv

ered

Trends Over Sprints

0

10

20

30

40

50

60

70

80

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Sto

ry p

oin

ts d

eliv

ered

0

05

1

15

2

25

3

35

4

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Tech

nic

al d

ebt

Track and correlate multiple measures

across sprints

Predictive models

Click to edit Master title style Sources and Measures

Stories submitted

Stories pulled back

Growth in size of stories

Stories added mid-sprint

Stories under review

Failed tests

Non-development tasks

Assists to other projects

Data sources

Measures Project tracking

Code control

Bug tracking

Build system

Test environment

Deployment tools

Operational monitoring

Click to edit Master title style Recommended Books

Click to edit Master title style Personal Software Process (PSP)

PSP 0 Current process Time recording

Defect recording Defect type standards

PSP 01 Coding standard

Size measurement Process improvement

proposal

Baseline Personal Process

Humphrey (1997) Introduction to the Personal Software Process

PSP 1 Size estimating

Test report

Personal Planning

PSP 2 Code reviews

Design reviews

Personal Quality Management

PSP 21 Cyclic development

Cyclic Personal Process

PSP 11 Task planning

Schedule planning

PSP 21 Design templates

Click to edit Master title style Team Software Process (TSP) bull Built from personal processes of team members

bull Well defined team roles

bull Project launch workshops for planning

bull Team measurement and tracking

bull Post-mortems for improvement

bull Team application of lean principles

Humphrey (2000) Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates (R²s range from .70 to .87), rather than developing a single team estimate averaged over individuals (R² = .53).

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defects found by activity: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 1.4%

Fagan (2005)

TSP Benefits

Dramatic reductions:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006) TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing – Innovation management
Level 4 Quantitatively Managed – Capability management
Level 3 Defined – Process management
Level 2 Managed – Project management
Level 1 Initial – Inconsistent management

Chrissis, Konrad & Shrum (2005) CMMI

Enhanced Maturity Framework

Level 5 Innovating – Innovation management
Level 4 Optimized – Capability management
Level 3 Standardized – Organizational management
Level 2 Stabilized – Work unit management
Level 1 Initial – Inconsistent management

Maturity Level Transformation

Level 5 – Proactive, opportunistic improvements: empowered, agile culture
Level 4 – Process and results managed statistically: precision culture
Level 3 – Organization develops standard processes: common culture
Level 2 – Project managers stabilize projects: local culture
Level 1 – Ad hoc, inconsistent development practices: relationship culture

Practices take root first with individuals, then across projects, then organization-wide; trust grows with each transformation.

Measurement by Level

Level 5 – Exploratory data: Δt = N × ROI
Level 4 – Predictive data: X̄ ± (1.96σ / √n)
Level 3 – Process data: 8.3 defects per review
Level 2 – Management data: Effort = α(LOC)^β
Level 1 – Haphazard data: "about a million lines"

A measure–maturity mismatch slows improvement and creates dysfunction.
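The Level 4 entry is the standard large-sample 95% confidence interval for a process mean. A quick sketch of the arithmetic (the review-yield numbers are illustrative, not from the slides):

```python
import math

def confidence_interval(xbar, sigma, n, z=1.96):
    """Return the 95% confidence interval: xbar ± z * sigma / sqrt(n)."""
    half_width = z * sigma / math.sqrt(n)
    return (xbar - half_width, xbar + half_width)

# e.g. a mean of 8.3 defects per review, sigma 2.0, across 25 reviews
low, high = confidence_interval(8.3, 2.0, 25)
print(round(low, 2), round(high, 2))  # 7.52 9.08
```

At Level 4 an organization can state not just what its review yield is, but how much that yield should be expected to vary.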

Average Improvement Results

Improvement                  Orgs   Annual median   Annual range
Productivity growth            4        35%            9–67%
Pre-test defect detection      3        22%            6–25%
Time to market                 2       ↓19%           15–23%
Field error reports            5       ↓39%           10–94%
Return on investment           5       5.0:1       4.1:1–8.8:1

Schedule Performance

Schedule performance index = budgeted cost of work performed ÷ budgeted cost of work scheduled (below 1 = behind schedule, above 1 = ahead)

[Chart: schedule performance index (0–1.6) by CMM maturity levels 1–3]

Air Force study of contractors
Flowe & Thordahl (1994)

Cost Performance

Cost performance index = budgeted cost of work performed ÷ actual cost of work performed (below 1 = overrun, above 1 = underrun)

[Chart: cost performance index (0–2.5) by CMM maturity levels 1–3]

Air Force study of contractors
Flowe & Thordahl (1994)
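Both indices are the standard earned-value ratios; a minimal sketch (the project figures are hypothetical):

```python
def spi(bcwp, bcws):
    """Schedule performance index: budgeted cost of work performed over
    budgeted cost of work scheduled (<1 = behind schedule, >1 = ahead)."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost performance index: budgeted cost of work performed over
    actual cost of work performed (<1 = overrun, >1 = underrun)."""
    return bcwp / acwp

# A project that planned $100k of work, earned $80k of it,
# and spent $120k doing so is both behind schedule and over budget.
print(spi(80_000, 100_000))  # 0.8
print(cpi(80_000, 120_000))
```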

Crosby's Cost of Quality Model

Project cost = Performance + Cost of Quality

Performance: generating plans and documentation; development – requirements, design, code, integration

Cost of Quality = Conformance (Appraisal + Prevention) + Nonconformance

Appraisal: reviews (system, requirements, design, test plan, test scripts), walkthroughs, testing (first time), independent verification and validation (first time), audits

Prevention: training, policies, procedures and methods, tools, planning, quality improvement projects, data gathering and analysis, root cause analysis, quality reporting

Nonconformance: fixing defects, reworking any document, updating source code, re-reviews, re-tests, lab costs for re-tests, patches (internal and external), engineering changes, change control boards, external failures and fines, customer support, help desks

Dion (1993); Crosby (1979) Quality Is Free
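Crosby's split becomes operational once effort records are tagged with one of the categories above; a sketch (category names from the model, hours hypothetical):

```python
from collections import defaultdict

def cost_split(records):
    """records: iterable of (category, hours).
    Returns each category's share of total effort as a percentage."""
    totals = defaultdict(float)
    for category, hours in records:
        totals[category] += hours
    grand = sum(totals.values())
    return {cat: round(100 * hrs / grand, 1) for cat, hrs in totals.items()}

# shape echoes Raytheon's 1990 distribution on the next slide
split = cost_split([("performance", 55), ("nonconformance", 18),
                    ("appraisal", 15), ("prevention", 12)])
coq = split["nonconformance"] + split["appraisal"] + split["prevention"]
print(split, coq)  # cost of quality = 45.0% of project cost
```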

Raytheon's Cost of Quality

• Performance – cost of building it right the first time
• Nonconformance – cost of rework
• Appraisal – cost of testing
• Prevention – cost of preventing nonconformance

Percent of project cost by year:

Year   Level   Performance   Nonconformance   Appraisal + Prevention
1988     1         34              41                 15 + 7
1990     2         55              18                 15 + 12
1992     3         66              11                 23 (combined)
1994     4         76               6                 18 (combined; prevention 7.5)

Dion (1993), Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988–1994, rising to roughly 4×]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: software cost reduction relative to the 1989 baseline, 1988–1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual vs. budgeted cost by quarter, 1988–1994; early over-runs of up to 40% converge onto budget as maturity increases]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0–3,000) and percent reuse (0–100%), 1987–1994]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." – Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 of the CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits. Teams work through the huge benefits, then large, good, okay, and small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded… testing cannot deliver the level of confidence required at a reasonable cost." – Jackson, D. (2009)

"The correctness of the code is rarely the weakest link." – Jackson, D. (2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic… non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability… The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." – Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data, using language parsers for Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Quality measurements: Transferability, Changeability, Robustness, Performance, Security

Detected violations, for example:
• Performance – expensive operation in loop, static vs. pooled connections, complex query on big table, large indices on big table
• Robustness – empty CATCH block, uncontrolled data access, poor memory management, opened resource not closed
• Security – SQL injection, cross-site scripting, buffer overflow, uncontrolled format string
• Transferability – unstructured code, misuse of inheritance, lack of comments, violated naming convention
• Changeability – highly coupled component, duplicated code, index modified in loop, high cyclomatic complexity

Modern Apps are a Technology Stack

Level 1 – Unit (developer level): a single language/technology layer (Java, JSP, ASP.NET, APIs). Checks code style and layout, expression complexity, code documentation, class or program design, and basic coding standards.

Level 2 – Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, and security vulnerabilities across components such as Java and Web Services.

Level 3 – System (IT organization level): the full stack (EJB, Hibernate, Spring, Struts, .NET, COBOL, PL/SQL, T-SQL, messaging; Oracle, SQL Server, DB2, Sybase, IMS), analyzed for architectural compliance, data flow, transaction risk, integration quality, risk propagation, application security, resiliency checks, transaction integrity, function points, effort estimation, data access control, SDK versioning, and calibration across technologies.

Architecturally Complex Defects

An architecturally complex defect is a structural flaw involving interactions among multiple components that reside in different application layers. Such defects are the primary cause of operational problems and take 20x as many fixes to correct.

Architecturally complex defects are only 8% of total application defects (code unit-level violations are the other 92%), yet they account for 52% of total repair effort (versus 48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot – a badly built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system – it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated – it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
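The idea can be sketched as severity weighted by reach in the dependency graph – an illustrative approximation, not CAST's published algorithm (components and scores are hypothetical):

```python
from collections import deque

def impacted(dependents, component):
    """Components transitively affected when `component` misbehaves (BFS)."""
    seen, queue = {component}, deque([component])
    while queue:
        for nxt in dependents.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {component}

def propagated_risk_index(severity, component, dependents):
    """Rank a violation by its severity times the breadth of its impact."""
    return severity * (1 + len(impacted(dependents, component)))

dependents = {"B": ["UI", "API", "Batch"], "API": ["UI"], "A": ["Batch"]}
print(propagated_risk_index(7, "B", dependents))  # widely propagated: 28
print(propagated_risk_index(7, "A", dependents))  # narrowly propagated: 14
```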

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. A transaction's risk is determined by the number and severity of violations along its path – from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation, and the Transaction Risk Index can be further tuned using the operational frequency of each transaction.
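An index of this kind can be sketched as the severity-weighted violation count along each path, scaled by operational frequency – illustrative only, not CAST's exact formula (transactions and numbers are hypothetical):

```python
def transaction_risk(severities, frequency=1.0):
    """severities: violation severities (1-9) encountered along the path
    from UI entry, through business logic, to data storage."""
    return frequency * sum(severities)

risks = {
    "checkout": transaction_risk([9, 7, 5], frequency=1000),
    "login":    transaction_risk([4],       frequency=5000),
    "report":   transaction_risk([8, 3],    frequency=10),
}
# remediate the riskiest transactions first
ranked = sorted(risks, key=risks.get, reverse=True)
print(ranked)  # ['checkout', 'login', 'report']
```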

Risk-Based Prioritization

Violations ranked for remediation by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution [the last three columns were shown graphically]:

Violation   Health Factor     Severity
148         Security          9
371         Robustness        9
062         Performance       8
284         Robustness        7
016         Changeability     7
241         Security          5
173         Transferability   4
359         Transferability   3

Static Analysis in US DoD

• Testing finds <1/3 of structural defects
• Static analysis before testing is 85% efficient
• With testing and static analysis combined, test schedules are reduced by 50%

John Keane (2013)

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0–180) vs. Technical Quality Index (100–400) for business-critical applications at a large international investment bank; annual losses fall from $2,000,000+ at low TQI, to $50,000, to $5,000 at high TQI]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com). Blank cells had no reported value:

Factor                 # apps   Robust   Security   Perform   Change   Transfer
Technology (Java-EE)    1606      20        3          16        26        7
Industry                 677       5        5           7         4        6
App type                 510       1        1           1         1        1
Source                   580       1        1           4
Shore                    540
Maturity                  72      28       25          15        24       12
Method                   208      10        6          14         6
Team size                186       7        5           8         8
# of users               262       4        3           5         4

The Technical Debt Metaphor

Technical debt: the future cost of fixing the structural quality problems remaining in production code at release – a component of the cost of ownership.

Principal – the cost of fixing problems remaining in the code after release that must be remediated.
Interest – continuing IT costs attributable to the violations causing technical debt, i.e. higher maintenance costs, greater resource usage, etc.
Liability – business costs related to outages, breaches, corrupted data, etc.
Opportunity cost – benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012) IEEE Software
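In the Curtis et al. (2012) approach the principal is computed bottom-up from static-analysis violation counts; a simplified sketch (the fix percentages, hours, and labor rate below are placeholders, not the paper's calibrated values):

```python
def debt_principal(violations, pct_to_fix, hours_per_fix, dollars_per_hour):
    """Principal = sum over severities of (violations that must be fixed
    x average hours to fix one x burdened labor rate)."""
    return sum(count * pct_to_fix[sev] * hours_per_fix[sev] * dollars_per_hour
               for sev, count in violations.items())

principal = debt_principal(
    violations={"high": 100, "medium": 400, "low": 1500},
    pct_to_fix={"high": 0.50, "medium": 0.25, "low": 0.10},
    hours_per_fix={"high": 2.0, "medium": 1.0, "low": 0.5},
    dollars_per_hour=75.0,
)
print(principal)  # 20625.0
```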

Tech Debt by Quality Factor

Transferability 40%
Changeability 30%
Robustness 18%
Security 7%
Performance 5%

70% of technical debt is in IT cost (Transferability, Changeability); 30% is in business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012) IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI: Benefits from Continuous Software Improvements. In Proceedings of ESEPG '99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models – enabling organizations' prediction capability. Proceedings of SEPG '06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort – Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3

Page 16: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Carry-forward Rework

Release N+2

Develop N+2 Rework N+2 Rework N+1 Rework N+0

Develop N

Release N

Rework N

Unfixed defects release N

Release N+1

Develop N+1 Rework N+1 Rework N+0

Unfixed defects release N

Unfixed defects release N+1

Click to edit Master title style Example of Quality Impact Project B (Better Faster Cheaper)

ndash 20 developers 3 months

ndash $120k per FTE

ndash 4 FPs per staff month

ndash 240 FPs delivered

$2500FP cost

ndash 5 critical violations per FP

ndash $500 per fix

ndash Cost for 1200 fixes = $600k

ndash Total Cost to Own = $1200k

$5000FP of TCO

Project A (Plodders)

ndash 20 developers 3 months

ndash $120k per FTE

ndash 3 FPs per staff month

ndash 180 FPs delivered

$3333FP cost

ndash 2 critical violations per FP

ndash $500 per fix

ndash Cost for 360 fixes = $180k

ndash Total Cost to Own = $780k

$4333FP of TCO

Project B is 25 more productive

However

Project A is 134 more productive

Click to edit Master title style Quality-Adjusted Productivity

Productivity

Estimation

Benchmarks

Value amp ROI

Etc

Automated Enhancement

Points

Size of both functional and non-functional code segments

Corrective effort in future releases for defects injected in

this release

Must add future effort to fix bugs into productivity

calculations

Effort amp Cost

Quality-Adjusted

Productivity

Automated Technical Debt

Click to edit Master title style Best Practices for Analysis

1) Segment baselines

2) Beware sampling effects

3) Understand variation

4) Evaluate demographics

5) Investigate distributions

6) Account for maturity effects

7) Beware external data sets

Click to edit Master title style 1 Evaluate Demographics

Routine reports

COBOL apps

Function Points

Prod

uctiv

ity

Web-based apps

Innovative mobile

Custom ERP

Click to edit Master title style 2 Segment Baselines

Year Projects Productivity Total Corporate 1981 28 2342 1980 21 1939 Telecommunications 1981 14 1811 1980 12 1458 Engineering amp Defense 1981 8 2965 1980 6 2739 Business Applications 1981 6 3054 1980 3 1813

Multiple baselines are usually the most valid

Click to edit Master title style 3 Beware Sampling Effects

22

0

500

1000

1500

2000

2500

1980 1981 1982

Line

s of

cod

epe

rson

-yea

rs

Baselines

Lots of small

projects Lots of large

projects

Click to edit Master title style 4 Understand Variation

23

Product factors bull timing constraints bull memory utilization bull CPU occupancy bull resource constraints bull complexity bull product size

Project factors bull concurrent development bull development computer bull requirements definition bull requirements churn bull programming practices bull personnel experience bull client experience bull client participation

Project 49

Product 16

Not explained

35

R2product = 16

R2project = 49

R2

model = 65

J Vosburgh B Curtis et al (1984) Proceedings of ICSE 1994

Click to edit Master title style 5 Investigate Distributions

Low Medium High

Rel

ativ

e p

rod

uct

ivit

y (

of m

ean

)

Software Engineering Practices Usage

-100

-50

0

50

100

150

200

250

Click to edit Master title style 6 Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws

Click to edit Master title style 7 Caution for External Data Sets

26

DACS Us

Productivity

Siz

e Data definitions

Data collection

Data validity

Data age

Demographics

Click to edit Master title style

1) Tracking traditional projects

2) Iterative and Agile measurement

3) Team Software Process measurement

33

Section 2 Measuring to Manage

Software Projects

Click to edit Master title style Measuring Project Progress

300K 250K 200K 150K 100K 50K 0K

Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoThe measures say we are on schedulerdquo

300K

250K

200K

150K

100K

50K

0K Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoBut the code keeps growing We should have measured the requirements growthrdquo

50

100

150

Defect reports

Total

Open

Closed

125

75

25

Jan Feb Mar Apr May Jun

Completion of coding

ldquoAnd all the defect reports remaining to be fixed will keep us far behind schedulerdquo

Click to edit Master title style Requirements Change Impact

Number of requirements

affected

Chart6

Requirements Added
Requirements Changed
Requirements Deleted
Person hours of work or rework
2
20
1
2
11
1
45
35
3
15
61
1
1
20
4
15
15
0
18
05
2

Sheet1

Sheet1

Requirements Added
Requirements Changed
Requirements Deleted
Person hours of work or rework

Sheet2

Sheet3

Click to edit Master title style Phase Defect Removal Model

PHASE

Escapes Previous Phase KSLOC

Defect Injection KSLOC

Subtotal KSLOC

Removal Effective-ness

Defect Removal KSLOC

Escapes at phase exit KSLOC

Rqmts 0 12 12 0 0 12

Design 12 180 192 76 146 46

Code amp UT 46 154 200 71 142 58

Inte-gration 58 0 58 67 39 19

Sys Test 19 0 19 58 11 08

Opera-tion 08

Kan (1999) Metrics and Model in Software Quality Engineering

Click to edit Master title style SCRUM Board Scrumboard displays

the path to completion for each story by showing task status

Story points are an estimate of days for each story

Agile estimating is by feature (story) rather than by task

lsquoDonersquo does not always mean all tasks are finished

Kanban methods restrict the work in progress to a limited number of paths

Click to edit Master title style Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating

Click to edit Master title style Team Velocity

0

1

2

3

4

5

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20Days

Story Points per Hour

0

1

2

3

4

5

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

HrsStryPt

Analyzing Velocity

Item Cycle Times

02468

101214161820

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Stories Hours HrsStryPt

0

10

20

30

40

50

60

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Burndown Chart

Days

Click to edit Master title style Measuring and Managing Flow

Released

In progress

Backlogged

Cumulative Flow Diagram Days to delivery

outliers

Anderson (2010) Kanban

Click to edit Master title style

0

10

20

30

40

50

60

70

80

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Sto

ry p

oin

ts d

eliv

ered

Trends Over Sprints

0

10

20

30

40

50

60

70

80

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Sto

ry p

oin

ts d

eliv

ered

0

05

1

15

2

25

3

35

4

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Tech

nic

al d

ebt

Track and correlate multiple measures

across sprints

Predictive models

Click to edit Master title style Sources and Measures

Stories submitted

Stories pulled back

Growth in size of stories

Stories added mid-sprint

Stories under review

Failed tests

Non-development tasks

Assists to other projects

Data sources

Measures Project tracking

Code control

Bug tracking

Build system

Test environment

Deployment tools

Operational monitoring

Click to edit Master title style Recommended Books

Click to edit Master title style Personal Software Process (PSP)

PSP 0 Current process Time recording

Defect recording Defect type standards

PSP 01 Coding standard

Size measurement Process improvement

proposal

Baseline Personal Process

Humphrey (1997) Introduction to the Personal Software Process

PSP 1 Size estimating

Test report

Personal Planning

PSP 2 Code reviews

Design reviews

Personal Quality Management

PSP 21 Cyclic development

Cyclic Personal Process

PSP 11 Task planning

Schedule planning

PSP 21 Design templates

Click to edit Master title style Team Software Process (TSP) bull Built from personal processes of team members

bull Well defined team roles

bull Project launch workshops for planning

bull Team measurement and tracking

bull Post-mortems for improvement

bull Team application of lean principles

Humphrey (2000) Introduction to the Team Software Process

Click to edit Master title style Aggregated Personal Data Best

R2s range from 70 to 87

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates rather than developing a single team estimate averaged over individuals

R2 = 53

Click to edit Master title style Defect Detection at Intuit

Consequently test changes from defect

detection into correctness verification

PSPTSP shifts defect detection from the test phase back up into development

Fagan (2005)

Personal reviews

33

Compile 33

Team reviews 19

Unit test 15

Test 14

Click to edit Master title style

Dramatic reductions

ndash Delivered defects

ndash Days per KLOC

ndash Schedule deviation

ndash Test defectskloc

ndash Variation in results

Humphrey (2006) TSP Leading a Development Team

TSP Benefits

Click to edit Master title style Section 3 Section 3 Measuring Software

Improvement Programs

1) Improvement program methods

2) Maturity-based measurement

3) Improvement program results

61

Click to edit Master title style CMMI Maturity Transitions

Level 5 Optimizing

Innovation management

Level 4 Quantitatively

Managed

Capability management

Level 3 Defined

Process management

Level 2 Managed

Project management

Level 1 Initial

Inconsistent management

Chrisis Konrad amp Shrum (2005) CMMI

Click to edit Master title style Enhanced Maturity Framework

Level 5 Innovating

Innovation management

Level 4 Optimized

Capability management

Level 3 Standardized

Organizational management

Level 2 Stabilized

Work unit management

Level 1 Initial

Inconsistent management

Click to edit Master title style Maturity Level Transformation

Level 5 Opportunistic improvements

Empowered culture

Level 5 Proactive

improvements Agile culture

Organization

Projects

Individual

Level 4 Process and results

managed statistically Precision culture

Level 3 Organization develops

standard processes Common culture

Level 2 Project managers stabilize projects

Local culture

Level 1 Ad Hoc inconsistent

development practices Relationship culture

Trust

Measurement by Level

Level 5 – Exploratory data: Δt = N × ROI
Level 4 – Predictive data: X̄ ± 1.96σ/√n
Level 3 – Process data: 8.3 defects per review
Level 2 – Management data: Effort = α(LOC)^β
Level 1 – Haphazard data: "about a million lines"

A measure–maturity mismatch slows improvement and creates dysfunction.
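The Level 4 predictive formula above is a standard 95% confidence interval on a process mean. A minimal sketch, with illustrative numbers (the review-yield figures below are assumptions, not data from the deck):

```python
import math

def confidence_interval_95(mean, stdev, n):
    """95% confidence interval for a process mean: X-bar +/- 1.96 * sigma / sqrt(n)."""
    half_width = 1.96 * stdev / math.sqrt(n)
    return (mean - half_width, mean + half_width)

# Hypothetical Level 4 process data: mean review yield 8.3 defects/review,
# sigma 2.0, across 25 reviews (all numbers illustrative).
low, high = confidence_interval_95(8.3, 2.0, 25)
print(f"95% CI: {low:.2f} to {high:.2f}")
```

A narrower interval over time is one sign the organization's capability is stabilizing.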

Average Improvement Results

Improvement                  Orgs   Annual median   Annual range
Productivity growth            4        35%            9–67%
Pre-test defect detection      3        22%            6–25%
Time to market                 2       ↓19%           15–23%
Field error reports            5       ↓39%           10–94%
Return on investment           5       5.0:1       4.1:1–8.8:1

Schedule Performance

[Chart: schedule performance index (0–1.6) at CMM maturity levels 1–3]

Schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled (below 1.0 is behind schedule, above 1.0 is ahead)

Air Force study of contractors – Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (0–2.5) at CMM maturity levels 1–3]

Cost performance index = budgeted cost of work performed / actual cost of work performed (below 1.0 is an overrun, above 1.0 an underrun)

Air Force study of contractors – Flowe & Thordahl (1994)
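The two earned-value indices on these slides reduce to one-line ratios. A small sketch with hypothetical project figures (the dollar amounts are illustrative, not from the study):

```python
def spi(bcwp, bcws):
    """Schedule performance index: budgeted cost of work performed
    over budgeted cost of work scheduled (<1.0 means behind schedule)."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost performance index: budgeted cost of work performed
    over actual cost of work performed (<1.0 means a cost overrun)."""
    return bcwp / acwp

# Hypothetical project status (illustrative figures only):
print(spi(90_000, 100_000))  # 0.9  -> behind schedule
print(cpi(90_000, 120_000))  # 0.75 -> cost overrun
```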

Crosby's Cost of Quality Model

Project cost = Performance + Cost of Quality

Performance: generating plans and documentation; development – requirements, design, code, integration

Cost of Quality – Conformance:
• Appraisal: reviews (system, requirements, design, test plan, test scripts), walkthroughs, first-time testing, first-time independent verification and validation, audits
• Prevention: training, policies, procedures and methods, tools, planning, quality improvement projects, data gathering and analysis, root cause analysis, quality reporting

Cost of Quality – Nonconformance: fixing defects, reworking any document, updating source code, re-reviews, re-tests, lab costs for re-tests, patches (internal and external), engineering changes, change control boards, external failures and fines, customer support, help desks

Dion (1993); Crosby (1979) Quality Is Free

Raytheon's Cost of Quality

• Performance – cost of building it right the first time
• Nonconformance – cost of rework
• Appraisal – cost of testing
• Prevention – cost of preventing nonconformance

Year   Level   Performance   Nonconformance   Appraisal + Prevention
1988     1         34%             41%             15% + 7%
1990     2         55%             18%             15% + 12%
1992     3         66%             11%             23%
1994     4         76%              6%             18%

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline (scale 0–4×), 1988–1994]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline (scale 0–1.2), 1988–1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual vs. budgeted cost by quarter, 1988–1994, on a −10% to +50% over-run/under-run scale, with early over-runs shrinking toward on-budget performance]

Haley (1995)

OMRON's Reuse Gains

[Chart: 1987–1994, program size in steps (0–3,000) against % of reuse (0–100%)]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it." – Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for Six Sigma

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first the huge benefits, then large, good, okay, and finally small benefits. Now what? Ultimately we run out of projects with benefits significant enough to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded…testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link." – Jackson (2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic…non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability…The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." – Spinellis (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality measurements (health factors): Transferability, Changeability, Robustness, Performance, Security

Detected violations, for example:
• Expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/NetWeaver, Tibco, Business Objects, and a Universal Analyzer for other languages

Modern Apps are a Technology Stack

Level 1 – Unit (developer level): a single language/technology layer. Code style & layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2 – Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3 – System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function point and effort estimation, data access control, SDK versioning, calibration across technologies.

[Diagram: data flow and transaction risk traced across a stack of JSP, ASP.NET, APIs, Java, and Web Services; frameworks such as Hibernate, Spring, Struts, and .NET; EJB, PL/SQL, T-SQL, COBOL, and messaging; and databases including Oracle, SQL Server, DB2, Sybase, and IMS – checked for architectural compliance]

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Take 20x as many fixes to correct
• Architecturally complex defects: 8% of total application defects but 52% of total repair effort
• Code unit-level violations: 92% of total application defects but 48% of total repair effort

Most architecturally complex defects involve an architectural hotspot – a badly built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation #284 in Component B is very widely propagated throughout the system – it presents very high risk. The impact of violation #342 in Component A is only narrowly propagated – it presents low risk. Remediating violation #284 in Component B will therefore have the greater impact on reducing risk and cost.

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path – from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path, so transactions can be ranked by risk to prioritize their remediation. The Transaction Risk Index can be further tuned using the operational frequency of each transaction. (In the slide's example of five transactions, transaction 1 poses the greatest risk.)
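The ranking described above can be sketched in a few lines. This is a simplified illustration, not CAST's actual index: the transaction names, severity values, and the plain severity-sum scoring are all assumptions.

```python
# Hypothetical violation severities (1-9) found along each transaction's
# path from UI entry through business logic to data storage.
transactions = {
    "checkout":  [9, 7, 5],
    "login":     [4, 3],
    "reporting": [8, 8, 2, 2],
}

def transaction_risk_index(severities, frequency=1.0):
    """Risk grows with the number and severity of violations on the path;
    frequency optionally weights by how often the transaction runs."""
    return frequency * sum(severities)

ranked = sorted(transactions,
                key=lambda t: transaction_risk_index(transactions[t]),
                reverse=True)
print(ranked)  # highest-risk transaction first
```

Passing operational frequencies into `transaction_risk_index` implements the tuning step the slide mentions.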

Risk-Based Prioritization

Violations ranked by health factor, severity, and Propagated Risk Index – further weighted by presence in a high-risk transaction and by the frequency of path execution:

Violation   Health factor     Severity
  #148      Security              9
  #371      Robustness            9
  #062      Performance           8
  #284      Robustness            7
  #016      Changeability         7
  #241      Security              5
  #173      Transferability       4
  #359      Transferability       3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects
• Static analysis before testing is 85% efficient
• With testing and static analysis combined, test schedules are reduced by 50%

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0–180) against Technical Quality Index (100–400) for business-critical applications at a large international investment bank – annual losses fall from over $2,000,000 at low TQI, to about $50,000 mid-range, to about $5,000 at high TQI]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report – request from info@castsoftware.com). Values are in health-factor order Robustness, Security, Performance, Changeability, Transferability where complete:

• Technology, e.g., Java EE (1,606 apps): 20%, 3%, 16%, 26%, 7%
• Industry (677 apps): 5%, 5%, 7%, 4%, 6%
• App type (510 apps): 1% on every health factor
• Source (580 apps): 1%, 1%, 4% (remaining cells blank in the original)
• Shore (540 apps): no values reported
• Maturity (72 apps): 28%, 25%, 15%, 24%, 12%
• Method (208 apps): 10%, 6%, 14%, 6% (one cell blank in the original)
• Team size (186 apps): 7%, 5%, 8%, 8% (one cell blank in the original)
• # of users (262 apps): 4%, 3%, 5%, 4% (one cell blank in the original)

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release – a component of the cost of ownership. It arises from structural quality problems in production code.

• Principal – the cost of fixing problems remaining in the code after release that must be remediated.
• Interest – continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Liability – business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost – benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012) IEEE Software
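The principal can be estimated by pricing out the violations expected to be remediated. A minimal sketch of that idea – the severity buckets, fix fractions, effort, and labor rate below are illustrative assumptions, not the calibrated figures from the Curtis et al. paper:

```python
def technical_debt_principal(counts, fix_fraction, hours_per_fix, hourly_rate):
    """Principal: cost of fixing the violations remaining after release
    that are expected to be remediated. All parameter values used here
    are illustrative assumptions, not CAST's calibrated figures."""
    return sum(counts[sev] * fix_fraction[sev] * hours_per_fix * hourly_rate
               for sev in counts)

violations = {"high": 100, "medium": 400, "low": 1500}   # per application
to_fix     = {"high": 0.5, "medium": 0.25, "low": 0.1}   # share worth fixing
print(technical_debt_principal(violations, to_fix,
                               hours_per_fix=1.0, hourly_rate=75))
```

Varying the fix fractions by severity is what keeps the estimate from treating every low-severity violation as debt that must be repaid.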

Tech Debt by Quality Factor

Transferability 40%, Changeability 30%, Robustness 18%, Security 7%

• 70% of technical debt is IT cost (Transferability, Changeability)
• 30% of technical debt is business risk (Robustness, Performance, Security)

Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012) IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models – enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development Using Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

• Best Practices for Software Process and Product Measurement
• Instructor: Dr. Bill Curtis
• Agenda
• Section 1
• Why Does Measurement Languish?
• Measurement Process Guidance
• Measurement Categories
• Productivity Analysis Objectives
• Productivity Analysis Measures
• Software Productivity
• Software Size Measures
• How Many Lines of Code?
• Function Point Estimation
• Effort – Weakest Measure
• How Quality Affects Productivity
• Carry-forward Rework
• Example of Quality Impact
• Quality-Adjusted Productivity
• Best Practices for Analysis
• 1 Evaluate Demographics
• 2 Segment Baselines
• 3 Beware Sampling Effects
• 4 Understand Variation
• 5 Investigate Distributions
• 6 Account for Maturity Effects
• 7 Caution for External Data Sets
• Slide Number 27
• Measuring Project Progress
• Requirements Change Impact
• Phase Defect Removal Model
• SCRUM Board
• Burndown Chart and Velocity
• Analyzing Velocity
• Measuring and Managing Flow
• Trends Over Sprints
• Sources and Measures
• Recommended Books
• Personal Software Process (PSP)
• Team Software Process (TSP)
• Aggregated Personal Data Best
• Defect Detection at Intuit
• TSP Benefits
• Section 3
• CMMI Maturity Transitions
• Enhanced Maturity Framework
• Maturity Level Transformation
• Measurement by Level
• Average Improvement Results
• Schedule Performance
• Cost Performance
• Crosby's Cost of Quality Model
• Raytheon's Cost of Quality
• Raytheon's Productivity
• Raytheon's Cost Reduction
• Raytheon's Cost Predictability
• OMRON's Reuse Gains
• Section 4
• What about the Product?
• Six Sigma's Challenge
• Testing is Not Enough
• CAST's Application Intelligence Platform
• Modern Apps are a Technology Stack
• Architecturally Complex Defects
• Propagated Risk
• Transaction Risk
• Risk-Based Prioritization
• Static Analysis in US DoD
• Reducing Operational Incidents & Costs
• Reducing Operational Losses
• Slide Number 70
• The Technical Debt Metaphor
• Tech Debt by Quality Factor
• Join CISQ: www.it-cisq.org
• Section
• References 1
• References 2
• References 3

Example of Quality Impact

Project A (Plodders):
– 20 developers, 3 months; $120k per FTE
– 3 FPs per staff month; 180 FPs delivered → $3,333/FP cost
– 2 critical violations per FP at $500 per fix
– Cost for 360 fixes = $180k; Total Cost to Own = $780k → $4,333/FP of TCO

Project B (Better, Faster, Cheaper):
– 20 developers, 3 months; $120k per FTE
– 4 FPs per staff month; 240 FPs delivered → $2,500/FP cost
– 5 critical violations per FP at $500 per fix
– Cost for 1,200 fixes = $600k; Total Cost to Own = $1,200k → $5,000/FP of TCO

On development cost alone, Project B is 25% more productive. However, on total cost to own, Project A is 13.4% more productive.
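The arithmetic behind this comparison is easy to replay. A small sketch using the slide's own figures (function name and structure are mine):

```python
def tco_per_fp(devs, months, cost_per_fte_year, fp_per_staff_month,
               violations_per_fp, cost_per_fix):
    """Total cost to own per delivered function point:
    development cost plus post-release cost of fixing critical violations."""
    staff_months = devs * months
    dev_cost = staff_months * (cost_per_fte_year / 12)
    fps = staff_months * fp_per_staff_month
    fix_cost = fps * violations_per_fp * cost_per_fix
    return (dev_cost + fix_cost) / fps

project_b = tco_per_fp(20, 3, 120_000, 4, 5, 500)
project_a = tco_per_fp(20, 3, 120_000, 3, 2, 500)
print(round(project_b), round(project_a))  # 5000 4333
```

The faster project delivers function points 25% cheaper, but once rework is counted, the slower, cleaner project costs 13.4% less per function point to own.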

Quality-Adjusted Productivity

Automated Enhancement Points size both functional and non-functional code segments. Automated Technical Debt captures the corrective effort in future releases for defects injected in this release. Quality-adjusted productivity must add that future effort to fix bugs into the productivity calculation, combining size with effort and cost – feeding productivity analysis, estimation, benchmarks, value and ROI, etc.

Best Practices for Analysis

1) Evaluate demographics
2) Segment baselines
3) Beware sampling effects
4) Understand variation
5) Investigate distributions
6) Account for maturity effects
7) Beware external data sets

1. Evaluate Demographics

[Chart: productivity vs. function points for different application demographics – routine reports, COBOL apps, web-based apps, custom ERP, innovative mobile]

2. Segment Baselines

Segment                  Year   Projects   Productivity
Total corporate          1981      28         2342
                         1980      21         1939
Telecommunications       1981      14         1811
                         1980      12         1458
Engineering & defense    1981       8         2965
                         1980       6         2739
Business applications    1981       6         3054
                         1980       3         1813

Multiple baselines are usually the most valid.

3. Beware Sampling Effects

[Chart: lines of code per person-year (0–2,500) across 1980–1982 baselines – a baseline dominated by lots of small projects vs. one dominated by lots of large projects]

4. Understand Variation

Product factors (R²product = .16): timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size

Project factors (R²project = .49): concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation

Variation explained: project 49%, product 16%, not explained 35% (R²model = .65)

Vosburgh, J., Curtis, B. et al. (1984). Proceedings of ICSE 1984.

5. Investigate Distributions

[Chart: relative productivity (% of mean, −100% to +250%) against low, medium, and high usage of software engineering practices]

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution for External Data Sets

[Chart: productivity vs. size – an external data set (e.g., DACS) plotted against our own]

Beware differences in data definitions, data collection, data validity, data age, and demographics.

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Measuring Project Progress

[Chart: planned rate of code growth vs. actual code growth (0–300K), January to June, with the planned completion of coding marked]

"The measures say we are on schedule."

"But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (0–150) – total, open, and closed – against the completion of coding]

"And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements affected and person-hours of work or rework from requirements added, changed, and deleted, by month from February through August]

Phase Defect Removal Model

Defects per KSLOC:

PHASE         Escapes from     Defects    Subtotal   Removal          Defects    Escapes at
              previous phase   injected              effectiveness    removed    phase exit
Rqmts               0            1.2        1.2          0%              0          1.2
Design              1.2         18.0       19.2         76%            14.6         4.6
Code & UT           4.6         15.4       20.0         71%            14.2         5.8
Integration         5.8          0          5.8         67%             3.9         1.9
Sys Test            1.9          0          1.9         58%             1.1         0.8
Operation           0.8

Kan (1995) Metrics and Models in Software Quality Engineering
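The table above can be regenerated by rolling escapes forward through each phase. A small sketch of that containment arithmetic (function name is mine; the inputs are the slide's injection rates and removal effectivenesses):

```python
def phase_containment(phases):
    """Roll defects/KSLOC through each phase: escapes from the prior phase
    plus newly injected defects, reduced by removal effectiveness."""
    escapes, rows = 0.0, []
    for name, injected, effectiveness in phases:
        subtotal = escapes + injected
        removed = subtotal * effectiveness
        escapes = subtotal - removed
        rows.append((name, round(subtotal, 1), round(removed, 1),
                     round(escapes, 1)))
    return rows

table = phase_containment([("Rqmts", 1.2, 0.0), ("Design", 18.0, 0.76),
                           ("Code & UT", 15.4, 0.71),
                           ("Integration", 0.0, 0.67),
                           ("Sys Test", 0.0, 0.58)])
for row in table:
    print(row)  # final row: 0.8 defects/KSLOC escape into operation
```

Raising the Design review effectiveness in this model immediately shrinks every downstream subtotal, which is the slide's argument for early defect removal.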

SCRUM Board

• The Scrum board displays the path to completion for each story by showing task status.
• Story points are an estimate of days for each story.
• Agile estimating is by feature (story) rather than by task.
• 'Done' does not always mean all tasks are finished.
• Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

• The burndown chart displays story completion by day, showing actual versus estimated progress.
• Velocity is the rate at which stories are completed; it indicates sustainable progress.
• Velocity results provide one source of historical data for improving estimates.
• Story points without the context of productivity factors are dangerous for estimating.
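Velocity as a forecasting input can be sketched in a few lines. The sprint history and backlog size below are hypothetical, and the naive division illustrates exactly the caveat above – it ignores the productivity factors behind the numbers:

```python
# Hypothetical sprint history: story points completed in each sprint.
completed = [18, 22, 19, 25, 21]

velocity = sum(completed) / len(completed)   # average points per sprint
backlog = 105                                # story points remaining
sprints_left = backlog / velocity            # naive history-based forecast
print(velocity, round(sprints_left, 1))      # 21.0 5.0
```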

Analyzing Velocity

[Charts over a 20-day iteration: team velocity in story points per hour (0–5), hours per story point (0–5), item cycle times (0–20), and a burndown chart of stories, hours, and hours per story point (0–60)]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in-progress, and released work over time; the horizontal distance between bands shows days to delivery, with outliers visible]

Anderson (2010) Kanban

Trends Over Sprints

[Charts: story points delivered (0–80) and technical debt (0–4) tracked across 12 sprints]

Track and correlate multiple measures across sprints to build predictive models.

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: current process, time recording, defect recording, defect type standards
• PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning
• PSP 1: size estimating, test report
• PSP 1.1: task planning, schedule planning

Personal Quality Management
• PSP 2: code reviews, design reviews
• PSP 2.1: design templates

Cyclic Personal Process
• PSP 3: cyclic development

Humphrey (1997) Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from the personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000) Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates (R²s range from .70 to .87), rather than developing a single team estimate averaged over individuals (R² = .53).

Click to edit Master title style Defect Detection at Intuit

Consequently test changes from defect

detection into correctness verification

PSPTSP shifts defect detection from the test phase back up into development

Fagan (2005)

Personal reviews

33

Compile 33

Team reviews 19

Unit test 15

Test 14

Click to edit Master title style

Dramatic reductions

ndash Delivered defects

ndash Days per KLOC

ndash Schedule deviation

ndash Test defectskloc

ndash Variation in results

Humphrey (2006) TSP Leading a Development Team

TSP Benefits

Click to edit Master title style Section 3 Section 3 Measuring Software

Improvement Programs

1) Improvement program methods

2) Maturity-based measurement

3) Improvement program results

61

Click to edit Master title style CMMI Maturity Transitions

Level 5 Optimizing

Innovation management

Level 4 Quantitatively

Managed

Capability management

Level 3 Defined

Process management

Level 2 Managed

Project management

Level 1 Initial

Inconsistent management

Chrisis Konrad amp Shrum (2005) CMMI

Click to edit Master title style Enhanced Maturity Framework

Level 5 Innovating

Innovation management

Level 4 Optimized

Capability management

Level 3 Standardized

Organizational management

Level 2 Stabilized

Work unit management

Level 1 Initial

Inconsistent management

Click to edit Master title style Maturity Level Transformation

Level 5 Opportunistic improvements

Empowered culture

Level 5 Proactive

improvements Agile culture

Organization

Projects

Individual

Level 4 Process and results

managed statistically Precision culture

Level 3 Organization develops

standard processes Common culture

Level 2 Project managers stabilize projects

Local culture

Level 1 Ad Hoc inconsistent

development practices Relationship culture

Trust

Click to edit Master title style Measurement by Level

Level 5 mdash Exploratory data ∆ t = N x ROI

Level 4 mdash Predictive data X plusmn (196σ radicn )

Level 3 mdash Process data 83 defects per review

Level 2 mdash Management data Effort = α LOCβ

Level 1 mdash Haphazard data ldquoabout a million linesrdquo

Measure-maturity mismatch slows improvement and creates

dysfunction

Click to edit Master title style Average Improvement Results

Productivity growth 4 35 9 - 67

Pre-test defect detection 3 22 6 - 25

Time to market 2 darr 19 15 - 23

Field error reports 5 darr 39 10 - 94

Return on investment 5 501 41 - 881

Annual Annual Improvement benefit Orgs median range

Click to edit Master title style Schedule Performance

0

02

04

06

08

1

12

14

16

1 2 3

Schedule performance

index

CMM maturity level

Budgeted cost of work performed

Budgeted cost of work scheduled

behind ahead

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Cost Performance

Cost performance

index

0

05

1

15

2

25

1 2 3 CMM Maturity Level

Budgeted cost of work performed Actual cost of

work performed

overrun underrun

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Crosbyrsquos Cost of Quality Model Project Cost

Performance Cost of Quality Generating plans documentation Development - requirements - design - code - integration Prevention Appraisal

Reviews - system - requirements - design - test plan - test scripts Walkthroughs Testing (first time) Independent Verification and Validation (first time) Audits

Training Policies Procedures and methods Tools Planning Quality Improvement Projects Data Gathering and Analysis Root Cause Analysis Quality Reporting

Nonconformance Conformance Fixing Defects Reworking any Document Updating Source Code Re-reviews Re-tests Lab costs for re-tests Patches(Internal and External) Engineering Changes Change Control Boards External Failures and fines Customer Support Help Desks

Dion (1993) Crosby (1979) Quality Is Free

Click to edit Master title style Raytheons Cost of Quality

bull Performance mdash cost of building it right first time bull Nonconformance mdash cost of rework bull Appraisal mdash cost of testing bull Prevention mdash cost of preventing nonconformance

Dion (1993) Haley (1995)

23

18

Year Level Perform Nonconf Appraise Prevent

1988 1 34 41 15 7

1990 2 55 18 15 12

1992 3 66 11

1994 4 76 6 75

Click to edit Master title style Raytheonrsquos Productivity

0

1

2

3

4

1988 1989 1990 1991 1992 1993 1994

Growth relative to 1989 baseline

Lydon (1995)

Raytheon's Cost Reduction

Chart: cost reduction relative to the 1989 baseline (y-axis 0 to 1.2), 1988-1994.

Lydon (1995)

Raytheon's Cost Predictability

Chart: actual cost vs. budgeted cost by quarter, 1988-1994 (y-axis -10% to 50%; above zero = over-run, below zero = under-run).

Haley (1995)

OMRON's Reuse Gains

Chart: program size in steps (left axis, 0 to 3,000) and % of reuse (right axis, 0 to 100%), 1987-1994.

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it."
- Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus - process improvement - Six Sigma
Product focus - product improvement - Design for 6σ

A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, then large benefits, good benefits, okay benefits, small benefits... now what?

Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost."
"The correctness of the code is rarely the weakest link."
Jackson, D. (2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."
Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality Measurements: Transferability, Changeability, Robustness, Performance, Security.

Detected Violations (examples):
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, plus a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

A modern application spans many languages and technology layers (Java, JSP, ASP.NET, APIs, web services, EJB, Hibernate, Spring, Struts, .NET, COBOL, messaging, and databases such as Oracle, SQL Server, DB2, Sybase, and IMS accessed through PL/SQL and T-SQL), so quality must be measured at three levels:

Level 1 - Unit (developer level): code style & layout; expression complexity; code documentation; class or program design; basic coding standards. Single language/technology layer.

Level 2 - Technology (development team level): intra-technology architecture; intra-layer dependencies; inter-program invocation; security vulnerabilities.

Level 3 - System (IT organization level): integration quality; architectural compliance; risk propagation; application security; resiliency checks; transaction integrity; function points; effort estimation; data access control; SDK versioning; calibration across technologies. Data flow and transaction risk cut across the full stack.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Take 20x as many fixes to correct

Share of defects vs. repair effort: architecturally complex defects are 8% of total app defects (code unit-level violations are the other 92%), yet they consume 52% of total repair effort (vs. 48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot - a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation #342 in Component A is only narrowly propagated in the system - it presents low risk. The impact of violation #284 in Component B is very widely propagated throughout the system - it presents very high risk.

Remediating violation #284 in Component B will therefore have the greater impact on reducing risk and cost.
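CAST's actual scoring is proprietary; as a sketch of the idea, violations can be ranked by severity weighted by how broadly their impact propagates. All names, IDs, and counts below are hypothetical:

```python
def propagated_risk_ranking(violations):
    """Rank violations by severity weighted by breadth of impact
    (here, the number of components the violating component can affect)."""
    return sorted(violations,
                  key=lambda v: v["severity"] * v["impacted_components"],
                  reverse=True)

violations = [
    {"id": 342, "component": "A", "severity": 8, "impacted_components": 3},
    {"id": 284, "component": "B", "severity": 7, "impacted_components": 41},
]
ranked = propagated_risk_ranking(violations)
# Violation 284 ranks first: lower severity, but far wider propagation
```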

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path - from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path, so transactions can be ranked by risk to prioritize their remediation (in the example, transaction 1 poses the greatest risk). The Transaction Risk Index can be further tuned using the operational frequency of each transaction.
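The ranking logic above can be sketched directly, assuming violation severities have already been collected along each transaction path (the transaction names and severities below are hypothetical):

```python
def transaction_risk_ranking(paths, frequency=None):
    """Score each transaction by the summed severity of violations along
    its path, optionally weighted by operational frequency; highest first."""
    frequency = frequency or {}
    scores = {name: sum(severities) * frequency.get(name, 1)
              for name, severities in paths.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Severities of the violations found along each transaction's path
paths = {"T1": [9, 9, 8], "T2": [7, 5], "T3": [4, 3, 3]}
ranked = transaction_risk_ranking(paths)
# T1 scores 26 and poses the greatest risk
```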

Risk-Based Prioritization

Violations ranked for remediation by health factor, severity, propagated risk index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
148         Security          9
371         Robustness        9
062         Performance       8
284         Robustness        7
016         Changeability     7
241         Security          5
173         Transferability   4
359         Transferability   3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

Chart: production incidents (y-axis, 0 to 180) vs. Technical Quality Index (x-axis, 100 to 400) for business-critical applications at a large international investment bank. Annotated annual losses range from $2,000,000+ at low quality scores down to $50,000 and $5,000 at higher scores.

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor        # apps   Robust   Security   Perform   Change   Transfer
Technology     1606      20        3          16        26        7
  (Java-EE)
Industry        677       5        5           7         4        6
App type        510       1        1           1         1        1
Source          580       1        1           4
Shore           540
Maturity         72      28       25          15        24       12
Method          208      10        6          14         6
Team size       186       7        5           8         8
# of users      262       4        3           5         4

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Structural quality problems in production code create:

• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Liability (business risk): business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
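Curtis et al. (2012) estimate the principal from counts of severe violations, the fraction an organization would realistically fix, and the effort and labor cost of fixing them. A sketch of that structure - the counts, fractions, hours, and rate below are illustrative assumptions, not figures from the paper:

```python
def technical_debt_principal(counts, fix_fraction, hours_per_fix,
                             cost_per_hour=75):
    """Principal = sum over severity tiers of
    violations * fraction to be fixed * hours per fix * labor rate."""
    return sum(counts[tier] * fix_fraction[tier] * hours_per_fix[tier]
               * cost_per_hour for tier in counts)

# All parameter values below are assumptions for illustration.
counts       = {"high": 120, "medium": 480, "low": 900}   # violations found
fix_fraction = {"high": 0.50, "medium": 0.25, "low": 0.10}
hours        = {"high": 2.0, "medium": 1.0, "low": 0.5}
principal = technical_debt_principal(counts, fix_fraction, hours)  # dollars
```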

Tech Debt by Quality Factor

Technical debt by health factor: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%.

70% of technical debt is in IT cost (Transferability, Changeability); 30% is in business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.



Quality-Adjusted Productivity

Inputs:
• Automated Enhancement Points: size of both functional and non-functional code segments
• Effort & cost
• Automated Technical Debt: corrective effort in future releases for defects injected in this release

Future effort to fix bugs must be added into productivity calculations. Quality-adjusted productivity then feeds estimation, benchmarks, value & ROI, etc.
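The adjustment can be sketched as a simple ratio: charge the future corrective effort for this release's injected defects against this release's output. The numbers below are illustrative:

```python
def quality_adjusted_productivity(size_points, build_effort_hours,
                                  future_fix_hours):
    """Charge the future corrective effort for defects injected in this
    release against this release's output."""
    return size_points / (build_effort_hours + future_fix_hours)

# Illustrative release: 1,000 enhancement points built in 2,000 hours,
# with an estimated 500 future hours to fix the defects it injected.
raw_productivity = 1_000 / 2_000                             # 0.50 points/hour
adjusted = quality_adjusted_productivity(1_000, 2_000, 500)  # 0.40 points/hour
```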

Best Practices for Analysis

1) Evaluate demographics
2) Segment baselines
3) Beware sampling effects
4) Understand variation
5) Investigate distributions
6) Account for maturity effects
7) Beware external data sets

1. Evaluate Demographics

Chart: productivity (in function points) varies widely by application type - routine reports, COBOL apps, web-based apps, custom ERP, innovative mobile.

2. Segment Baselines

                          Year   Projects   Productivity
Total Corporate           1981   28         2342
                          1980   21         1939
Telecommunications        1981   14         1811
                          1980   12         1458
Engineering & Defense     1981   8          2965
                          1980   6          2739
Business Applications     1981   6          3054
                          1980   3          1813

Multiple baselines are usually the most valid.

3. Beware Sampling Effects

Chart: baseline productivity in lines of code per person-year (0 to 2,500) for 1980-1982, with annotations marking a baseline drawn from lots of small projects vs. one drawn from lots of large projects.

4. Understand Variation

Variation in productivity explained by a regression model (R²model = .65):

• Product factors (R²product = .16): timing constraints; memory utilization; CPU occupancy; resource constraints; complexity; product size
• Project factors (R²project = .49): concurrent development; development computer; requirements definition; requirements churn; programming practices; personnel experience; client experience; client participation
• Not explained: 35%

J. Vosburgh, B. Curtis, et al. (1984). Proceedings of ICSE 1984.

5. Investigate Distributions

Chart: relative productivity (% of mean, -100% to +250%) for low, medium, and high usage of software engineering practices.

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution for External Data Sets

Comparing your productivity and size data against an external data set (e.g., DACS) raises questions of data definitions, data collection, data validity, data age, and demographics.

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Measuring Project Progress

Chart 1: planned rate of code growth vs. actual code growth (0K to 300K, Jan-Jun), with the planned completion of coding marked. "The measures say we are on schedule."

Chart 2: the same plot a few months on. "But the code keeps growing. We should have measured the requirements growth."

Chart 3: defect reports (total, open, closed; 0 to 150) against the completion of coding. "And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

Requirements added, changed, and deleted by month (number of requirements affected; person-hours of work or rework):

Month   Added   Changed   Deleted
Feb      2       20        1
Mar      2       11        1
Apr      45      35        3
May      15      61        1
Jun      1       20        4
Jul      15      15        0
Aug      18      05        2

Phase Defect Removal Model

Defects per KSLOC:

PHASE         Escapes     Defect      Subtotal   Removal         Defect    Escapes at
              previous    injection              effectiveness   removal   phase exit
Rqmts         0           1.2         1.2         0%             0         1.2
Design        1.2         18.0        19.2        76%            14.6      4.6
Code & UT     4.6         15.4        20.0        71%            14.2      5.8
Integration   5.8         0           5.8         67%            3.9       1.9
Sys Test      1.9         0           1.9         58%            1.1       0.8
Operation     0.8

Kan (1995), Metrics and Models in Software Quality Engineering
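The table's arithmetic can be reproduced directly: each phase's subtotal is the escapes from the previous phase plus newly injected defects, and the removal effectiveness determines what escapes to the next phase. A sketch using the table's per-KSLOC figures:

```python
def phase_defect_flow(phases):
    """Walk defects/KSLOC through phases: subtotal = escapes in + injected;
    removed = subtotal * removal effectiveness; the rest escapes onward."""
    escapes = 0.0
    rows = []
    for name, injected, effectiveness in phases:
        subtotal = escapes + injected
        removed = subtotal * effectiveness
        escapes = subtotal - removed
        rows.append((name, round(subtotal, 1), round(removed, 1),
                     round(escapes, 1)))
    return rows

# (phase, defects injected per KSLOC, removal effectiveness) from the table
rows = phase_defect_flow([
    ("Rqmts", 1.2, 0.00),
    ("Design", 18.0, 0.76),
    ("Code & UT", 15.4, 0.71),
    ("Integration", 0.0, 0.67),
    ("Sys Test", 0.0, 0.58),
])
# Final row: 0.8 defects/KSLOC escape into operation
```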

SCRUM Board

• The scrum board displays the path to completion for each story by showing task status.
• Story points are an estimate of days for each story.
• Agile estimating is by feature (story) rather than by task.
• 'Done' does not always mean all tasks are finished.
• Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

• The burndown chart displays story completion by day, indicating actual versus estimated progress.
• Velocity is the rate at which stories are completed; it indicates sustainable progress.
• Velocity results provide one source of historical data for improving estimates.
• Story points without the context of productivity factors are dangerous for estimating.
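A minimal sketch of the velocity arithmetic; the sprint-level story-point totals below are illustrative:

```python
def velocity(points_per_sprint):
    """Average story points completed per sprint over recent history."""
    return sum(points_per_sprint) / len(points_per_sprint)

def sprints_remaining(backlog_points, points_per_sprint):
    """Naive forecast: remaining backlog divided by historical velocity."""
    return backlog_points / velocity(points_per_sprint)

history = [40, 55, 50, 55]      # points delivered in the last four sprints
v = velocity(history)           # 50.0 points per sprint
left = sprints_remaining(300, history)  # 6.0 sprints at the current pace
```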

Analyzing Velocity

Charts over a 20-day window: team velocity in story points per hour (0 to 5), hours per story point (0 to 5), item cycle times (0 to 20), and a burndown chart tracking stories, hours, and hours per story point (0 to 60).

Measuring and Managing Flow

Cumulative flow diagram: backlogged, in progress, and released work over time; the horizontal distance between the bands gives days to delivery, and outliers stand out.

Anderson (2010), Kanban

Trends Over Sprints

Charts: story points delivered per sprint (0 to 80, sprints 1-12) and technical debt per sprint (0 to 4). Track and correlate multiple measures across sprints to build predictive models.

Sources and Measures

Measures: stories submitted; stories pulled back; growth in size of stories; stories added mid-sprint; stories under review; failed tests; non-development tasks; assists to other projects.

Data sources: project tracking; code control; bug tracking; build system; test environment; deployment tools; operational monitoring.

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: current process; time recording; defect recording; defect type standards
• PSP 0.1: coding standard; size measurement; process improvement proposal

Personal Planning
• PSP 1: size estimating; test report
• PSP 1.1: task planning; schedule planning

Personal Quality Management
• PSP 2: code reviews; design reviews
• PSP 2.1: design templates

Cyclic Personal Process
• PSP 3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from the personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates (R²s range from .70 to .87), rather than developing a single team estimate averaged over individuals (R² = .53).

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defect detection by activity: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%.

Fagan (2005)

TSP Benefits

Dramatic reductions in:
- Delivered defects
- Days per KLOC
- Schedule deviation
- Test defects/KLOC
- Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5: Optimizing - innovation management
Level 4: Quantitatively Managed - capability management
Level 3: Defined - process management
Level 2: Managed - project management
Level 1: Initial - inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5: Innovating - innovation management
Level 4: Optimized - capability management
Level 3: Standardized - organizational management
Level 2: Stabilized - work unit management
Level 1: Initial - inconsistent management

Maturity Level Transformation

Level 5: Opportunistic, proactive improvements - empowered, agile culture (organization, projects, individual)
Level 4: Process and results managed statistically - precision culture
Level 3: Organization develops standard processes - common culture
Level 2: Project managers stabilize projects - local culture
Level 1: Ad hoc, inconsistent development practices - relationship culture

Trust builds as organizations move up the levels.

Measurement by Level

Level 5 - Exploratory data: Δt = N × ROI
Level 4 - Predictive data: X̄ ± (1.96σ / √n)
Level 3 - Process data: 8.3 defects per review
Level 2 - Management data: Effort = α · LOC^β
Level 1 - Haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
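The Level 2 entry is the familiar power-law effort model (COCOMO-style). A sketch using COCOMO II-style nominal coefficients for illustration; a real organization would fit α and β to its own project history:

```python
def effort_person_months(kloc, alpha=2.94, beta=1.0997):
    """Power-law effort model: Effort = alpha * Size^beta (size in KLOC).
    Coefficients are COCOMO II-style nominal values, illustration only."""
    return alpha * kloc ** beta

small = effort_person_months(10)    # roughly 37 person-months
large = effort_person_months(100)   # diseconomy of scale: more than 10x small
```

Because β exceeds 1, effort grows faster than size - the diseconomy of scale that makes large projects disproportionately expensive.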

Average Improvement Results

Improvement benefit          # Orgs   Annual median   Annual range
Productivity growth             4         35%            9 - 67%
Pre-test defect detection       3         22%            6 - 25%
Time to market                  2        ↓19%           15 - 23%
Field error reports             5        ↓39%           10 - 94%
Return on investment            5         5.0:1          4.1 - 8.8:1

Click to edit Master title style Schedule Performance

0

02

04

06

08

1

12

14

16

1 2 3

Schedule performance

index

CMM maturity level

Budgeted cost of work performed

Budgeted cost of work scheduled

behind ahead

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Cost Performance

Cost performance

index

0

05

1

15

2

25

1 2 3 CMM Maturity Level

Budgeted cost of work performed Actual cost of

work performed

overrun underrun

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Crosbyrsquos Cost of Quality Model Project Cost

Performance Cost of Quality Generating plans documentation Development - requirements - design - code - integration Prevention Appraisal

Reviews - system - requirements - design - test plan - test scripts Walkthroughs Testing (first time) Independent Verification and Validation (first time) Audits

Training Policies Procedures and methods Tools Planning Quality Improvement Projects Data Gathering and Analysis Root Cause Analysis Quality Reporting

Nonconformance Conformance Fixing Defects Reworking any Document Updating Source Code Re-reviews Re-tests Lab costs for re-tests Patches(Internal and External) Engineering Changes Change Control Boards External Failures and fines Customer Support Help Desks

Dion (1993) Crosby (1979) Quality Is Free

Click to edit Master title style Raytheons Cost of Quality

bull Performance mdash cost of building it right first time bull Nonconformance mdash cost of rework bull Appraisal mdash cost of testing bull Prevention mdash cost of preventing nonconformance

Dion (1993) Haley (1995)

23

18

Year Level Perform Nonconf Appraise Prevent

1988 1 34 41 15 7

1990 2 55 18 15 12

1992 3 66 11

1994 4 76 6 75

Click to edit Master title style Raytheonrsquos Productivity

0

1

2

3

4

1988 1989 1990 1991 1992 1993 1994

Growth relative to 1989 baseline

Lydon (1995)

Click to edit Master title style Raytheonrsquos Cost Reduction

0

02

04

06

08

1

12

1988 1989 1990 1991 1992 1993 1994

Reduction relative to 1989 baseline

Lydon (1995)

Click to edit Master title style

Haley (1995)

1993 1992 1991 1990 1989 1988 1994

50

40

30

20

10

0

-10

Actual cost

Budgeted cost

over-run under-run

4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1

Raytheonrsquos Cost Predictability

Click to edit Master title style OMRONrsquos Reuse gains

Sakamoto et al (1996)

0

5 0 0

1 0 0 0

1 5 0 0

2 0 0 0

2 5 0 0

3 0 0 0

1 9 8 7 1 9 8 8 1 9 8 9 1 9 9 0 1 9 9 1 1 9 9 2 1 9 9 3 1 9 9 4

Steps

100

75

50

25

0

of reuse

Click to edit Master title style Section 4 Section 4 Measuring Quality and

Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

89

Click to edit Master title style What about the Product CMMIrsquos primary assumption

ldquoThe quality of a software system is governed by quality of the process used to develop itrdquo

- Watts Humphrey

CMMIrsquos product quality problem

1 Assessments do not verify product quality

2 Compliance focus undermines CMMIrsquos primary assumption

3 Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI

4 The assumptions underlying control charts are violated by software data

Click to edit Master title style Six Sigmarsquos Challenge

Process focus ndash process improvement ndash Six Sigma Product focus ndash product improvement ndash Design for 6σ

Product focus supplements product focus to unlock even more value from software

Six Sigma projects must have significant benefits Huge benefits

Large benefits

Good benefits

Okay benefits

Small benefits

Now what

Ultimately we run out of projects with significant enough benefits to continue the programhellip

so what is the solution that allows continual improvement

Click to edit Master title style Testing is Not Enough ldquoAs higher levels of assurance are demandedhelliptesting cannot deliver the level of confidence required at a reasonable costrdquo

ldquoThe correctness of the code is rarely the weakest linkrdquo

Jackson D (2009) Spinellis D (2006)

ldquohellipa failure to satisfy a non-functional requirement can be critical even catastrophichellipnon-functional requirements are sometimes difficult to verify We cannot write a test case to verify a systemrsquos reliabilityhellipThe ability to associate code to non-functional properties can be a powerful weapon in a software engineerrsquos arsenalrdquo

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality Measurements: Transferability, Changeability, Robustness, Performance, Security.

Detected Violations (examples):
• Expensive operation in loop, static vs. pooled connections, complex query on big table, large indices on big table
• Empty CATCH block, uncontrolled data access, poor memory management, opened resource not closed
• SQL injection, cross-site scripting, buffer overflow, uncontrolled format string
• Unstructured code, misuse of inheritance, lack of comments, violated naming convention
• Highly coupled component, duplicated code, index modified in loop, high cyclomatic complexity

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: architectural compliance, data flow, and transaction risk traced across technology layers – JSP, ASP.NET, APIs, Java, Web Services, EJB, PL/SQL, T-SQL, Hibernate, Spring, Struts, .NET, COBOL, messaging, and databases (Oracle, SQL Server, DB2, Sybase, IMS).]

Unit Level (Level 1) – developer level: code style & layout, expression complexity, code documentation, class or program design, basic coding standards.

Technology Level (Level 2) – development team level, single language/technology layer: intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

System Level (Level 3) – IT organization level: integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function point and effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take up to 20x as many fixes to correct.

[Charts: architecturally complex defects vs. code unit-level violations – 8% vs. 92% of total app defects, but 52% vs. 48% of total repair effort.]

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system – it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated in the system – it presents low risk.

Remediating violation 284 in Component B will have the greater impact on reducing risk and cost.
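The index described above can be sketched in code: score each violation by its severity times how many components its host component transitively impacts through the dependency graph. Everything here (the graph, the violation ids and severities, and the simple multiplicative weighting) is illustrative, not CAST's actual algorithm.

```python
def impacted_set(graph, start):
    """Transitively collect components reachable from `start`.
    An edge A -> B means a flaw in A can impact B."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

def propagated_risk_index(violations, graph):
    """Rank violations by severity x breadth of impact (illustrative weighting)."""
    scored = [(v["id"], v["severity"] * len(impacted_set(graph, v["component"])))
              for v in violations]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical dependency graph and violations, echoing the slide's example.
graph = {"A": ["C"], "B": ["C", "D", "E"], "D": ["F"], "E": ["F"]}
violations = [
    {"id": 342, "component": "A", "severity": 7},  # narrowly propagated
    {"id": 284, "component": "B", "severity": 7},  # widely propagated
]
print(propagated_risk_index(violations, graph))
```

With these inputs, violation 284 outranks 342 because Component B's flaw reaches four components while Component A's reaches one.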

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.

Transactions can be ranked by risk to prioritize their remediation; in the slide's example, transaction 1 poses the greatest risk. The Transaction Risk Index can be further tuned using the operational frequency of each transaction.
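A minimal sketch of the ranking just described: sum violation severities along each transaction's path, optionally weighting by how often the transaction runs. The transaction names, paths, severities, and frequencies are all hypothetical.

```python
def transaction_risk_index(transactions, frequency=None):
    """Rank transactions by summed violation severity along their path,
    optionally weighted by operational frequency."""
    scores = {}
    for name, violations in transactions.items():
        score = sum(sev for _, sev in violations)
        if frequency:
            score *= frequency.get(name, 1)
        scores[name] = score
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical paths: (violation id, severity) pairs encountered along each path.
transactions = {
    "checkout":  [(148, 9), (371, 9), (62, 8)],
    "view_cart": [(16, 7)],
    "login":     [(241, 5), (173, 4)],
}
ranked = transaction_risk_index(
    transactions, frequency={"checkout": 3, "view_cart": 10, "login": 5})
print(ranked)
```

Note how frequency weighting can promote a lightly flawed but heavily used transaction above a badly flawed but rare one.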

Risk-Based Prioritization

Violations ranked for remediation, refined by Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3

Static Analysis in US DoD

John Keane (2013):
• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter plot: production incidents (0–180) vs. Technical Quality Index (100–400), annotated with annual loss levels of +$2,000,000, $50,000, and $5,000. Large international investment bank, business-critical applications.]

Factors Affecting Structural Quality

2017 CRASH Report (request from info@castsoftware.com). Percentage of variation in structural quality scores accounted for by various factors:

Factor        # apps   Robust   Security   Perform   Change   Transfer
Technology     1606      20        3          16        26        7
  (Java-EE)
Industry        677       5        5           7         4        6
App type        510       1        1           1         1        1
Source          580       1        1           4
Shore           540
Maturity         72      28       25          15        24       12
Method          208      10        6          14         6
Team size       186       7        5           8         8
# of users      262       4        3           5         4

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Structural quality problems in production code create both Technical Debt (principal borrowed, interest on the debt) and Business Risk (liability from debt, opportunity cost).

Principal: the cost of fixing problems remaining in the code after release that must be remediated.

Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Liability: business costs related to outages, breaches, corrupted data, etc.

Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
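The principal can be estimated along the lines Curtis et al. describe: count violations by severity tier, assume only a fraction must be fixed, and multiply by repair hours and labor rate. The tiers, fractions, hours, and rate below are made-up illustrations, not the paper's calibrated values.

```python
def technical_debt_principal(violation_counts, fix_fraction, hours_per_fix, rate_per_hour):
    """Estimate technical-debt principal: per severity tier, the violations
    assumed to need fixing, times repair hours, times labor rate."""
    return sum(
        violation_counts[tier] * fix_fraction[tier] * hours_per_fix[tier] * rate_per_hour
        for tier in violation_counts
    )

# Illustrative inputs only.
counts   = {"high": 120, "medium": 400, "low": 900}   # violations found
fraction = {"high": 0.50, "medium": 0.25, "low": 0.10}  # share judged must-fix
hours    = {"high": 2.0, "medium": 1.0, "low": 0.5}     # avg repair hours
print(technical_debt_principal(counts, fraction, hours, rate_per_hour=75))
```

The structure matters more than the numbers: the estimate is dominated by whatever fraction of low-severity violations you decide must actually be fixed.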

Tech Debt by Quality Factor

[Chart: distribution of technical debt across health factors – Transferability 40%, Changeability 30%, Robustness 18%, Security 7%.]

70% of technical debt is in IT cost (Transferability, Changeability). 30% of technical debt is in business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models – enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort – Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3


Best Practices for Analysis

1) Evaluate demographics

2) Segment baselines

3) Beware sampling effects

4) Understand variation

5) Investigate distributions

6) Account for maturity effects

7) Beware external data sets

1. Evaluate Demographics

[Chart: productivity in function points by application type – routine reports, COBOL apps, web-based apps, custom ERP, and innovative mobile apps.]

2. Segment Baselines

Segment                   Year   Projects   Productivity
Total Corporate           1981      28         2342
                          1980      21         1939
Telecommunications        1981      14         1811
                          1980      12         1458
Engineering & Defense     1981       8         2965
                          1980       6         2739
Business Applications     1981       6         3054
                          1980       3         1813

Multiple baselines are usually the most valid.

3. Beware Sampling Effects

[Chart: baseline productivity in lines of code per person-year (0–2,500) for 1980, 1981, and 1982; annotations contrast a baseline sampled from lots of small projects with one sampled from lots of large projects.]

4. Understand Variation

Product factors: timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size.

Project factors: concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation.

Variation in productivity explained: project factors 49% (R²project = .49), product factors 16% (R²product = .16), not explained 35% (R²model = .65).

Vosburgh, J., Curtis, B., et al. (1984), Proceedings of ICSE

5. Investigate Distributions

[Chart: relative productivity (% of mean, from -100% to +250%) by software engineering practices usage – low, medium, high.]

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution for External Data Sets

[Chart: productivity vs. size for an external dataset (DACS) and our own data ("Us").]

Before comparing against an external data set, check its data definitions, data collection, data validity, data age, and demographics.

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects

2) Iterative and Agile measurement

3) Team Software Process measurement

Measuring Project Progress

[Chart: planned rate of code growth vs. actual code growth (0–300K), Jan–Jun, with the planned completion of coding marked.] "The measures say we are on schedule."

[Same chart, extended: actual code growth keeps rising past the planned completion of coding.] "But the code keeps growing. We should have measured the requirements growth."

[Chart: total, open, and closed defect reports (0–150), Jan–Jun.] "And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements affected and person-hours of work or rework, by month, for requirements added, changed, and deleted.]

Month   Added   Changed   Deleted
Feb       2       20         1
Mar       2       11         1
Apr      45       35         3
May      15       61         1
Jun       1       20         4
Jul      15       15         0
Aug      18       0.5        2

Phase Defect Removal Model

(All densities in defects per KSLOC.)

PHASE         Escapes from     Defect      Subtotal   Removal          Defect    Escapes at
              previous phase   injection              effectiveness    removal   phase exit
Rqmts              0              1.2        1.2          0%              0          1.2
Design             1.2           18.0       19.2         76%            14.6         4.6
Code & UT          4.6           15.4       20.0         71%            14.2         5.8
Integration        5.8            0          5.8         67%             3.9         1.9
Sys Test           1.9            0          1.9         58%             1.1         0.8
Operation          0.8

Kan (1995), Metrics and Models in Software Quality Engineering
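The table's arithmetic is mechanical and easy to reproduce: escapes from the previous phase plus injected defects give the subtotal, removal effectiveness takes its cut, and the remainder escapes onward. A minimal sketch using the slide's own numbers:

```python
def phase_defect_model(phases):
    """Roll defect density (defects/KSLOC) through phases:
    escapes-in + injected = subtotal; removed = subtotal * effectiveness;
    the remainder escapes to the next phase. Rounding is for display only."""
    escapes, rows = 0.0, []
    for name, injected, effectiveness in phases:
        subtotal = escapes + injected
        removed = subtotal * effectiveness
        escapes = subtotal - removed
        rows.append((name, round(subtotal, 1), round(removed, 1), round(escapes, 1)))
    return rows

# (phase, defects injected /KSLOC, removal effectiveness) from the table above.
phases = [
    ("Rqmts",       1.2, 0.00),
    ("Design",     18.0, 0.76),
    ("Code & UT",  15.4, 0.71),
    ("Integration", 0.0, 0.67),
    ("Sys Test",    0.0, 0.58),
]
for row in phase_defect_model(phases):
    print(row)
```

Running this reproduces every row of the table, ending with 0.8 defects/KSLOC escaping into operation.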

SCRUM Board

• The Scrum board displays the path to completion for each story by showing task status.
• Story points are an estimate of days for each story.
• Agile estimating is by feature (story) rather than by task.
• 'Done' does not always mean all tasks are finished.
• Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

• The burndown chart displays story completion by day, indicating actual versus estimated progress.
• Velocity is the rate at which stories are completed; it indicates sustainable progress.
• Velocity results provide one source of historical data for improving estimates.
• Story points without the context of productivity factors are dangerous for estimating.
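One common way teams turn the sprint history above into numbers is a rolling-average velocity and a naive forecast of sprints remaining. The sprint figures below are hypothetical, and the forecast carries exactly the caveat the slide makes: velocity without context is a weak estimator.

```python
def velocity(completed_points_per_sprint, window=3):
    """Rolling average of completed story points over the last `window` sprints."""
    recent = completed_points_per_sprint[-window:]
    return sum(recent) / len(recent)

def sprints_remaining(backlog_points, completed_points_per_sprint):
    """Naive forecast of sprints left at current velocity."""
    return backlog_points / velocity(completed_points_per_sprint)

history = [18, 22, 20, 24, 22]          # hypothetical completed points per sprint
print(velocity(history))                 # rolling velocity: 22.0
print(sprints_remaining(110, history))   # sprints to burn a 110-point backlog: 5.0
```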

Analyzing Velocity

[Charts over a 20-day sprint: team velocity in story points per hour; hours per story point (Hrs/StryPt); item cycle times for stories and hours; and a burndown chart.]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in-progress, and released work over time; the horizontal gap between bands shows days to delivery, with outliers marked.]

Anderson (2010), Kanban

Trends Over Sprints

[Charts: story points delivered (0–80) and technical debt across sprints 1–12.]

Track and correlate multiple measures across sprints, and use them to build predictive models.

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects.

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring.

Recommended Books

Personal Software Process (PSP)

PSP 0 (Baseline Personal Process): current process, time recording, defect recording, defect type standards.
PSP 0.1: coding standard, size measurement, process improvement proposal.
PSP 1 (Personal Planning): size estimating, test report.
PSP 1.1: task planning, schedule planning.
PSP 2 (Personal Quality Management): code reviews, design reviews.
PSP 2.1: design templates.
PSP 3 (Cyclic Personal Process): cyclic development.

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals: R² values range from .70 to .87 for aggregated individual estimates, versus R² = .53 otherwise.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

[Chart: defects found by activity – personal reviews 33, compile 33, team reviews 19, unit test 15, test 14.]

Fagan (2005)

TSP Benefits

Dramatic reductions in:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods

2) Maturity-based measurement

3) Improvement program results

CMMI Maturity Transitions

Level 5, Optimizing: innovation management
Level 4, Quantitatively Managed: capability management
Level 3, Defined: process management
Level 2, Managed: project management
Level 1, Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5, Innovating: innovation management
Level 4, Optimized: capability management
Level 3, Standardized: organizational management
Level 2, Stabilized: work unit management
Level 1, Initial: inconsistent management

Maturity Level Transformation

Level 5: proactive, opportunistic improvements – empowered, agile culture
Level 4: process and results managed statistically – precision culture
Level 3: organization develops standard processes – common culture
Level 2: project managers stabilize projects – local culture
Level 1: ad hoc, inconsistent development practices – relationship culture

[Diagram annotations: organization, projects, individual; trust.]

Measurement by Level

Level 5 – Exploratory data: Δt = N × ROI
Level 4 – Predictive data: X̄ ± 1.96σ/√n
Level 3 – Process data: 8.3 defects per review
Level 2 – Management data: Effort = α·LOC^β
Level 1 – Haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
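Two of the formulas above in runnable form: the Level 4 predictive interval X̄ ± 1.96σ/√n, and a Level 2 parametric effort model Effort = α·LOC^β. The sample data and the α/β coefficients are illustrative (the coefficients are merely in the COCOMO II style, not calibrated values).

```python
import math

def mean_ci95(xs):
    """Level 4-style interval: mean +/- 1.96 * sigma / sqrt(n)."""
    n = len(xs)
    mean = sum(xs) / n
    sigma = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    half = 1.96 * sigma / math.sqrt(n)
    return mean - half, mean + half

def effort(kloc, alpha=2.94, beta=1.10):
    """Level 2-style model: Effort = alpha * size^beta (illustrative coefficients)."""
    return alpha * kloc ** beta

low, high = mean_ci95([8.3, 7.9, 8.8, 8.1, 8.4])   # e.g. defects per review
print(round(low, 2), round(high, 2))
print(round(effort(50), 1))  # estimated person-months for a 50 KLOC system
```

The point of the ladder is visible in the code: Level 2 fits a planning curve, while Level 4 quantifies how much a process measure can be expected to vary.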

Average Improvement Results

Improvement benefit          # Orgs   Annual median   Annual range
Productivity growth             4         35%            9–67%
Pre-test defect detection       3         22%            6–25%
Time to market                  2        ↓19%           15–23%
Field error reports             5        ↓39%           10–94%
Return on investment            5        5.0:1          4.1–8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed / budgeted cost of work scheduled; below 1 = behind, above 1 = ahead) by CMM maturity level 1–3, from an Air Force study of contractors.]

Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (budgeted cost of work performed / actual cost of work performed; below 1 = overrun, above 1 = underrun) by CMM maturity level 1–3, from an Air Force study of contractors.]

Flowe & Thordahl (1994)
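The two indices on these slides are standard earned-value ratios and can be computed directly; the project-to-date figures below are hypothetical.

```python
def schedule_performance_index(bcwp, bcws):
    """SPI = budgeted cost of work performed / budgeted cost of work scheduled.
    Above 1 means ahead of schedule, below 1 behind."""
    return bcwp / bcws

def cost_performance_index(bcwp, acwp):
    """CPI = budgeted cost of work performed / actual cost of work performed.
    Above 1 means under budget, below 1 an overrun."""
    return bcwp / acwp

# Hypothetical project-to-date figures, in $K.
bcwp, bcws, acwp = 450.0, 500.0, 600.0
print(schedule_performance_index(bcwp, bcws))  # 0.9  -> behind schedule
print(cost_performance_index(bcwp, acwp))      # 0.75 -> over budget
```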

Crosby's Cost of Quality Model

Project cost = performance + cost of quality; cost of quality = conformance (prevention + appraisal) + nonconformance.

Performance: generating plans and documentation; development – requirements, design, code, integration.

Appraisal: reviews (system, requirements, design, test plan, test scripts), walkthroughs, testing (first time), independent verification and validation (first time), audits.

Prevention: training, policies, procedures and methods, tools, planning, quality improvement projects, data gathering and analysis, root cause analysis, quality reporting.

Nonconformance: fixing defects, reworking any document, updating source code, re-reviews, re-tests, lab costs for re-tests, patches (internal and external), engineering changes, change control boards, external failures and fines, customer support, help desks.

Dion (1993); Crosby (1979), Quality Is Free
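Once effort is coded into these buckets, the cost-of-quality split is a one-liner; a minimal sketch, using an illustrative effort profile in the spirit of the Raytheon figures:

```python
def cost_of_quality(costs):
    """Express Crosby-style cost buckets (performance, prevention,
    appraisal, nonconformance) as percentages of total project cost."""
    total = sum(costs.values())
    return {bucket: round(100 * value / total, 1) for bucket, value in costs.items()}

# Illustrative effort split (person-months) for a low-maturity project.
split = cost_of_quality(
    {"performance": 34, "nonconformance": 41, "appraisal": 15, "prevention": 7})
print(split)
```

The interesting managerial quantity is the nonconformance share: rework that a higher-maturity process would convert back into performance.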

Raytheon's Cost of Quality

• Performance – cost of building it right the first time
• Nonconformance – cost of rework
• Appraisal – cost of testing
• Prevention – cost of preventing nonconformance

Year   Level   Perform   Nonconf   Appraise   Prevent
1988     1       34%       41%       15%        7%
1990     2       55%       18%       15%       12%
1992     3       66%       11%
1994     4       76%        6%       7.5%

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline (0–4x), 1988–1994.]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline (0–1.2), 1988–1994.]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual vs. budgeted cost by quarter, 1988–1994, on a -10% to +50% over-/under-run scale, with overruns shrinking toward budget over time.]

Haley (1995)

OMRON's Reuse Gains

[Chart: 1987–1994, steps (0–3,000) against % of reuse (0–100%).]

Sakamoto et al. (1996)


Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Sinha, M. (2006). Performance models: enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort: the Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3



1. Evaluate Demographics

[Chart: productivity in function points compared across application types: routine reports, COBOL apps, web-based apps, innovative mobile apps, and custom ERP.]

2. Segment Baselines

Segment                  Year   Projects   Productivity
Total Corporate          1981   28         2342
                         1980   21         1939
Telecommunications       1981   14         1811
                         1980   12         1458
Engineering & Defense    1981   8          2965
                         1980   6          2739
Business Applications    1981   6          3054
                         1980   3          1813

Multiple baselines are usually the most valid.
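The segmentation idea above is easy to express in code. A minimal sketch, with illustrative project records in the spirit of the table (not its actual data):

```python
# Hypothetical project records: (segment, productivity in LOC/person-year).
projects = [
    ("Telecom", 1811), ("Telecom", 1458),
    ("Engineering & Defense", 2965), ("Engineering & Defense", 2739),
    ("Business Applications", 3054), ("Business Applications", 1813),
]

def segment_baselines(records):
    """Average productivity per segment, plus the corporate total."""
    by_segment = {}
    for segment, productivity in records:
        by_segment.setdefault(segment, []).append(productivity)
    baselines = {s: sum(v) / len(v) for s, v in by_segment.items()}
    baselines["Total Corporate"] = sum(p for _, p in records) / len(records)
    return baselines

baselines = segment_baselines(projects)
```

Comparing a project only against its own segment's baseline avoids the apples-to-oranges comparisons a single corporate number invites.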

3. Beware Sampling Effects

[Chart: baseline productivity in lines of code per person-year for 1980-1982; annotations mark a baseline with lots of small projects versus one with lots of large projects.]

4. Understand Variation

Product factors: timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size (R² product = .16)

Project factors: concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation (R² project = .49)

Variation explained: project factors 49%, product factors 16%, not explained 35% (R² model = .65)

Vosburgh, J., Curtis, B. et al. (1984). Proceedings of ICSE 1984.

5. Investigate Distributions

[Chart: relative productivity (% of mean), ranging from -100% to +250%, plotted against low, medium, and high usage of software engineering practices.]

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution for External Data Sets

[Chart: productivity vs. size for an external (DACS) data set and our own.] Before comparing, check: data definitions, data collection, data validity, data age, demographics.

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Measuring Project Progress

[Chart: planned rate of code growth vs. actual code growth (0-300K), January through June, with the planned completion of coding marked.]

"The measures say we are on schedule."

"But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (total, open, closed) over the same period.]

"And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements added, changed, and deleted each month, February through August, plotted against person-hours of work or rework.]

Phase Defect Removal Model

PHASE         Escapes from       Defects injected   Subtotal   Removal         Defects removed   Escapes at phase
              previous phase     /KSLOC             /KSLOC     effectiveness   /KSLOC            exit /KSLOC
Rqmts         0                  1.2                1.2        0%              0                 1.2
Design        1.2                18.0               19.2       76%            14.6              4.6
Code & UT     4.6                15.4               20.0       71%            14.2              5.8
Integration   5.8                0                  5.8        67%            3.9               1.9
Sys Test      1.9                0                  1.9        58%            1.1               0.8
Operation     0.8

Kan (1995), Metrics and Models in Software Quality Engineering.
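The phase model rolls forward mechanically: each phase's candidate defects are the escapes from the previous phase plus newly injected defects, and removal effectiveness determines what escapes onward. A minimal sketch, with injection and effectiveness values consistent with the slide's model:

```python
def phase_defect_model(injection_per_ksloc, removal_effectiveness):
    """Roll defects forward phase by phase: escapes from the previous
    phase plus newly injected defects, minus those removed in-phase."""
    escapes = 0.0
    rows = []
    for phase, injected in injection_per_ksloc.items():
        subtotal = escapes + injected
        removed = subtotal * removal_effectiveness.get(phase, 0.0)
        escapes = subtotal - removed
        rows.append((phase, round(subtotal, 1), round(removed, 1), round(escapes, 1)))
    return rows

rows = phase_defect_model(
    {"Rqmts": 1.2, "Design": 18.0, "Code & UT": 15.4,
     "Integration": 0.0, "Sys Test": 0.0},
    {"Rqmts": 0.0, "Design": 0.76, "Code & UT": 0.71,
     "Integration": 0.67, "Sys Test": 0.58},
)
```

The final escape figure is what reaches operation, which is why small gains in early-phase removal effectiveness compound so strongly.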

SCRUM Board

• The scrum board displays the path to completion for each story by showing task status.
• Story points are an estimate of days for each story.
• Agile estimating is by feature (story) rather than by task.
• 'Done' does not always mean all tasks are finished.
• Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

• The burndown chart displays story completion by day, indicating actual versus estimated progress.
• Velocity is the rate at which stories are completed; it indicates sustainable progress.
• Velocity results provide one source of historical data for improving estimates.
• Story points without the context of productivity factors are dangerous for estimating.
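The velocity bookkeeping described above can be sketched in a few lines; the sprint history and backlog size here are invented for illustration:

```python
# Hypothetical sprint history: story points completed per sprint.
completed_points = [21, 18, 24, 19, 23]

def velocity(history, window=3):
    """Rolling average of recently completed story points: the team's velocity."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def sprints_remaining(backlog_points, history):
    """Naive forecast: backlog size divided by current velocity."""
    return backlog_points / velocity(history)

v = velocity(completed_points)  # average of the last three sprints
```

The forecast is only as good as the stability of the history behind it, which is the slide's caution about using story points without context.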

Analyzing Velocity

[Charts over a 20-day sprint: burndown chart of stories and hours, story points per hour, hours per story point, and item cycle times.]

Measuring and Managing Flow

[Cumulative flow diagram: bands of backlogged, in-progress, and released items over time; the horizontal gap between bands shows days to delivery, with outliers visible.]

Anderson (2010), Kanban.
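The bands of a cumulative flow diagram are just stacked per-day counts of items in each state. A small sketch with made-up daily counts:

```python
# Hypothetical daily counts of items in each state; a cumulative flow
# diagram stacks these per day: released, then in progress, then backlogged.
daily_states = [
    {"backlogged": 20, "in_progress": 4, "released": 0},
    {"backlogged": 17, "in_progress": 6, "released": 1},
    {"backlogged": 14, "in_progress": 6, "released": 4},
]

def cumulative_flow(days):
    """Per-day stacked totals (released, released+in progress, +backlogged)."""
    bands = []
    for day in days:
        released = day["released"]
        in_progress = released + day["in_progress"]
        backlogged = in_progress + day["backlogged"]
        bands.append((released, in_progress, backlogged))
    return bands

bands = cumulative_flow(daily_states)
```

Plotting these three stacked series against time yields the diagram; a widening in-progress band is the classic signal of too much work in progress.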

Trends Over Sprints

[Charts: story points delivered and technical debt tracked across 12 sprints.]

• Track and correlate multiple measures across sprints.
• Build predictive models from the correlated history.

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects.

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring.

Recommended Books

Personal Software Process (PSP)

PSP0 (Baseline Personal Process): current process, time recording, defect recording, defect type standards
PSP0.1: coding standard, size measurement, process improvement proposal
PSP1 (Personal Planning): size estimating, test report
PSP1.1: task planning, schedule planning
PSP2 (Personal Quality Management): code reviews, design reviews
PSP2.1: design templates
PSP3 (Cyclic Personal Process): cyclic development

Humphrey (1997), Introduction to the Personal Software Process.

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process.

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals. R²s for aggregated individual estimates range from .70 to .87, versus R² = .53 for a single team-level estimate.
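The bottom-up aggregation the slide argues for is trivial to express: each engineer estimates their own tasks and the project total is the sum. Names and hours below are hypothetical:

```python
# Hypothetical task estimates in hours, one list per engineer. Summing
# personal estimates beats averaging one team-level guess, per the PSP data.
individual_estimates = {
    "dev_a": [4.0, 6.5, 3.0],
    "dev_b": [8.0, 2.5],
    "dev_c": [5.0, 5.0, 4.5],
}

def bottom_up_estimate(estimates):
    """Aggregate personal task estimates into a project total."""
    return sum(sum(tasks) for tasks in estimates.values())

total = bottom_up_estimate(individual_estimates)
```

Each estimator's biases tend to be stable for their own work, so errors partially cancel in the sum rather than compounding in a single averaged guess.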

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development; consequently, test changes from defect detection into correctness verification.

Defects found by: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%.

Fagan (2005).

TSP Benefits

Dramatic reductions in:
• Delivered defects
• Days per KLOC
• Schedule deviation
• Test defects/KLOC
• Variation in results

Humphrey (2006), TSP: Leading a Development Team.

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 (Optimizing): innovation management
Level 4 (Quantitatively Managed): capability management
Level 3 (Defined): process management
Level 2 (Managed): project management
Level 1 (Initial): inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI.

Enhanced Maturity Framework

Level 5 (Innovating): innovation management
Level 4 (Optimized): capability management
Level 3 (Standardized): organizational management
Level 2 (Stabilized): work unit management
Level 1 (Initial): inconsistent management

Maturity Level Transformation

Level 5: opportunistic, proactive improvements; empowered, agile culture
Level 4: process and results managed statistically; precision culture
Level 3: organization develops standard processes; common culture
Level 2: project managers stabilize projects; local culture
Level 1: ad hoc, inconsistent development practices; relationship culture

[Diagram annotations: Individual, Projects, Organization, Trust]

Measurement by Level

Level 5: exploratory data (Δt = N × ROI)
Level 4: predictive data (mean ± 1.96σ/√n)
Level 3: process data (8.3 defects per review)
Level 2: management data (Effort = α·LOC^β)
Level 1: haphazard data ("about a million lines")

A measure-maturity mismatch slows improvement and creates dysfunction.
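The Level 4 formula is the familiar normal-approximation confidence interval for a mean. A sketch with invented sample values:

```python
import math

def mean_ci(samples, z=1.96):
    """Level 4-style predictive interval: mean +/- z * sigma / sqrt(n),
    using the sample standard deviation."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    half = z * math.sqrt(var) / math.sqrt(n)
    return mean - half, mean + half

# Hypothetical data: defects found per review across five reviews.
lo, hi = mean_ci([8.1, 7.4, 9.0, 8.6, 7.9])
```

An organization at Level 4 compares new observations against intervals like this; at lower maturity the underlying data is too inconsistent for the interval to mean anything, which is the mismatch the slide warns about.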

Average Improvement Results

Improvement benefit         Orgs   Annual median   Annual range
Productivity growth         4      35%             9% - 67%
Pre-test defect detection   3      22%             6% - 25%
Time to market              2      19% decrease    15% - 23%
Field error reports         5      39% decrease    10% - 94%
Return on investment        5      5.0:1           4:1 - 8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed / budgeted cost of work scheduled) by CMM maturity level, 1-3; values below 1.0 are behind schedule, above 1.0 ahead.]

Air Force study of contractors. Flowe & Thordahl (1994).

Cost Performance

[Chart: cost performance index (budgeted cost of work performed / actual cost of work performed) by CMM maturity level, 1-3; values below 1.0 are overrun, above 1.0 underrun.]

Air Force study of contractors. Flowe & Thordahl (1994).
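The two indices on these slides are standard earned-value ratios and compute directly from the three cost figures. The dollar amounts below are illustrative:

```python
def schedule_performance_index(bcwp, bcws):
    """SPI = budgeted cost of work performed / budgeted cost of work scheduled."""
    return bcwp / bcws

def cost_performance_index(bcwp, acwp):
    """CPI = budgeted cost of work performed / actual cost of work performed."""
    return bcwp / acwp

# Hypothetical project: $90K of planned work completed, $100K was
# scheduled by now, and $80K has actually been spent.
spi = schedule_performance_index(90_000, 100_000)  # below 1.0: behind schedule
cpi = cost_performance_index(90_000, 80_000)       # above 1.0: under budget
```

The same project can thus be behind schedule yet under budget, which is why the two charts are reported separately by maturity level.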

Crosby's Cost of Quality Model

Project cost divides into performance cost and cost of quality; cost of quality divides into conformance (prevention and appraisal) and nonconformance.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free.

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   CMM Level   Performance   Nonconformance   Appraisal   Prevention
1988   1           34%           41%              15%         7%
1990   2           55%           18%              15%         12%
1992   3           66%           11%
1994   4           76%           6%               7.5%

Dion (1993); Haley (1995).

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988-1994.]

Lydon (1995).

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988-1994.]

Lydon (1995).

Raytheon's Cost Predictability

[Chart: actual vs. budgeted cost by quarter, 1988-1994, plotted as percent over-run/under-run from +50% to -10%.]

Haley (1995).

OMRON's Reuse Gains

[Chart: steps produced (0-3000) and percentage of reuse (0-100%), 1987-1994.]

Sakamoto et al. (1996).

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." (Watts Humphrey)

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in the CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus: process improvement via Six Sigma. Product focus: product improvement via Design for Six Sigma. A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits. [Funnel: huge benefits, large benefits, good benefits, okay benefits, small benefits... now what?] Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." (Jackson, D., 2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." (Spinellis, D., 2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1200+ coding and architectural rules, plus application meta-data.

Quality measurements (health factors): transferability, changeability, robustness, performance, security.

Detected violations (examples):
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: a transaction's data flow crosses JSP and ASP.NET user-interface code; Java, .NET, and COBOL business logic built on Hibernate, Spring, Struts, EJB, web services, APIs, and messaging; and Oracle, SQL Server, DB2, Sybase, and IMS data layers (PL/SQL, T-SQL). Architectural compliance and transaction risk span the whole stack.]

Level 1 (Unit), developer level: single language/technology layer; code style and layout; expression complexity; code documentation; class or program design; basic coding standards.

Level 2 (Technology), development team level: intra-technology architecture; intra-layer dependencies; inter-program invocation; security vulnerabilities.

Level 3 (System), IT organization level: integration quality; architectural compliance; risk propagation; application security; resiliency checks; transaction integrity; function points; effort estimation; data access control; SDK versioning; calibration across technologies.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take roughly 20x as many fixes to correct.

[Charts: architecturally complex defects are 8% of total application defects (code unit-level violations are the other 92%), yet account for 52% of total repair effort (unit-level violations, 48%).]

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated: it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

• Customers care most about the dependability of their transactions.
• The risk of a transaction is determined by the number and severity of violations along its path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.
• Transactions can be ranked by risk to prioritize their remediation; in the example, transaction 1 poses the greatest risk.
• The Transaction Risk Index can be further tuned using the operational frequency of each transaction.
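A ranking of this kind can be sketched as a sum of violation severities along each transaction's path, optionally weighted by operational frequency. The transaction names, severities, and frequencies below are invented, and this is a simplification of the index the slide describes:

```python
# Hypothetical data: each transaction maps to the severities (1-9) of the
# violations found along its component path.
violations = {"login": [9, 7], "checkout": [9, 8, 5], "report": [3]}

def transaction_risk_index(paths, frequency=None):
    """Rank transactions by total violation severity along their path,
    optionally weighted by how often each transaction runs."""
    scores = {}
    for name, severities in paths.items():
        score = sum(severities)
        if frequency:
            score *= frequency.get(name, 1)
        scores[name] = score
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranking = transaction_risk_index(violations)
weighted = transaction_risk_index(violations, frequency={"report": 100})
```

Frequency weighting can reorder the list entirely: a mildly flawed transaction that runs thousands of times a day may outrank a badly flawed one that runs monthly.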

Risk-Based Prioritization

Violations are ordered by health factor severity, then by their Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
148         Security          9
371         Robustness        9
062         Performance       8
284         Robustness        7
016         Changeability     7
241         Security          5
173         Transferability   4
359         Transferability   3
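The ordering amounts to a multi-key sort. A minimal sketch; the records and tie-breaker values are hypothetical stand-ins for the table's flags, not data from the slide:

```python
# Hypothetical violation records: severity from the table, plus invented
# propagated-risk scores and high-risk-transaction flags as tie-breakers.
violations = [
    {"id": 173, "factor": "Transferability", "severity": 4, "propagated": 2, "in_hot_path": False},
    {"id": 148, "factor": "Security",        "severity": 9, "propagated": 8, "in_hot_path": True},
    {"id": 371, "factor": "Robustness",      "severity": 9, "propagated": 5, "in_hot_path": False},
]

def prioritize(records):
    """Order violations by severity, then breadth of propagation, then
    whether they sit on a high-risk transaction path."""
    return sorted(
        records,
        key=lambda v: (v["severity"], v["propagated"], v["in_hot_path"]),
        reverse=True,
    )

ordered = prioritize(violations)
```

Tuple keys make the tie-breaking explicit: two severity-9 violations are separated by how widely their impact propagates before the transaction flag is even consulted.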

Static Analysis in US DoD

John Keane (2013):
• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0-180) vs. Technical Quality Index for business-critical applications at a large international investment bank; the lowest-scoring applications show over $2,000,000 in annual losses, mid-range applications $50,000, and the highest-scoring $5,000.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request it from info@castsoftware.com):

Factor       # apps   Robustness   Security   Performance   Changeability   Transferability
Technology   1606     20           3          16            26              7
Industry     677      5            5          7             4               6
App type     510      1            1          1             1               1
Source       580      1            1          4
Shore        540
Maturity     72       28           25         15            24              12
Method       208      10           6          14            6
Team size    186      7            5          8             8
# of users   262      4            3          5             4

(Technology figures are dominated by Java-EE applications; blank cells were not reported.)

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. It arises from structural quality problems in production code.

• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Liability (business risk): business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software.
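The principal can be estimated by bucketing violations by severity, assuming only a fraction of each bucket will actually be fixed, and costing each fix. This sketch is in the spirit of the Curtis et al. (2012) approach; every parameter value here is an illustrative assumption, not a published figure:

```python
def technical_debt_principal(counts, fix_fraction, hours_per_fix, rate_per_hour=75):
    """Principal = sum over severity buckets of
    (# violations x fraction to fix x hours per fix x cost per hour)."""
    return sum(
        counts[s] * fix_fraction[s] * hours_per_fix[s] * rate_per_hour
        for s in counts
    )

# Hypothetical application: violation counts and remediation parameters
# chosen purely for illustration.
principal = technical_debt_principal(
    counts={"high": 120, "medium": 640, "low": 2400},
    fix_fraction={"high": 0.5, "medium": 0.25, "low": 0.1},
    hours_per_fix={"high": 2.0, "medium": 1.0, "low": 0.5},
)
```

The estimate is most sensitive to the fix-fraction assumptions, which is where organizations calibrate the model against their own remediation history.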

Tech Debt by Quality Factor

Technical debt principal by health factor: transferability 40%, changeability 30%, robustness 18%, security 7%, performance 5%.

• 70% of technical debt is in IT cost (transferability, changeability).
• 30% of technical debt is in business risk (robustness, performance, security).
• Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software.

Join CISQ: www.it-cisq.org


Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2

PHASE

Escapes Previous Phase

KSLOC

Defect Injection

KSLOC

Subtotal

KSLOC

Removal

Effective-ness

Defect Removal

KSLOC

Escapes at phase exit KSLOC

Rqmts

0

12

12

0

0

12

Design

12

180

192

76

146

46

Code amp UT

46

154

200

71

142

58

Inte-gration

58

0

58

67

39

19

Sys Test

19

0

19

58

11

08

Opera-tion

08

Requirements Added Requirements Changed Requirements Deleted
Feb 2 20 1
Mar 2 11 1
Apr 45 35 3
May 15 61 1
Jun 1 20 4
Jul 15 15 0
Aug 18 05 2
Feb Feb Feb
Mar Mar Mar
Apr Apr Apr
May May May
Jun Jun Jun
Jul Jul Jul
Aug Aug Aug
Page 21: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

2. Segment Baselines

Segment                  Year   Projects   Productivity
Total Corporate          1981      28         2342
Total Corporate          1980      21         1939
Telecommunications       1981      14         1811
Telecommunications       1980      12         1458
Engineering & Defense    1981       8         2965
Engineering & Defense    1980       6         2739
Business Applications    1981       6         3054
Business Applications    1980       3         1813

Multiple baselines are usually the most valid.

3. Beware Sampling Effects

[Chart: annual baselines for 1980-1982, lines of code per person-year (0-2500). A baseline computed from lots of small projects differs sharply from one computed from lots of large projects.]

4. Understand Variation

Product factors: timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size.

Project factors: concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation.

Variance in productivity explained: project factors 49%, product factors 16%, not explained 35% (R²product = .16, R²project = .49, R²model = .65).

J. Vosburgh, B. Curtis et al. (1984), Proceedings of ICSE 1984.

5. Investigate Distributions

[Chart: relative productivity (% of mean), ranging from -100% to 250%, plotted against low, medium, and high usage of software engineering practices.]

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution for External Data Sets

[Chart: productivity versus size, an external DACS data set compared with our own.]

Issues to check before using an external data set: data definitions, data collection, data validity, data age, demographics.

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Measuring Project Progress

[Chart: planned rate of code growth versus actual code growth, 0K-300K lines, Jan-Jun, with completion of coding marked.]

"The measures say we are on schedule."

[Chart: the same plot later in the project; actual code growth keeps rising past the planned completion of coding.]

"But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (total, open, closed), 0-150, Jan-Jun, against the completion of coding.]

"And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements affected each month, and the resulting person-hours of work or rework.]

Month   Requirements   Requirements   Requirements
        added          changed        deleted
Feb         2              20             1
Mar         2              11             1
Apr        45              35             3
May        15              61             1
Jun         1              20             4
Jul        15              15             0
Aug        18              05             2

Phase Defect Removal Model

PHASE          Escapes from     Defects       Subtotal    Removal         Defects      Escapes at
               previous phase   injected      (/KSLOC)    effectiveness   removed      phase exit
               (/KSLOC)         (/KSLOC)                                  (/KSLOC)     (/KSLOC)
Rqmts               0              1.2           1.2          0%             0             1.2
Design              1.2           18.0          19.2         76%            14.6           4.6
Code & UT           4.6           15.4          20.0         71%            14.2           5.8
Integration         5.8            0             5.8         67%             3.9           1.9
Sys Test            1.9            0             1.9         58%             1.1           0.8
Operation           0.8

Kan (1995), Metrics and Models in Software Quality Engineering.
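The phase arithmetic above is simple to mechanize: each phase receives the escapes of the previous phase, adds newly injected defects, and removes a fraction given by its removal effectiveness. A minimal sketch, using the per-KSLOC figures from the table:

```python
# Phase defect-removal model (after Kan, 1995): defects flow through
# the lifecycle, and each phase removes a fraction of what it sees.
# All figures are defects per KSLOC.

phases = [
    # (name, injected, removal effectiveness)
    ("Rqmts",       1.2, 0.00),
    ("Design",     18.0, 0.76),
    ("Code & UT",  15.4, 0.71),
    ("Integration", 0.0, 0.67),
    ("Sys Test",    0.0, 0.58),
]

def escapes_at_release(phases):
    escaped = 0.0
    for name, injected, effectiveness in phases:
        subtotal = escaped + injected       # escapes in + newly injected
        removed = subtotal * effectiveness  # found in this phase
        escaped = subtotal - removed        # passed to the next phase
    return escaped

print(round(escapes_at_release(phases), 1))  # 0.8 defects/KSLOC reach operation
```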

SCRUM Board

• Scrum board displays the path to completion for each story by showing task status.
• Story points are an estimate of days for each story.
• Agile estimating is by feature (story) rather than by task.
• 'Done' does not always mean all tasks are finished.
• Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

• Burndown chart displays story completion by day.
• Burndown chart indicates actual versus estimated progress.
• Velocity is the rate at which stories are completed.
• Velocity indicates sustainable progress.
• Velocity results provide one source of historical data for improving estimates.
• Story points without the context of productivity factors are dangerous for estimating.
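To make the velocity bullets concrete, here is a small sketch with made-up sprint data (the numbers are illustrative, not from the deck): velocity as average completed story points per sprint, and the naive burn-down forecast it enables.

```python
import math

# Hypothetical sprint history: story points completed in each sprint.
completed = [21, 18, 24, 20]

# Velocity: the rate at which stories are completed.
velocity = sum(completed) / len(completed)

# Naive forecast: sprints needed to burn down the remaining backlog.
# This is exactly the kind of estimate that is dangerous without the
# context of productivity factors.
backlog = 83
sprints_remaining = math.ceil(backlog / velocity)

print(velocity, sprints_remaining)  # 20.75 4
```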

Analyzing Velocity

[Charts: team velocity as story points per hour over 20 days; hours per story point over 20 days; item cycle times; and a burndown chart tracking stories, hours, and hours per story point over 20 days.]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in progress, and released items over time; the horizontal distance between bands shows days to delivery, with outliers marked.]

Anderson (2010), Kanban.
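A cumulative flow diagram also supports a Little's Law calculation: average days to delivery equal average work in progress divided by average throughput. A sketch with hypothetical board snapshots:

```python
# Hypothetical daily snapshots of a Kanban board.
wip_per_day = [12, 14, 13, 15, 11, 13, 12, 14]  # items in progress each day
delivered_per_day = [2, 3, 2, 2, 3, 2, 3, 3]    # items released each day

avg_wip = sum(wip_per_day) / len(wip_per_day)
throughput = sum(delivered_per_day) / len(delivered_per_day)

# Little's Law: average cycle time = average WIP / average throughput.
cycle_time_days = avg_wip / throughput
print(cycle_time_days)
```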

Trends Over Sprints

[Charts: story points delivered per sprint (0-80) and technical debt per sprint (0-4), each over 12 sprints.]

• Track and correlate multiple measures across sprints
• Predictive models

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects.

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring.

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: current process, time recording, defect recording, defect type standards
• PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning
• PSP 1: size estimating, test report
• PSP 1.1: task planning, schedule planning

Personal Quality Management
• PSP 2: code reviews, design reviews
• PSP 2.1: design templates

Cyclic Personal Process
• PSP 3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process.

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process.

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals: R²s for combined individual estimates range from .70 to .87, versus R² = .53 otherwise.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development; consequently, test changes from defect detection into correctness verification.

[Chart of defects found by activity: personal reviews 33, compile 33, team reviews 19, unit test 15, test 14.]

Fagan & Davis (2005).

TSP Benefits

Dramatic reductions in:
• Delivered defects
• Days per KLOC
• Schedule deviation
• Test defects/KLOC
• Variation in results

Humphrey (2006), TSP: Leading a Development Team.

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5, Optimizing: innovation management
Level 4, Quantitatively Managed: capability management
Level 3, Defined: process management
Level 2, Managed: project management
Level 1, Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI.

Enhanced Maturity Framework

Level 5, Innovating: innovation management
Level 4, Optimized: capability management
Level 3, Standardized: organizational management
Level 2, Stabilized: work unit management
Level 1, Initial: inconsistent management

Maturity Level Transformation

Level 5: proactive, opportunistic improvements; empowered, agile culture
Level 4: process and results managed statistically; precision culture
Level 3: organization develops standard processes; common culture
Level 2: project managers stabilize projects; local culture
Level 1: ad hoc, inconsistent development practices; relationship culture

The transformation spans the organization, projects, and individuals, and is built on trust.

Measurement by Level

Level 5 (exploratory data): Δt = N × ROI
Level 4 (predictive data): X̄ ± 1.96σ/√n
Level 3 (process data): 8.3 defects per review
Level 2 (management data): Effort = α·LOC^β
Level 1 (haphazard data): "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
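The Level 4 entry is an ordinary confidence interval on a stable process baseline. A small worked example with made-up inspection data (not from the deck):

```python
import math

# Hypothetical baseline: defects found in each of 10 recent reviews.
defects_per_review = [6, 9, 8, 10, 7, 9, 8, 11, 7, 8]

n = len(defects_per_review)
mean = sum(defects_per_review) / n
variance = sum((x - mean) ** 2 for x in defects_per_review) / (n - 1)
sigma = math.sqrt(variance)

# Level 4 style predictive statement: mean +/- 1.96 * sigma / sqrt(n).
half_width = 1.96 * sigma / math.sqrt(n)
print(round(mean, 1), round(mean - half_width, 1), round(mean + half_width, 1))
# 8.3 7.4 9.2
```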

Average Improvement Results

Improvement benefit          Orgs   Annual median   Annual range
Productivity growth            4        35%            9-67%
Pre-test defect detection      3        22%            6-25%
Time to market                 2       ↓19%           15-23%
Field error reports            5       ↓39%           10-94%
Return on investment           5       5.0:1          4:1 - 8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed ÷ budgeted cost of work scheduled; below 1 = behind, above 1 = ahead) rising across CMM maturity levels 1-3.]

Air Force study of contractors. Flowe & Thordahl (1994).

Cost Performance

[Chart: cost performance index (budgeted cost of work performed ÷ actual cost of work performed; below 1 = overrun, above 1 = underrun) rising across CMM maturity levels 1-3.]

Air Force study of contractors. Flowe & Thordahl (1994).

Crosby's Cost of Quality Model

Project cost splits into performance cost and cost of quality; cost of quality splits into conformance (prevention and appraisal) and nonconformance.

• Performance: generating plans and documentation; development (requirements, design, code, integration).
• Appraisal: reviews (system, requirements, design, test plan, test scripts), walkthroughs, first-time testing, first-time independent verification and validation, audits.
• Prevention: training, policies, procedures and methods, tools, planning, quality improvement projects, data gathering and analysis, root cause analysis, quality reporting.
• Nonconformance: fixing defects, reworking any document, updating source code, re-reviews, re-tests, lab costs for re-tests, patches (internal and external), engineering changes, change control boards, external failures and fines, customer support, help desks.

Dion (1993); Crosby (1979), Quality Is Free.

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   CMM level   Performance   Nonconformance   Appraisal   Prevention
1988       1           34%            41%             15%          7%
1990       2           55%            18%             15%         12%
1992       3           66%            11%              -            -
1994       4           76%             6%             7.5%          -

Dion (1993); Haley (1995).

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, rising toward 4x between 1988 and 1994.]

Lydon (1995).

Raytheon's Cost Reduction

[Chart: software cost reduction relative to the 1989 baseline, 1988-1994.]

Lydon (1995).

Raytheon's Cost Predictability

[Chart: actual versus budgeted cost by quarter, 1988-1994; over-runs of up to 40-50% converge toward budget as maturity improves.]

Haley (1995).

OMRON's Reuse Gains

[Chart: program size in steps (0-3000) and % of reuse (0-100%), 1987-1994.]

Sakamoto et al. (1996).

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. Compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus: process improvement, Six Sigma. Product focus: product improvement, Design for 6σ. Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits (huge, large, good, okay, small). Ultimately we run out of projects with significant enough benefits to continue the program; so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded ... testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

"... a failure to satisfy a non-functional requirement can be critical, even catastrophic ... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability ... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

Jackson, D. (2009); Spinellis, D. (2006).

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding & architectural rules plus application meta-data, producing quality measurements for transferability, changeability, robustness, performance, and security.

Detected violations (examples):
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, Javascript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, Netweaver, Tibco, Business Objects, plus a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: a multi-layer application (JSP, ASP.NET, APIs, Java, .NET, Struts, Spring, Hibernate, EJB, PL/SQL, T-SQL, COBOL, messaging; Oracle, SQL Server, DB2, Sybase, IMS) with data flow, transaction risk, and architectural compliance cutting across layers.]

Level 1, Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2, Technology (development team level): single language/technology layer, intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function point and effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Take 20x as many fixes to correct
• Architecturally complex defects are 8% of total application defects but 52% of total repair effort; code unit-level violations are the remaining 92% of defects and 48% of effort

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated: it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
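A propagated-risk ranking of this kind can be sketched as severity weighted by how far a violation's impact reaches through the dependency graph. The component names, severities, and scoring rule below are illustrative assumptions, not CAST's actual formula:

```python
# Hypothetical dependency graph: component -> components that depend on it.
dependents = {
    "A":    ["UI1"],
    "B":    ["Svc1", "Svc2"],
    "Svc1": ["UI1"],
    "Svc2": ["UI2"],
    "UI1":  [],
    "UI2":  [],
}

def impact_breadth(component):
    """Count components transitively affected by a flaw in `component`."""
    seen, stack = set(), [component]
    while stack:
        for d in dependents[stack.pop()]:
            if d not in seen:
                seen.add(d)
                stack.append(d)
    return len(seen)

# Violations: id -> (component, severity on a 1-9 scale).
violations = {342: ("A", 7), 284: ("B", 7)}

# Propagated Risk Index sketch: severity times breadth of impact.
pri = {vid: sev * impact_breadth(comp) for vid, (comp, sev) in violations.items()}
ranked = sorted(pri, key=pri.get, reverse=True)
print(ranked)  # [284, 342]: violation 284 presents the wider risk
```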

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.

• Customers care most about the dependability of their transactions.
• The risk of a transaction is determined by the number and severity of violations along its path.
• Transactions can be ranked by risk to prioritize their remediation.
• The Transaction Risk Index can be further tuned using the operational frequency of each transaction.

[Diagram: five transaction paths through the layers; transaction 1 poses the greatest risk.]
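The Transaction Risk Index can be sketched the same way: sum violation severities along each transaction's path, optionally weighted by how often the transaction runs. All names and numbers below are illustrative assumptions:

```python
# Hypothetical transactions: severities (1-9) of the violations on the
# components along each path, plus executions per day.
transactions = {
    "T1": {"severities": [9, 7, 7, 5], "runs_per_day": 1200},
    "T2": {"severities": [4, 3],       "runs_per_day": 5000},
    "T3": {"severities": [8, 2],       "runs_per_day": 300},
}

def transaction_risk(tx):
    # Base risk: number and severity of violations along the path.
    base = sum(tx["severities"])
    # Tuning: weight by operational frequency.
    return base * tx["runs_per_day"]

ranked = sorted(transactions, key=lambda t: transaction_risk(transactions[t]),
                reverse=True)
print(ranked)  # ['T2', 'T1', 'T3']
```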

Risk-Based Prioritization

Violations are prioritized by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution.

Violation   Health factor      Severity
148         Security              9
371         Robustness            9
062         Performance           8
284         Robustness            7
016         Changeability         7
241         Security              5
173         Transferability       4
359         Transferability       3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013).

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents versus Technical Quality Index (100-400) for business-critical applications at a large international investment bank; annual losses fall from over $2,000,000 to $50,000 to $5,000 as quality rises.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (blank cells were not shown in the source chart; the Technology row is noted against Java-EE):

Factor        # apps   Robust.   Security   Perform.   Change.   Transfer.
Technology     1606      20%        3%         16%        26%        7%
Industry        677       5%        5%          7%         4%        6%
App type        510       1%        1%          1%         1%        1%
Source          580       1%        1%          4%
Shore           540
Maturity         72      28%       25%         15%        24%       12%
Method          208      10%        6%         14%         6%
Team size       186       7%        5%          8%         8%
# of users      262       4%        3%          5%         4%

2017 CRASH Report; request from info@castsoftware.com.

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release, a component of the cost of ownership. Structural quality problems in production code constitute the debt.

• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e. higher maintenance costs, greater resource usage, etc.
• Liability (business risk from the debt): business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software.
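Curtis, Sappidi & Szynkarski (2012) estimate the principal from the violations found by static analysis. A minimal sketch of that style of computation follows; the counts, fix fractions, hours, and labor rate are illustrative assumptions, not the paper's calibrated parameters:

```python
# Technical-debt principal sketch: for each severity band, assume a
# fraction of violations must be fixed, then multiply by average fix
# effort and a loaded labor rate. All parameters are assumptions.

violations = {"high": 120, "medium": 480, "low": 900}   # counts by severity
must_fix = {"high": 0.50, "medium": 0.25, "low": 0.10}  # share to remediate
hours_per_fix = 1.0                                      # average effort
cost_per_hour = 75.0                                     # USD, loaded rate

principal = sum(
    count * must_fix[sev] * hours_per_fix * cost_per_hour
    for sev, count in violations.items()
)
print(principal)  # 20250.0
```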

Tech Debt by Quality Factor

Distribution of technical debt: transferability 40%, changeability 30%, robustness 18%, security 7%, performance the remaining 5%.

• 70% of technical debt is in IT cost (transferability, changeability).
• 30% of technical debt is in business risk (robustness, performance, security).
• Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software.

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: a business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort, the Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3


3. Beware Sampling Effects

[Chart: lines of code per person-year for baselines from 1980-1982 (y-axis 0-2,500), annotated "Lots of small projects" and "Lots of large projects"]

4. Understand Variation

Product factors: timing constraints, memory utilization, CPU occupancy, resource constraints, complexity, product size

Project factors: concurrent development, development computer, requirements definition, requirements churn, programming practices, personnel experience, client experience, client participation

Variance in productivity explained: project factors 49%, product factors 16%, not explained 35% (R²project = .49, R²product = .16, R²model = .65)

J. Vosburgh, B. Curtis, et al. (1984). Proceedings of ICSE 1984.

5. Investigate Distributions

[Chart: relative productivity (% of mean, -100% to 250%) by software engineering practices usage: low, medium, high]

6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution for External Data Sets

[Chart: productivity by size, DACS data set vs. us]

Caveats for external data sets: data definitions, data collection, data validity, data age, demographics

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Measuring Project Progress

[Chart: planned rate of code growth vs. actual code growth, 0K-300K lines, Jan-Jun, with completion of coding marked]

"The measures say we are on schedule."

"But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (total, open, closed), 0-150, Jan-Jun, with completion of coding marked]

"And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements affected (added, changed, deleted) and person-hours of work or rework, by month from Feb through Aug]

Phase Defect Removal Model

PHASE         Escapes from      Defect injection   Subtotal   Removal         Defects removed   Escapes at phase
              previous /KSLOC   /KSLOC             /KSLOC     effectiveness   /KSLOC            exit /KSLOC
Rqmts         0                 1.2                1.2        0%              0                 1.2
Design        1.2               18.0               19.2       76%             14.6              4.6
Code & UT     4.6               15.4               20.0       71%             14.2              5.8
Integration   5.8               0                  5.8        67%             3.9               1.9
Sys Test      1.9               0                  1.9        58%             1.1               0.8
Operation     0.8

Kan (1995). Metrics and Models in Software Quality Engineering.
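The phase model above is simple bookkeeping: each phase's escapes carry into the next, and removal effectiveness determines what leaks out. A minimal sketch, using the slide's example injection rates and effectiveness values as inputs:

```python
# Sketch of Kan-style phase defect removal arithmetic. The phase names,
# injection rates (defects/KSLOC), and removal-effectiveness values are
# the slide's example; the function itself is generic.
def run_phase_model(phases):
    """Propagate defect escapes through development phases.

    Each phase is (name, injected_per_ksloc, removal_effectiveness).
    Returns a list of (name, subtotal, removed, escapes_out) per KSLOC.
    """
    rows, escapes = [], 0.0
    for name, injected, effectiveness in phases:
        subtotal = escapes + injected      # carried-over escapes plus new injections
        removed = subtotal * effectiveness # defects caught in this phase
        escapes = subtotal - removed       # defects leaking to the next phase
        rows.append((name, subtotal, removed, escapes))
    return rows

example = [
    ("Requirements", 1.2, 0.00),
    ("Design",       18.0, 0.76),
    ("Code & UT",    15.4, 0.71),
    ("Integration",  0.0,  0.67),
    ("System Test",  0.0,  0.58),
]
for name, subtotal, removed, out in run_phase_model(example):
    print(f"{name:<12} subtotal {subtotal:5.1f}  removed {removed:5.1f}  escapes {out:4.1f}")
```

Running it reproduces the table: 19.2 defects/KSLOC present at design, 0.8 escaping system test into operation.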

SCRUM Board

• Scrum board displays the path to completion for each story by showing task status
• Story points are an estimate of days for each story
• Agile estimating is by feature (story) rather than by task
• 'Done' does not always mean all tasks are finished
• Kanban methods restrict the work in progress to a limited number of paths

Burndown Chart and Velocity

• Burndown chart displays story completion by day
• Burndown chart indicates actual versus estimated progress
• Velocity is the rate at which stories are completed
• Velocity indicates sustainable progress
• Velocity results provide one source of historical data for improving estimates
• Story points without the context of productivity factors are dangerous for estimating
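The velocity idea above reduces to a small calculation; a hedged sketch with assumed sprint data (the function names and the three-sprint window are illustrative, not from the slides):

```python
# Illustrative velocity calculation and completion forecast from story
# points completed in past sprints. All data values are assumptions.
def velocity(completed_points, window=3):
    """Average story points completed over the last `window` sprints."""
    recent = completed_points[-window:]
    return sum(recent) / len(recent)

def sprints_to_finish(backlog_points, completed_points):
    """Forecast remaining sprints from recent velocity, rounded up."""
    v = velocity(completed_points)
    return int(-(-backlog_points // v))  # ceiling division

history = [21, 25, 23, 30, 28]   # points completed in the last five sprints
print(velocity(history))          # recent average velocity
print(sprints_to_finish(120, history))
```

As the last bullet warns, such a forecast is only as good as the stability of the team and the comparability of the story points behind it.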

Analyzing Velocity

[Charts over a 20-day iteration: team velocity in story points per hour; hours per story point (item cycle times); and a burndown chart tracking stories, hours, and hours per story point]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in-progress, and released work over time; the horizontal gap between bands shows days to delivery, with outliers marked]

Anderson (2010). Kanban.
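Days to delivery and outliers can be read off the flow data directly; a minimal sketch, assuming work items recorded as start/finish days (the data and the outlier threshold are assumptions, not from Anderson):

```python
# Sketch of Kanban-style flow measures: cycle times from start/done days,
# plus a simple statistical outlier flag. Data values are assumptions.
from statistics import mean, pstdev

def cycle_times(items):
    """items: list of (start_day, done_day); returns days to delivery."""
    return [done - start for start, done in items]

def outliers(times, k=1.5):
    """Flag items more than k standard deviations above the mean."""
    m, s = mean(times), pstdev(times)
    return [t for t in times if t > m + k * s]

work = [(1, 4), (2, 5), (2, 6), (3, 6), (4, 20)]
print(cycle_times(work))              # [3, 3, 4, 3, 16]
print(outliers(cycle_times(work)))    # the stuck item stands out
```

In practice the outliers are the interesting part: each one is a work item that got blocked, and the blockage is what Kanban asks you to investigate.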

Trends Over Sprints

[Charts across 12 sprints: story points delivered per sprint; technical debt per sprint]

• Track and correlate multiple measures across sprints
• Predictive models

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: current process, time recording, defect recording, defect type standards
• PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning
• PSP 1: size estimating, test report
• PSP 1.1: task planning, schedule planning

Personal Quality Management
• PSP 2: code reviews, design reviews
• PSP 2.1: design templates

Cyclic Personal Process
• PSP 3: cyclic development

Humphrey (1997). Introduction to the Personal Software Process.

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000). Introduction to the Team Software Process.

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

R²s for aggregated individual estimates range from .70 to .87, versus R² = .53 for the single team estimate.
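There is a statistical reason to expect this: independent individual errors partially cancel when summed, while a single scaled guess carries its full error. An illustrative simulation (the distributions, team size, and seed are assumptions, not Humphrey's data):

```python
# Illustrative simulation: summing independent per-person estimates yields
# a tighter team total than scaling one person's estimate, because the
# individual errors partially cancel. All parameters are assumptions.
import random
random.seed(7)

def team_total_by_individuals(n):
    """Each of n people estimates their own task (true mean 10, sd 3)."""
    return sum(random.gauss(10, 3) for _ in range(n))

def team_total_by_single_guess(n):
    """One averaged per-task guess, scaled up to the whole team."""
    return n * random.gauss(10, 3)

trials, true_total = 2000, 90
a = [team_total_by_individuals(9) for _ in range(trials)]
b = [team_total_by_single_guess(9) for _ in range(trials)]
spread = lambda xs: (sum((x - true_total) ** 2 for x in xs) / len(xs)) ** 0.5
print(spread(a), spread(b))  # aggregated estimates scatter far less
```

The spread of the aggregated totals is roughly a third of the scaled single guess here (sd 3·√9 vs. 3·9), which is the cancellation effect the slide's R² comparison reflects empirically.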

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 1.4%. Consequently, test changes from defect detection into correctness verification.

Fagan (2005).

TSP Benefits

Dramatic reductions in:
• Delivered defects
• Days per KLOC
• Schedule deviation
• Test defects/KLOC
• Variation in results

Humphrey (2006). TSP: Leading a Development Team.

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

• Level 5, Optimizing: innovation management
• Level 4, Quantitatively Managed: capability management
• Level 3, Defined: process management
• Level 2, Managed: project management
• Level 1, Initial: inconsistent management

Chrissis, Konrad & Shrum (2005). CMMI.

Enhanced Maturity Framework

• Level 5, Innovating: innovation management
• Level 4, Optimized: capability management
• Level 3, Standardized: organizational management
• Level 2, Stabilized: work unit management
• Level 1, Initial: inconsistent management

Maturity Level Transformation

• Level 5: proactive improvements (agile culture); opportunistic improvements (empowered culture)
• Level 4: process and results managed statistically (precision culture)
• Level 3: organization develops standard processes (common culture)
• Level 2: project managers stabilize projects (local culture)
• Level 1: ad hoc, inconsistent development practices (relationship culture)

Change spreads from individuals, to projects, to the organization, building trust at each level.

Measurement by Level

• Level 5 - Exploratory data: Δt = N × ROI
• Level 4 - Predictive data: X̄ ± 1.96σ/√n
• Level 3 - Process data: 8.3 defects per review
• Level 2 - Management data: Effort = α·LOC^β
• Level 1 - Haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
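The Level 2 and Level 4 example measures are both one-liners to compute; a sketch with assumed project data (the function names and data are illustrative):

```python
# Sketch of two measures named above: a log-log least-squares fit for the
# Level 2 model Effort = a * LOC**b, and the Level 4 confidence interval
# X-bar +/- 1.96*sigma/sqrt(n). All data values are assumptions.
from math import exp, log, sqrt

def fit_power_law(locs, efforts):
    """Least-squares fit of effort = a * loc**b, done in log-log space."""
    xs, ys = [log(x) for x in locs], [log(y) for y in efforts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = exp(my - b * mx)
    return a, b

def ci95(values):
    """95% interval for the mean: X-bar +/- 1.96 * sigma / sqrt(n)."""
    n = len(values)
    m = sum(values) / n
    var = sum((v - m) ** 2 for v in values) / n
    half = 1.96 * sqrt(var) / sqrt(n)
    return m - half, m + half

locs = [1000, 2000, 4000]
a, b = fit_power_law(locs, [3 * x ** 0.9 for x in locs])  # synthetic data
print(round(a, 3), round(b, 3))
```

The point of the slide stands either way: fitting β is pointless at Level 1, where "about a million lines" is the best size datum available.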

Average Improvement Results

Improvement benefit           # Orgs   Annual median   Annual range
Productivity growth             4          35%            9-67%
Pre-test defect detection       3          22%            6-25%
Time to market                  2         ↓19%           15-23%
Field error reports             5         ↓39%           10-94%
Return on investment            5         5.0:1          4:1-8.8:1

Schedule Performance

[Chart: schedule performance index (0.2-1.6, behind vs. ahead) by CMM maturity level 1-3]

Schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled

Air Force study of contractors. Flowe & Thordahl (1994).

Cost Performance

[Chart: cost performance index (0.5-2.5, overrun vs. underrun) by CMM maturity level 1-3]

Cost performance index = budgeted cost of work performed / actual cost of work performed

Air Force study of contractors. Flowe & Thordahl (1994).
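The two indices in these charts are the standard earned-value ratios, which can be stated directly in code (the example values are assumptions):

```python
# Standard earned-value indices used in the two charts above.
def spi(bcwp, bcws):
    """Schedule Performance Index: budgeted cost of work performed over
    budgeted cost of work scheduled (> 1 means ahead of schedule)."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost Performance Index: budgeted cost of work performed over
    actual cost of work performed (> 1 means under budget)."""
    return bcwp / acwp

# Assumed example: earned 90 of a planned 100 units of work, at an
# actual cost of 120 units: behind schedule and over budget.
print(spi(90, 100))  # 0.9
print(cpi(90, 120))  # 0.75
```

In the Flowe & Thordahl data, both ratios move toward and above 1.0 as CMM maturity level increases.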

Crosby's Cost of Quality Model

Project cost divides into performance and cost of quality; cost of quality divides into conformance (prevention, appraisal) and nonconformance.

• Performance: generating plans and documentation; development (requirements, design, code, integration)
• Appraisal: reviews (system, requirements, design, test plan, test scripts), walkthroughs, first-time testing, first-time independent verification and validation, audits
• Prevention: training, policies, procedures and methods, tools, planning, quality improvement projects, data gathering and analysis, root cause analysis, quality reporting
• Nonconformance: fixing defects, reworking any document, updating source code, re-reviews, re-tests, lab costs for re-tests, patches (internal and external), engineering changes, change control boards, external failures and fines, customer support, help desks

Dion (1993). Crosby (1979). Quality Is Free.

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   Level   Perform   Nonconf   Appraise + Prevent
1988     1       34%       41%       15% + 7%
1990     2       55%       18%       15% + 12%
1992     3       66%       11%       23% combined
1994     4       76%        6%       18% combined

Dion (1993). Haley (1995).

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline (y-axis 0-4), 1988-1994]

Lydon (1995).

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline (y-axis 0-1.2), 1988-1994]

Lydon (1995).

Raytheon's Cost Predictability

[Chart: actual cost vs. budgeted cost by quarter, 1988-1994, with over-run/under-run (y-axis -10% to 50%) narrowing toward zero over time]

Haley (1995).

OMRON's Reuse Gains

[Chart: program size in steps (0-3,000) and % of reuse (0-100%), 1987-1994]

Sakamoto et al. (1996).

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality
2. Compliance focus undermines CMMI's primary assumption
3. Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI
4. The assumptions underlying control charts are violated by software data

Six Sigma's Challenge

Process focus means process improvement: Six Sigma. Product focus means product improvement: Design for 6σ.

Six Sigma projects must have significant benefits: first huge benefits, then large, good, okay, and small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Product focus supplements process focus to unlock even more value from software.

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

Jackson, D. (2009); Spinellis, D. (2006)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding & architectural rules, plus application metadata.

Quality measurements: transferability, changeability, robustness, performance, security.

Detected violations (examples):
• Performance: expensive operation in loop, static vs. pooled connections, complex query on big table, large indices on big table
• Robustness: empty CATCH block, uncontrolled data access, poor memory management, opened resource not closed
• Security: SQL injection, cross-site scripting, buffer overflow, uncontrolled format string
• Transferability: unstructured code, misuse of inheritance, lack of comments, violated naming convention
• Changeability: highly coupled component, duplicated code, index modified in loop, high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, Javascript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, plus a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

Level 1, Unit (single language/technology layer, developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2, Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function points, effort estimation, data access control, SDK versioning, calibration across technologies.

The stack spans user-interface technologies (JSP, ASP.NET, APIs), frameworks (Hibernate, Spring, Struts, .NET), languages (Java, COBOL, EJB, PL/SQL, T-SQL), messaging, and databases (Oracle, SQL Server, DB2, Sybase, IMS); architectural compliance and data flow must be checked for transaction risk across all of them.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct. They account for only 8% of total application defects (code unit-level violations are the other 92%), yet 52% of total repair effort (vs. 48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated: it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path: from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation, and the Transaction Risk Index can be further tuned using the operational frequency of each transaction. In the slide's example of five transactions, transaction 1 poses the greatest risk.
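The slide defines the index only as a ranking by number and severity of violations along the path, optionally frequency-weighted; a hedged sketch of one way to realize that (the scoring function, data, and weights are assumptions, not CAST's actual algorithm):

```python
# Illustrative Transaction Risk Index: sum violation severities along each
# transaction's path, scaled by operational frequency, then rank.
# The scoring scheme and all data values are assumptions.
def transaction_risk(violation_severities, frequency=1.0):
    """Sum of severities on the path, weighted by execution frequency."""
    return frequency * sum(violation_severities)

transactions = {
    "t1": ([9, 7, 7], 0.50),   # severities on path, relative frequency
    "t2": ([5, 4],    0.30),
    "t3": ([8],       0.20),
}
ranked = sorted(transactions,
                key=lambda t: transaction_risk(*transactions[t]),
                reverse=True)
print(ranked)  # highest-risk transaction first
```

Any monotone combination of count, severity, and frequency yields a usable ranking; what matters for remediation is that the ordering is stable and explainable.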

Risk-Based Prioritization

Violations are ranked by health factor, severity, and Propagated Risk Index, with additional weight when a violation lies in a high-risk transaction and for the frequency of path execution:

Violation   Health factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects
• Static analysis before testing is 85% efficient
• With testing and static analysis combined, test schedules are reduced by 50%

John Keane (2013).

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter plot: production incidents (0-180) vs. Technical Quality Index (1.00-4.00) for business-critical applications at a large international investment bank; annotated losses fall from $2,000,000+ annual loss at low quality to $50,000 and $5,000 annual loss at higher quality]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor                # apps   Robust   Security   Perform   Change   Transfer
Technology (Java-EE)   1606      20%       3%        16%       26%       7%
Industry                677       5%       5%         7%        4%       6%
App type                510       1%       1%         1%        1%       1%
Source                  580       1%       1%         4%
Shore                   540
Maturity                 72      28%      25%        15%       24%      12%
Method                  208      10%       6%        14%        6%
Team size               186       7%       5%         8%        8%
# of users              262       4%       3%         5%        4%

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code are the debt, which carries principal and interest plus business risk in the form of liability and opportunity cost.

• Principal: the cost of fixing problems remaining in the code after release that must be remediated
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e. higher maintenance costs, greater resource usage, etc.
• Liability: business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Today's talk focuses on the principal.

Curtis et al. (2012). IEEE Software.
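The principal estimate in Curtis et al. (2012) has the shape of a simple product-sum over violations; a hedged sketch of that shape (the counts, fix fractions, fix times, and labor rate below are all assumptions for illustration, not the published calibration):

```python
# Sketch following the shape of the technical-debt principal estimate:
# for each severity band, count of violations x fraction expected to be
# fixed x average hours per fix x labor rate. All numbers are assumptions.
def debt_principal(violations_by_severity, fix_fraction, hours_per_fix, rate):
    """Estimated cost to remediate the violations remaining at release."""
    total = 0.0
    for severity, count in violations_by_severity.items():
        total += (count
                  * fix_fraction[severity]    # share that will actually be fixed
                  * hours_per_fix[severity]   # average effort per fix
                  * rate)                     # burdened cost per hour
    return total

violations = {"high": 120, "medium": 540, "low": 1900}   # assumed counts
fraction   = {"high": 0.5, "medium": 0.25, "low": 0.1}
hours      = {"high": 2.5, "medium": 1.0,  "low": 0.5}
print(debt_principal(violations, fraction, hours, rate=75))
```

The structure makes the levers explicit: debt shrinks either by removing violations or by deciding, severity band by severity band, that fewer of them must be fixed.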

Tech Debt by Quality Factor

Transferability 40%, changeability 30%, robustness 18%, security 7%, performance the remaining 5%.

• 70% of technical debt is in IT cost (transferability, changeability)
• 30% of technical debt is in business risk (robustness, performance, security)
• Health factor proportions are mostly consistent across technologies

Curtis et al. (2012). IEEE Software.

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.
Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.
Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.
Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.
Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.
Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.
Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.
Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.
Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.
Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.
Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.
Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.
Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.
Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.
George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.
Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.
Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.
Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).
Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.
Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.
Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.
Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.
Peterman, W. (2000). IEEE Software, July/August issue.
Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.
Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.
Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.
Spinellis, D. (2006). Code Quality. Addison-Wesley.
Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

Page 23: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style 4 Understand Variation

23

Product factors bull timing constraints bull memory utilization bull CPU occupancy bull resource constraints bull complexity bull product size

Project factors bull concurrent development bull development computer bull requirements definition bull requirements churn bull programming practices bull personnel experience bull client experience bull client participation

Project 49

Product 16

Not explained

35

R2product = 16

R2project = 49

R2

model = 65

J Vosburgh B Curtis et al (1984) Proceedings of ICSE 1994

Click to edit Master title style 5 Investigate Distributions

Low Medium High

Rel

ativ

e p

rod

uct

ivit

y (

of m

ean

)

Software Engineering Practices Usage

-100

-50

0

50

100

150

200

250

Click to edit Master title style 6 Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws

Click to edit Master title style 7 Caution for External Data Sets

26

DACS Us

Productivity

Siz

e Data definitions

Data collection

Data validity

Data age

Demographics

Click to edit Master title style

1) Tracking traditional projects

2) Iterative and Agile measurement

3) Team Software Process measurement

33

Section 2 Measuring to Manage

Software Projects

Click to edit Master title style Measuring Project Progress

300K 250K 200K 150K 100K 50K 0K

Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoThe measures say we are on schedulerdquo

300K

250K

200K

150K

100K

50K

0K Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoBut the code keeps growing We should have measured the requirements growthrdquo

50

100

150

Defect reports

Total

Open

Closed

125

75

25

Jan Feb Mar Apr May Jun

Completion of coding

ldquoAnd all the defect reports remaining to be fixed will keep us far behind schedulerdquo

Requirements Change Impact

Number of requirements affected, and person-hours of work or rework:

Month | Added | Changed | Deleted
Feb | 2 | 20 | 1
Mar | 2 | 11 | 1
Apr | 45 | 35 | 3
May | 15 | 61 | 1
Jun | 1 | 20 | 4
Jul | 15 | 15 | 0
Aug | 18 | 05 | 2

Phase Defect Removal Model

PHASE | Escapes from previous phase /KSLOC | Defects injected /KSLOC | Subtotal /KSLOC | Removal effectiveness | Defects removed /KSLOC | Escapes at phase exit /KSLOC
Rqmts | 0 | 1.2 | 1.2 | 0% | 0 | 1.2
Design | 1.2 | 18.0 | 19.2 | 76% | 14.6 | 4.6
Code & UT | 4.6 | 15.4 | 20.0 | 71% | 14.2 | 5.8
Integration | 5.8 | 0 | 5.8 | 67% | 3.9 | 1.9
Sys Test | 1.9 | 0 | 1.9 | 58% | 1.1 | 0.8
Operation | 0.8

Kan (1995) Metrics and Models in Software Quality Engineering
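The phase model above is a simple escape calculation and can be run directly; the sketch below reproduces the table's arithmetic (injected defects per KSLOC and removal effectiveness come from the slide's table; the function and variable names are illustrative).

```python
# Phase defect removal model: defects carried forward = (escapes in + injected) * (1 - effectiveness).
# Per-phase (name, defects injected per KSLOC, removal effectiveness) from the slide's table.
phases = [
    ("Rqmts",       1.2,  0.00),
    ("Design",     18.0,  0.76),
    ("Code & UT",  15.4,  0.71),
    ("Integration", 0.0,  0.67),
    ("Sys Test",    0.0,  0.58),
]

def escapes(phases):
    carried = 0.0
    rows = []
    for name, injected, effectiveness in phases:
        subtotal = carried + injected          # escapes from previous phase + newly injected
        removed = subtotal * effectiveness     # defects caught in this phase
        carried = subtotal - removed           # escapes at phase exit
        rows.append((name, round(subtotal, 1), round(removed, 1), round(carried, 1)))
    return rows, round(carried, 1)

rows, escaped = escapes(phases)
print(escaped)  # defects/KSLOC escaping into operation: 0.8, matching the table
```

Running the model reproduces the table's final column, including the 0.8 defects/KSLOC that escape into operation.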

SCRUM Board

A Scrum board displays the path to completion for each story by showing task status

Story points are an estimate of days for each story

Agile estimating is by feature (story) rather than by task

'Done' does not always mean all tasks are finished

Kanban methods restrict the work in progress to a limited number of paths

Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating
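As a minimal sketch of how velocity feeds estimation (the sprint numbers are invented for illustration): averaging completed story points per sprint gives the sustainable rate, and dividing the remaining backlog by that rate forecasts the sprints left.

```python
import math

# Story points completed in each past sprint (illustrative numbers, not from the slides).
sprint_points = [21, 25, 19, 24]

# Velocity: average sustainable completion rate per sprint.
velocity = sum(sprint_points) / len(sprint_points)

def sprints_remaining(backlog_points, velocity):
    # Round up: a partial sprint still occupies a full sprint on the calendar.
    return math.ceil(backlog_points / velocity)

print(velocity)                            # 22.25
print(sprints_remaining(130, velocity))    # 6
```

As the slide cautions, this forecast is only as good as the context: a change in team composition or story-point calibration invalidates the historical velocity.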

Team Velocity

[Charts over days 1-20: Team Velocity (story points per hour, 0-5), Analyzing Velocity (stories, hours, hours per story point, 0-20), Item Cycle Times, and a Burndown Chart (0-60)]

Measuring and Managing Flow

[Cumulative Flow Diagram: backlogged, in progress, and released work over time; days to delivery shown with outliers]

Anderson (2010) Kanban
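One common reading of a cumulative flow diagram uses Little's law: average days to delivery is roughly work in progress divided by daily throughput. A tiny sketch with illustrative numbers:

```python
# Little's law applied to a cumulative flow diagram:
# average time in system = work in progress / throughput.
def avg_days_to_delivery(wip_items, throughput_per_day):
    # wip_items: vertical gap between "backlogged" and "released" bands
    # throughput_per_day: slope of the "released" band
    return wip_items / throughput_per_day

print(avg_days_to_delivery(30, 2.5))  # 12.0 days
```

This is why Kanban's WIP limits shorten delivery times: holding throughput constant, less work in progress means fewer days to delivery.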

Trends Over Sprints

[Charts across sprints 1-12: story points delivered (0-80) and technical debt (0-4)]

Track and correlate multiple measures across sprints

Predictive models

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
PSP 0: current process, time recording, defect recording, defect type standards
PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning
PSP 1: size estimating, test report
PSP 1.1: task planning, schedule planning

Personal Quality Management
PSP 2: code reviews, design reviews
PSP 2.1: design templates

Cyclic Personal Process
PSP 3: cyclic development

Humphrey (1997) Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000) Introduction to the Team Software Process

Aggregated Personal Data Best

R²s range from .70 to .87

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

R² = .53

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Fagan (2005)

Personal reviews: 33
Compile: 33
Team reviews: 19
Unit test: 15
Test: 14

TSP Benefits

Dramatic reductions in:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006) TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods

2) Maturity-based measurement

3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing: innovation management
Level 4 Quantitatively Managed: capability management
Level 3 Defined: process management
Level 2 Managed: project management
Level 1 Initial: inconsistent management

Chrissis, Konrad & Shrum (2005) CMMI

Enhanced Maturity Framework

Level 5 Innovating: innovation management
Level 4 Optimized: capability management
Level 3 Standardized: organizational management
Level 2 Stabilized: work unit management
Level 1 Initial: inconsistent management

Maturity Level Transformation

Level 5: proactive, opportunistic improvements (Agile, empowered culture)
Level 4: process and results managed statistically (precision culture)
Level 3: organization develops standard processes (common culture)
Level 2: project managers stabilize projects (local culture)
Level 1: ad hoc, inconsistent development practices (relationship culture)

[Diagram spans the Individual, Projects, and Organization levels, built on a foundation of trust]

Measurement by Level

Level 5: exploratory data (Δt = N × ROI)
Level 4: predictive data (X̄ ± 1.96σ/√n)
Level 3: process data (8.3 defects per review)
Level 2: management data (Effort = α·LOC^β)
Level 1: haphazard data ("about a million lines")

Measure-maturity mismatch slows improvement and creates dysfunction
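The Level 4 "predictive data" entry is a standard 95% confidence interval. A sketch of that formula, assuming σ is estimated from the sample itself (the review data are illustrative):

```python
import math

# 95% confidence interval for a mean: mean ± 1.96 * sigma / sqrt(n),
# the Level 4 "predictive data" formula on the slide.
def confidence_interval_95(values):
    n = len(values)
    mean = sum(values) / n
    sigma = math.sqrt(sum((v - mean) ** 2 for v in values) / n)  # population std dev
    half_width = 1.96 * sigma / math.sqrt(n)
    return mean - half_width, mean + half_width

# Illustrative sample: defects found in eight reviews.
low, high = confidence_interval_95([8, 9, 7, 10, 8, 9, 8, 9])
print(round(low, 1), round(high, 1))  # 7.9 9.1
```

The point of the slide stands: an organization can only use intervals like this meaningfully once its process data are stable enough (Level 4) for the statistical assumptions to hold.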

Average Improvement Results

Improvement benefit | Orgs | Annual median | Annual range
Productivity growth | 4 | 35% | 9-67%
Pre-test defect detection | 3 | 22% | 6-25%
Time to market | 2 | ↓19% | 15-23%
Field error reports | 5 | ↓39% | 10-94%
Return on investment | 5 | 5.0:1 | 4.1-8.8:1

Schedule Performance

[Chart: schedule performance index (0 to 1.6) by CMM maturity level 1-3; index = budgeted cost of work performed / budgeted cost of work scheduled; below 1.0 is behind schedule, above is ahead]

Air Force study of contractors

Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (0 to 2.5) by CMM maturity level 1-3; index = budgeted cost of work performed / actual cost of work performed; below 1.0 is an overrun, above is an underrun]

Air Force study of contractors

Flowe & Thordahl (1994)
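The two charts use standard earned-value indices. A minimal sketch of both formulas (the input values are illustrative):

```python
# Earned-value indices from the Air Force study charts.
def spi(bcwp, bcws):
    # Schedule Performance Index = budgeted cost of work performed /
    # budgeted cost of work scheduled. < 1.0 behind schedule, > 1.0 ahead.
    return bcwp / bcws

def cpi(bcwp, acwp):
    # Cost Performance Index = budgeted cost of work performed /
    # actual cost of work performed. < 1.0 overrun, > 1.0 underrun.
    return bcwp / acwp

print(spi(80, 100))  # 0.8 -> behind schedule
print(cpi(80, 160))  # 0.5 -> spending twice the budget for the work done
```

The study's finding was that both indices converge toward 1.0 (and their variance shrinks) as CMM maturity rises.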

Crosby's Cost of Quality Model

Project cost = performance cost + cost of quality; cost of quality = conformance (prevention + appraisal) + nonconformance

Performance: generating plans and documentation; development (requirements, design, code, integration)

Prevention: training, policies, procedures and methods, tools, planning, quality improvement projects, data gathering and analysis, root cause analysis, quality reporting

Appraisal: reviews (system, requirements, design, test plan, test scripts), walkthroughs, first-time testing, first-time independent verification and validation, audits

Nonconformance: fixing defects, reworking any document, updating source code, re-reviews, re-tests, lab costs for re-tests, patches (internal and external), engineering changes, change control boards, external failures and fines, customer support, help desks

Dion (1993); Crosby (1979) Quality Is Free

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Dion (1993); Haley (1995)

Year | Level | Perform | Nonconf | Appraise | Prevent
1988 | 1 | 34 | 41 | 15 | 7
1990 | 2 | 55 | 18 | 15 | 12
1992 | 3 | 66 | 11 | 23 (appraise + prevent)
1994 | 4 | 76 | 6 | 18 (appraise + prevent; appraise 7.5)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988-1994 (scale 0-4)]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988-1994 (scale 0-1.2)]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual vs. budgeted cost by quarter, 1988-1994; over/under-run narrowing from -10% to 50% toward zero]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0-3,000) and % of reuse (0-100%), 1987-1994]

Sakamoto et al (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

What About the Product?

CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality

2. Compliance focus undermines CMMI's primary assumption

3. Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI

4. The assumptions underlying control charts are violated by software data

Six Sigma's Challenge

Process focus → process improvement → Six Sigma
Product focus → product improvement → Design for 6σ

Product focus supplements process focus to unlock even more value from software

Six Sigma projects must have significant benefits: huge benefits, large benefits, good benefits, okay benefits, small benefits… now what?

Ultimately we run out of projects with significant enough benefits to continue the program… so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded… testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

Jackson D (2009); Spinellis D (2006)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic… non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability… The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

CAST's Application Intelligence Platform

Application Analysis

Evaluation of 1,200+ coding & architectural rules

Application meta-data

Transferability

Changeability

Robustness

Performance

Security

Quality Measurements

Detected Violations

Expensive operation in loop Static vs pooled connections Complex query on big table Large indices on big table

Empty CATCH block Uncontrolled data access Poor memory management Opened resource not closed

SQL injection Cross-site scripting Buffer overflow Uncontrolled format string

Unstructured code Misuse of inheritance Lack of comments Violated naming convention

Highly coupled component Duplicated code Index modified in loop High cyclomatic complexity

Language Parsers

Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP Netweaver, Tibco, Business Objects, Universal Analyzer for other languages

Modern Apps are a Technology Stack

[Diagram: data flow and transaction risk propagate across a stack of technologies (EJB, PL/SQL, Oracle, SQL Server, DB2, T-SQL, Hibernate, Spring, Struts, .NET, COBOL, Sybase, IMS, messaging, JSP, ASP.NET, APIs, Web Services, Java), assessed for architectural compliance]

Level 1: Unit (developer level). A single language/technology layer: code style & layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2: Technology (development team level). Intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3: System (IT organization level). Integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function points, effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers

Primary cause of operational problems; 20x as many fixes to correct

Architecturally complex defects: 8% of total app defects, 52% of total repair effort
Code unit-level violations: 92% of total app defects, 48% of total repair effort

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced

Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated in the system: it presents low risk.

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system

Remediating violation 284 in Component B will have the greater impact on reducing risk and cost
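A hedged sketch of ranking violations this way; the index formula (severity weighted by number of impacted components) and the field names are illustrative assumptions, not CAST's actual computation:

```python
# Rank violations by a simple propagated-risk score:
# severity x breadth of impact across the system.
violations = [
    {"id": 342, "component": "A", "severity": 7, "impacted_components": 2},
    {"id": 284, "component": "B", "severity": 7, "impacted_components": 40},
]

def propagated_risk_index(violation):
    # Same severity can yield very different risk depending on propagation.
    return violation["severity"] * violation["impacted_components"]

ranked = sorted(violations, key=propagated_risk_index, reverse=True)
print(ranked[0]["id"])  # 284 -> remediate first: widest propagated impact
```

With equal severities, the widely-propagated violation 284 tops the list, which is exactly the slide's prioritization argument.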

Transaction Risk

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path. Transactions can be ranked by risk to prioritize their remediation. The Transaction Risk Index can be further tuned using the operational frequency of each transaction.

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path

[Diagram: transactions 1-5 enter and exit in the user interface layer, are processed in the business logic layer, and are fulfilled in the data storage layer; transaction 1 poses the greatest risk]

Risk-Based Prioritization

Violation | Health Factor | Severity
148 | Security | 9
371 | Robustness | 9
062 | Performance | 8
284 | Robustness | 7
016 | Changeability | 7
241 | Security | 5
173 | Transferability | 4
359 | Transferability | 3

Each violation is also weighted by its Propagated Risk Index, whether it lies in a high-risk transaction, and the frequency of path execution

Static Analysis in US DoD

John Keane (2013)

Testing finds less than 1/3 of structural defects

Static analysis before testing is 85% efficient

With testing and static analysis combined, test schedules are reduced by 50%

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

A Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%

Reducing Operational Losses

[Scatter plot: production incidents (0-180) vs. Technical Quality Index (100-400); annotated applications range from over $2,000,000 in annual losses at low TQI down to $5,000 at high TQI]

Large international investment bank; business-critical applications

Factors Affecting Structural Quality

2017 CRASH Report. Request from info@castsoftware.com

Percentage of variation in structural quality scores accounted for by various factors:

Factor (# apps): Robust / Security / Perform / Change / Transfer
Technology (1606): 20 / 3 / 16 / 26 / 7 (Java-EE)
Industry (677): 5 / 5 / 7 / 4 / 6
App type (510): 1 / 1 / 1 / 1 / 1
Source (580): 1 / 1 / 4
Shore (540): –
Maturity (72): 28 / 25 / 15 / 24 / 12
Method (208): 10 / 6 / 14 / 6
Team size (186): 7 / 5 / 8 / 8
# of users (262): 4 / 3 / 5 / 4

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership

Structural quality problems in production code create the principal borrowed, interest on the debt, and business risk (liability from the debt, opportunity cost)

Principal: cost of fixing problems remaining in the code after release that must be remediated

Interest: continuing IT costs attributable to the violations causing technical debt, i.e. higher maintenance costs, greater resource usage, etc.

Liability: business costs related to outages, breaches, corrupted data, etc.

Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Today's talk focuses on the principal

Curtis et al (2012) IEEE Software
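A sketch of a technical-debt principal estimate in the spirit of Curtis, Sappidi & Szynkarski (2012): sum, over the violations remaining at release, of the fraction expected to be fixed times the hours per fix times the labor rate. All of the weights and counts below are illustrative assumptions, not the paper's calibrated values.

```python
# Technical-debt principal: remediation cost of violations left in the code at release.
def tech_debt_principal(counts, fix_fraction, hours_per_fix, rate):
    # counts: violations remaining at release, by severity bucket
    # fix_fraction: share of each bucket the organization will actually fix
    # hours_per_fix: estimated remediation effort per violation
    # rate: fully burdened labor cost per hour
    return sum(counts[s] * fix_fraction[s] * hours_per_fix[s] * rate
               for s in counts)

counts        = {"high": 100, "medium": 400, "low": 1500}   # illustrative
fix_fraction  = {"high": 0.5, "medium": 0.25, "low": 0.1}   # illustrative
hours_per_fix = {"high": 2.0, "medium": 1.0, "low": 0.5}    # illustrative
rate = 75  # dollars per hour, illustrative

print(tech_debt_principal(counts, fix_fraction, hours_per_fix, rate))  # 20625.0
```

Note this estimates only the principal; the interest, liability, and opportunity cost defined above would require operational and business data beyond the violation counts.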

Tech Debt by Quality Factor

Transferability: 40%
Changeability: 30%
Robustness: 18%
Security: 7%

70% of technical debt is in IT cost (transferability, changeability)

30% of technical debt is in business risk (robustness, performance, security)

Health factor proportions are mostly consistent across technologies

Curtis et al (2012) IEEE Software

Join CISQ: www.it-cisq.org

Section: References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations Through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI: Benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development Using Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.


PHASE

Escapes Previous Phase

KSLOC

Defect Injection

KSLOC

Subtotal

KSLOC

Removal

Effective-ness

Defect Removal

KSLOC

Escapes at phase exit KSLOC

Rqmts

0

12

12

0

0

12

Design

12

180

192

76

146

46

Code amp UT

46

154

200

71

142

58

Inte-gration

58

0

58

67

39

19

Sys Test

19

0

19

58

11

08

Opera-tion

08

Requirements Added Requirements Changed Requirements Deleted
Feb 2 20 1
Mar 2 11 1
Apr 45 35 3
May 15 61 1
Jun 1 20 4
Jul 15 15 0
Aug 18 05 2
Feb Feb Feb
Mar Mar Mar
Apr Apr Apr
May May May
Jun Jun Jun
Jul Jul Jul
Aug Aug Aug
Page 24: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style 5 Investigate Distributions

Low Medium High

Rel

ativ

e p

rod

uct

ivit

y (

of m

ean

)

Software Engineering Practices Usage

-100

-50

0

50

100

150

200

250

Click to edit Master title style 6 Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws

Click to edit Master title style 7 Caution for External Data Sets

26

DACS Us

Productivity

Siz

e Data definitions

Data collection

Data validity

Data age

Demographics

Click to edit Master title style

1) Tracking traditional projects

2) Iterative and Agile measurement

3) Team Software Process measurement

33

Section 2 Measuring to Manage

Software Projects

Click to edit Master title style Measuring Project Progress

300K 250K 200K 150K 100K 50K 0K

Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoThe measures say we are on schedulerdquo

300K

250K

200K

150K

100K

50K

0K Jan Feb Mar Apr May Jun

Planned rate of code Growth

Actual code growth

Completion of coding

ldquoBut the code keeps growing We should have measured the requirements growthrdquo

50

100

150

Defect reports

Total

Open

Closed

125

75

25

Jan Feb Mar Apr May Jun

Completion of coding

ldquoAnd all the defect reports remaining to be fixed will keep us far behind schedulerdquo

Click to edit Master title style Requirements Change Impact

Number of requirements

affected

Chart6

Requirements Added
Requirements Changed
Requirements Deleted
Person hours of work or rework
2
20
1
2
11
1
45
35
3
15
61
1
1
20
4
15
15
0
18
05
2

Sheet1

Sheet1

Requirements Added
Requirements Changed
Requirements Deleted
Person hours of work or rework

Sheet2

Sheet3

Click to edit Master title style Phase Defect Removal Model

PHASE

Escapes Previous Phase KSLOC

Defect Injection KSLOC

Subtotal KSLOC

Removal Effective-ness

Defect Removal KSLOC

Escapes at phase exit KSLOC

Rqmts 0 12 12 0 0 12

Design 12 180 192 76 146 46

Code amp UT 46 154 200 71 142 58

Inte-gration 58 0 58 67 39 19

Sys Test 19 0 19 58 11 08

Opera-tion 08

Kan (1999) Metrics and Model in Software Quality Engineering

Click to edit Master title style SCRUM Board Scrumboard displays

the path to completion for each story by showing task status

Story points are an estimate of days for each story

Agile estimating is by feature (story) rather than by task

lsquoDonersquo does not always mean all tasks are finished

Kanban methods restrict the work in progress to a limited number of paths

Click to edit Master title style Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating

Click to edit Master title style Team Velocity

0

1

2

3

4

5

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20Days

Story Points per Hour

0

1

2

3

4

5

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

HrsStryPt

Analyzing Velocity

Item Cycle Times

02468

101214161820

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Stories Hours HrsStryPt

0

10

20

30

40

50

60

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Burndown Chart

Days

Click to edit Master title style Measuring and Managing Flow

Released

In progress

Backlogged

Cumulative Flow Diagram Days to delivery

outliers

Anderson (2010) Kanban

Click to edit Master title style

0

10

20

30

40

50

60

70

80

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Sto

ry p

oin

ts d

eliv

ered

Trends Over Sprints

0

10

20

30

40

50

60

70

80

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Sto

ry p

oin

ts d

eliv

ered

0

05

1

15

2

25

3

35

4

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Tech

nic

al d

ebt

Track and correlate multiple measures

across sprints

Predictive models

Click to edit Master title style Sources and Measures

Stories submitted

Stories pulled back

Growth in size of stories

Stories added mid-sprint

Stories under review

Failed tests

Non-development tasks

Assists to other projects

Data sources

Measures Project tracking

Code control

Bug tracking

Build system

Test environment

Deployment tools

Operational monitoring

Click to edit Master title style Recommended Books

Click to edit Master title style Personal Software Process (PSP)

PSP 0 Current process Time recording

Defect recording Defect type standards

PSP 01 Coding standard

Size measurement Process improvement

proposal

Baseline Personal Process

Humphrey (1997) Introduction to the Personal Software Process

PSP 1 Size estimating

Test report

Personal Planning

PSP 2 Code reviews

Design reviews

Personal Quality Management

PSP 21 Cyclic development

Cyclic Personal Process

PSP 11 Task planning

Schedule planning

PSP 21 Design templates

Click to edit Master title style Team Software Process (TSP) bull Built from personal processes of team members

bull Well defined team roles

bull Project launch workshops for planning

bull Team measurement and tracking

bull Post-mortems for improvement

bull Team application of lean principles

Humphrey (2000) Introduction to the Team Software Process

Click to edit Master title style Aggregated Personal Data Best

R2s range from 70 to 87

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates rather than developing a single team estimate averaged over individuals

R2 = 53

Click to edit Master title style Defect Detection at Intuit

Consequently test changes from defect

detection into correctness verification

PSPTSP shifts defect detection from the test phase back up into development

Fagan (2005)

Personal reviews

33

Compile 33

Team reviews 19

Unit test 15

Test 14

Click to edit Master title style

Dramatic reductions

ndash Delivered defects

ndash Days per KLOC

ndash Schedule deviation

ndash Test defectskloc

ndash Variation in results

Humphrey (2006) TSP Leading a Development Team

TSP Benefits

Click to edit Master title style Section 3 Section 3 Measuring Software

Improvement Programs

1) Improvement program methods

2) Maturity-based measurement

3) Improvement program results

61

Click to edit Master title style CMMI Maturity Transitions

Level 5 Optimizing

Innovation management

Level 4 Quantitatively

Managed

Capability management

Level 3 Defined

Process management

Level 2 Managed

Project management

Level 1 Initial

Inconsistent management

Chrisis Konrad amp Shrum (2005) CMMI

Enhanced Maturity Framework

Level 5 Innovating: Innovation management
Level 4 Optimized: Capability management
Level 3 Standardized: Organizational management
Level 2 Stabilized: Work unit management
Level 1 Initial: Inconsistent management

Maturity Level Transformation

Level 5: Proactive, opportunistic improvements (empowered, agile culture)
Level 4: Process and results managed statistically (precision culture)
Level 3: Organization develops standard processes (common culture)
Level 2: Project managers stabilize projects (local culture)
Level 1: Ad hoc, inconsistent development practices (relationship culture built on trust)

Improvement progresses from individuals, through projects, to the whole organization.

Measurement by Level

Level 5 (exploratory data): ∆t = N × ROI
Level 4 (predictive data): X̄ ± 1.96σ/√n
Level 3 (process data): 8.3 defects per review
Level 2 (management data): Effort = α·LOC^β
Level 1 (haphazard data): "about a million lines"

A measure–maturity mismatch slows improvement and creates dysfunction.
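The Level 2 and Level 4 entries above can be made concrete in a few lines. This is a minimal sketch: the values of α, β, σ, and n below are illustrative assumptions, not calibrated figures from any organization.

```python
import math

def estimate_effort(kloc, alpha=3.0, beta=1.12):
    """Level-2 management data: parametric model Effort = alpha * size^beta.
    alpha and beta must be calibrated from an organization's own project
    history (cf. Boehm's COCOMO II); the defaults here are placeholders."""
    return alpha * kloc ** beta

def confidence_interval(mean, sigma, n):
    """Level-4 predictive data: 95% interval  X +/- 1.96 * sigma / sqrt(n)."""
    half = 1.96 * sigma / math.sqrt(n)
    return mean - half, mean + half

effort = estimate_effort(100)               # person-months for a 100 KLOC project
lo, hi = confidence_interval(8.3, 2.0, 25)  # e.g. defects per review, 25 reviews
```

Note the power law: doubling size more than doubles effort when β > 1, which is why size segmentation matters when building baselines.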

Average Improvement Results

Improvement benefit          Orgs   Annual median   Annual range
Productivity growth            4        35%            9–67%
Pre-test defect detection      3        22%            6–25%
Time to market                 2       ↓19%           15–23%
Field error reports            5       ↓39%           10–94%
Return on investment           5       5.0:1        4.1:1–8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed ÷ budgeted cost of work scheduled, scale 0–1.6) by CMM maturity level 1–3; values below 1.0 mean behind schedule, above 1.0 ahead]

Air Force study of contractors. Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (budgeted cost of work performed ÷ actual cost of work performed, scale 0–2.5) by CMM maturity level 1–3; values below 1.0 mean a cost overrun, above 1.0 an underrun]

Air Force study of contractors. Flowe & Thordahl (1994)
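Both Air Force charts are standard earned-value ratios. A minimal sketch, with hypothetical project figures:

```python
def spi(bcwp, bcws):
    """Schedule performance index: budgeted cost of work performed
    divided by budgeted cost of work scheduled.  < 1.0 = behind schedule."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost performance index: budgeted cost of work performed
    divided by actual cost of work performed.  < 1.0 = cost overrun."""
    return bcwp / acwp

# Hypothetical project: 400K of planned work completed against a 500K plan,
# at an actual cost of 450K.
schedule_index = spi(400, 500)   # 0.8  -> behind schedule
cost_index = cpi(400, 450)       # ~0.89 -> over budget
```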

Crosby's Cost of Quality Model

Project cost = performance cost + cost of quality.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Cost of quality = conformance (prevention + appraisal) + nonconformance.

Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits.

Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   Level   Perform.   Nonconf.   Appraisal   Prevention
1988     1       34%        41%        15%          7%
1990     2       55%        18%        15%         12%
1992     3       66%        11%
1994     4       76%         6%        7.5%

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline (scale 0–4x), 1988–1994]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline (scale 0–1.2), 1988–1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: percent cost over-run/under-run of actual versus budgeted cost, by quarter, 1988–1994 (scale −10% to 50%); variance narrows steadily over time]

Haley (1995)

OMRON's Reuse Gains

[Chart: reuse steps (0–3,000) and percentage of reuse (0–100%), 1987–1994]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." – Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus: process improvement through Six Sigma. Product focus: product improvement through Design for Six Sigma (6σ). A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first come huge benefits, then large, good, okay, and finally small benefits. Now what? Ultimately we run out of projects with benefits significant enough to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded… testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." (Jackson, D., 2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic… non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability… The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." (Spinellis, D., 2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality measurements (health factors): Transferability, Changeability, Robustness, Performance, Security.

Detected violations, grouped by health factor:
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, plus a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

A transaction flows across many technologies (e.g., JSP, ASP.NET, Java, Web services, Hibernate, Spring, Struts, .NET, EJB, COBOL, messaging, PL/SQL, T-SQL, Oracle, SQL Server, DB2, Sybase, IMS), so data flow, transaction risk, and architectural compliance must be analyzed end to end.

Level 1, Unit (developer level): code style and layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2, Technology (development team level; single language/technology layer): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function points, effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

An architecturally complex defect is a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Take 20x as many fixes to correct
• 8% of total application defects (the other 92% are code unit-level violations)
• 52% of total repair effort (48% for unit-level violations)

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated: it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path, so transactions can be ranked by risk to prioritize their remediation; in the example, transaction 1 poses the greatest risk. The Transaction Risk Index can be further tuned using the operational frequency of each transaction.

Risk-Based Prioritization

Violations are ordered by health factor severity, then by Propagated Risk Index, by whether they lie in a high-risk transaction, and by the frequency of path execution:

Violation   Health factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3
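A hedged sketch of how such a prioritization could combine the table's ingredients. The violation attributes and scoring weights below are invented for illustration; they are not CAST's actual formula.

```python
# Hypothetical violation records: (id, health_factor, severity 1-9,
# breadth of propagation 0-1, in_high_risk_transaction, path_frequency 0-1).
violations = [
    ("148", "Security",        9, 0.9, True,  0.8),
    ("371", "Robustness",      9, 0.6, True,  0.5),
    ("284", "Robustness",      7, 0.9, False, 0.2),
    ("359", "Transferability", 3, 0.1, False, 0.1),
]

def risk_score(v):
    _id, _factor, severity, breadth, in_hot_txn, freq = v
    score = severity * breadth       # propagated risk: severity x breadth of impact
    if in_hot_txn:
        score *= 1.5                 # extra weight on high-risk transactions
    return score * (0.5 + freq / 2)  # temper by how often the path executes

ranked = sorted(violations, key=risk_score, reverse=True)
order = [v[0] for v in ranked]       # most urgent remediation first
```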

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0–180) versus Technical Quality Index (100–400) for business-critical applications at a large international investment bank; annotated annual losses fall from over $2,000,000 at low quality scores, through $50,000, to $5,000 at high scores]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors:

Factor                # Apps   Robust.   Security   Perform.   Change.   Transfer.
Technology (Java EE)   1606      20%        3%        16%        26%        7%
Industry                677       5%        5%         7%         4%        6%
App type                510       1%        1%         1%         1%        1%
Source                  580       1%        1%         4%
Shore                   540
Maturity                 72      28%       25%        15%        24%       12%
Method                  208      10%        6%        14%         6%
Team size               186       7%        5%         8%         8%
# of users              262       4%        3%         5%         4%

2017 CRASH Report; request a copy from info@castsoftware.com

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release, a component of the cost of ownership. Structural quality problems in production code constitute the debt.

• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Liability (business risk): business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
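A simplified sketch of a principal estimate in the spirit of Curtis et al. (2012): only a fraction of detected violations will realistically be fixed, each fix takes time, and developer time has a cost. The counts, fix fractions, hours, and rate below are illustrative assumptions, not the paper's calibrated parameters.

```python
# Assumed share of violations at each severity that will actually be remediated.
FIX_FRACTION = {"high": 0.5, "medium": 0.25, "low": 0.1}

def technical_debt_principal(violation_counts, hours_per_fix=1.0, rate_per_hour=75.0):
    """Principal = sum over severities of (violations to fix x time x labor cost)."""
    return sum(count * FIX_FRACTION[severity] * hours_per_fix * rate_per_hour
               for severity, count in violation_counts.items())

# Hypothetical application: 200 high-, 1000 medium-, 5000 low-severity violations.
debt = technical_debt_principal({"high": 200, "medium": 1000, "low": 5000})
```

Varying the fix fractions and rates gives conservative versus aggressive debt estimates, which is how such figures are usually reported as a range.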

Tech Debt by Quality Factor

Transferability: 40%; Changeability: 30%; Robustness: 18%; Security: 7%.

About 70% of technical debt is in IT cost (transferability, changeability) and 30% in business risk (robustness, performance, security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.
Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.
Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.
Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.
Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.
Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.
Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.
Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.
Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.
Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.
Dyne, K. (1999). ESSI: Benefits from Continuous Software Improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.
Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.
Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.
Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.
George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.
Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.
Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.
Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).
Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.
Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.
Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.
Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.
Peterman, W. (2000). IEEE Software, July/August issue.
Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.
Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.
Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.
Spinellis, D. (2006). Code Quality. Addison-Wesley.
Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

• Best Practices for Software Process and Product Measurement
• Instructor: Dr. Bill Curtis
• Agenda
• Section 1
• Why Does Measurement Languish?
• Measurement Process Guidance
• Measurement Categories
• Productivity Analysis Objectives
• Productivity Analysis Measures
• Software Productivity
• Software Size Measures
• How Many Lines of Code?
• Function Point Estimation
• Effort: the Weakest Measure
• How Quality Affects Productivity
• Carry-forward Rework
• Example of Quality Impact
• Quality-Adjusted Productivity
• Best Practices for Analysis
• 1. Evaluate Demographics
• 2. Segment Baselines
• 3. Beware Sampling Effects
• 4. Understand Variation
• 5. Investigate Distributions
• 6. Account for Maturity Effects
• 7. Caution for External Data Sets
• Section 2
• Measuring Project Progress
• Requirements Change Impact
• Phase Defect Removal Model
• SCRUM Board
• Burndown Chart and Velocity
• Analyzing Velocity
• Measuring and Managing Flow
• Trends Over Sprints
• Sources and Measures
• Recommended Books
• Personal Software Process (PSP)
• Team Software Process (TSP)
• Aggregated Personal Data Best
• Defect Detection at Intuit
• TSP Benefits
• Section 3
• CMMI Maturity Transitions
• Enhanced Maturity Framework
• Maturity Level Transformation
• Measurement by Level
• Average Improvement Results
• Schedule Performance
• Cost Performance
• Crosby's Cost of Quality Model
• Raytheon's Cost of Quality
• Raytheon's Productivity
• Raytheon's Cost Reduction
• Raytheon's Cost Predictability
• OMRON's Reuse Gains
• Section 4
• What about the Product?
• Six Sigma's Challenge
• Testing is Not Enough
• CAST's Application Intelligence Platform
• Modern Apps are a Technology Stack
• Architecturally Complex Defects
• Propagated Risk
• Transaction Risk
• Risk-Based Prioritization
• Static Analysis in US DoD
• Reducing Operational Incidents & Costs
• Reducing Operational Losses
• The Technical Debt Metaphor
• Tech Debt by Quality Factor
• Join CISQ: www.it-cisq.org
• References 1
• References 2
• References 3


6. Account for Maturity Effects

Which organization will be required to spend more effort on correcting software flaws?

7. Caution for External Data Sets

[Chart: productivity versus size, comparing the external DACS data set with our own data]

Before using an external baseline, check its data definitions, data collection, data validity, data age, and demographics.

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Measuring Project Progress

[Chart: planned rate of code growth versus actual code growth, 0–300K lines, Jan–Jun, with the planned completion of coding marked]

"The measures say we are on schedule."

"But the code keeps growing. We should have measured the requirements growth."

[Chart: total, open, and closed defect reports, 0–150, Jan–Jun]

"And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

Number of requirements affected, and person-hours of work or rework, by month:

Month   Added   Changed   Deleted
Feb       2       20         1
Mar       2       11         1
Apr      45       35         3
May      15       61         1
Jun       1       20         4
Jul      15       15         0
Aug      18        5         2

Phase Defect Removal Model

Defects per KSLOC:

Phase         Escapes from    Defect      Subtotal   Removal          Defects    Escapes at
              previous phase  injection              effectiveness    removed    phase exit
Rqmts              0             1.2        1.2          0%              0          1.2
Design             1.2          18.0       19.2         76%            14.6         4.6
Code & UT          4.6          15.4       20.0         71%            14.2         5.8
Integration        5.8           0          5.8         67%             3.9         1.9
Sys. Test          1.9           0          1.9         58%             1.1         0.8
Operation          0.8

Kan (1995), Metrics and Models in Software Quality Engineering
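The phase model above is just repeated arithmetic: each phase receives the prior phase's escapes plus its own injected defects, removes a fraction, and passes the rest on. A minimal sketch using the slide's per-KSLOC figures:

```python
# (phase, defects injected per KSLOC, removal effectiveness)
phases = [
    ("Requirements", 1.2, 0.00),
    ("Design",      18.0, 0.76),
    ("Code & UT",   15.4, 0.71),
    ("Integration",  0.0, 0.67),
    ("System Test",  0.0, 0.58),
]

escapes = 0.0
for name, injected, effectiveness in phases:
    subtotal = escapes + injected        # escapes from prior phase + new defects
    removed = subtotal * effectiveness   # defects caught in this phase
    escapes = subtotal - removed         # defects leaking to the next phase

operation_defects = round(escapes, 1)    # defects/KSLOC reaching operation
```

Raising the design and code review effectiveness is the highest-leverage change, since early escapes compound through every later phase.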

SCRUM Board

• The Scrum board displays the path to completion for each story by showing task status.
• Story points are an estimate of days for each story.
• Agile estimating is by feature (story) rather than by task.
• 'Done' does not always mean all tasks are finished.
• Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

• The burndown chart displays story completion by day, indicating actual versus estimated progress.
• Velocity is the rate at which stories are completed; it indicates sustainable progress.
• Velocity results provide one source of historical data for improving estimates.
• Story points without the context of productivity factors are dangerous for estimating.
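A hedged sketch of the basic velocity arithmetic: average story points completed per sprint, and a naive forecast of sprints remaining. All figures are invented for illustration, and the last bullet's caveat applies: such a forecast is only as good as the stability of the team and its productivity factors.

```python
import math

completed = [18, 22, 17, 25, 21]           # story points per past sprint (hypothetical)

velocity = sum(completed) / len(completed)  # rolling average velocity
backlog = 84                                # story points still open
sprints_left = math.ceil(backlog / velocity)
```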

Analyzing Velocity

[Charts over a 20-day iteration: team velocity (story points per hour), hours per story point, item cycle times, and a burndown chart tracking stories, hours, and hours per story point]

Measuring and Managing Flow

A cumulative flow diagram plots backlogged, in-progress, and released items over time; the horizontal gap between the curves shows days to delivery, and outliers stand out.

Anderson (2010), Kanban
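One quantity a stable cumulative flow diagram exposes is average time in process, which Little's law ties to work in progress and throughput. The figures below are illustrative assumptions, and the law only holds for a reasonably stable flow:

```python
def average_cycle_time(avg_wip, throughput_per_day):
    """Little's law: average time in process = average WIP / average throughput."""
    return avg_wip / throughput_per_day

# Hypothetical team: 12 items in progress, 1.5 items released per day.
days_to_delivery = average_cycle_time(12, 1.5)   # average days from start to release
```

This is why Kanban's WIP limits shorten delivery times: with throughput fixed, less work in progress means fewer days to delivery.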

Trends Over Sprints

[Charts: story points delivered per sprint (0–80) and technical debt per sprint (0–4) across 12 sprints]

Track and correlate multiple measures across sprints to build predictive models.

Sources and Measures

Measures: stories submitted; stories pulled back; growth in size of stories; stories added mid-sprint; stories under review; failed tests; non-development tasks; assists to other projects.

Data sources: project tracking; code control; bug tracking; build system; test environment; deployment tools; operational monitoring.

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process:
PSP 0: Current process, time recording, defect recording, defect type standards
PSP 0.1: Coding standard, size measurement, process improvement proposal

Personal Planning:
PSP 1: Size estimating, test report
PSP 1.1: Task planning, schedule planning

Humphrey (1997), Introduction to the Personal Software Process

PSP 2 Code reviews

Design reviews

Personal Quality Management

PSP 21 Cyclic development

Cyclic Personal Process

PSP 11 Task planning

Schedule planning

PSP 21 Design templates

Click to edit Master title style Team Software Process (TSP) bull Built from personal processes of team members

bull Well defined team roles

bull Project launch workshops for planning

bull Team measurement and tracking

bull Post-mortems for improvement

bull Team application of lean principles

Humphrey (2000) Introduction to the Team Software Process

Click to edit Master title style Aggregated Personal Data Best

R2s range from 70 to 87

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates rather than developing a single team estimate averaged over individuals

R2 = 53

Click to edit Master title style Defect Detection at Intuit

Consequently test changes from defect

detection into correctness verification

PSPTSP shifts defect detection from the test phase back up into development

Fagan (2005)

Personal reviews

33

Compile 33

Team reviews 19

Unit test 15

Test 14

Click to edit Master title style

Dramatic reductions

ndash Delivered defects

ndash Days per KLOC

ndash Schedule deviation

ndash Test defectskloc

ndash Variation in results

Humphrey (2006) TSP Leading a Development Team

TSP Benefits

Click to edit Master title style Section 3 Section 3 Measuring Software

Improvement Programs

1) Improvement program methods

2) Maturity-based measurement

3) Improvement program results

61

Click to edit Master title style CMMI Maturity Transitions

Level 5 Optimizing

Innovation management

Level 4 Quantitatively

Managed

Capability management

Level 3 Defined

Process management

Level 2 Managed

Project management

Level 1 Initial

Inconsistent management

Chrisis Konrad amp Shrum (2005) CMMI

Click to edit Master title style Enhanced Maturity Framework

Level 5 Innovating

Innovation management

Level 4 Optimized

Capability management

Level 3 Standardized

Organizational management

Level 2 Stabilized

Work unit management

Level 1 Initial

Inconsistent management

Click to edit Master title style Maturity Level Transformation

Level 5 Opportunistic improvements

Empowered culture

Level 5 Proactive

improvements Agile culture

Organization

Projects

Individual

Level 4 Process and results

managed statistically Precision culture

Level 3 Organization develops

standard processes Common culture

Level 2 Project managers stabilize projects

Local culture

Level 1 Ad Hoc inconsistent

development practices Relationship culture

Trust

Click to edit Master title style Measurement by Level

Level 5 mdash Exploratory data ∆ t = N x ROI

Level 4 mdash Predictive data X plusmn (196σ radicn )

Level 3 mdash Process data 83 defects per review

Level 2 mdash Management data Effort = α LOCβ

Level 1 mdash Haphazard data ldquoabout a million linesrdquo

Measure-maturity mismatch slows improvement and creates

dysfunction

Click to edit Master title style Average Improvement Results

Productivity growth 4 35 9 - 67

Pre-test defect detection 3 22 6 - 25

Time to market 2 darr 19 15 - 23

Field error reports 5 darr 39 10 - 94

Return on investment 5 501 41 - 881

Annual Annual Improvement benefit Orgs median range

Click to edit Master title style Schedule Performance

0

02

04

06

08

1

12

14

16

1 2 3

Schedule performance

index

CMM maturity level

Budgeted cost of work performed

Budgeted cost of work scheduled

behind ahead

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Cost Performance

Cost performance

index

0

05

1

15

2

25

1 2 3 CMM Maturity Level

Budgeted cost of work performed Actual cost of

work performed

overrun underrun

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Crosbyrsquos Cost of Quality Model Project Cost

Performance Cost of Quality Generating plans documentation Development - requirements - design - code - integration Prevention Appraisal

Reviews - system - requirements - design - test plan - test scripts Walkthroughs Testing (first time) Independent Verification and Validation (first time) Audits

Training Policies Procedures and methods Tools Planning Quality Improvement Projects Data Gathering and Analysis Root Cause Analysis Quality Reporting

Nonconformance Conformance Fixing Defects Reworking any Document Updating Source Code Re-reviews Re-tests Lab costs for re-tests Patches(Internal and External) Engineering Changes Change Control Boards External Failures and fines Customer Support Help Desks

Dion (1993) Crosby (1979) Quality Is Free

Click to edit Master title style Raytheons Cost of Quality

bull Performance mdash cost of building it right first time bull Nonconformance mdash cost of rework bull Appraisal mdash cost of testing bull Prevention mdash cost of preventing nonconformance

Dion (1993) Haley (1995)

23

18

Year Level Perform Nonconf Appraise Prevent

1988 1 34 41 15 7

1990 2 55 18 15 12

1992 3 66 11

1994 4 76 6 75

Click to edit Master title style Raytheonrsquos Productivity

0

1

2

3

4

1988 1989 1990 1991 1992 1993 1994

Growth relative to 1989 baseline

Lydon (1995)

Click to edit Master title style Raytheonrsquos Cost Reduction

0

02

04

06

08

1

12

1988 1989 1990 1991 1992 1993 1994

Reduction relative to 1989 baseline

Lydon (1995)

Click to edit Master title style

Haley (1995)

1993 1992 1991 1990 1989 1988 1994

50

40

30

20

10

0

-10

Actual cost

Budgeted cost

over-run under-run

4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1

Raytheonrsquos Cost Predictability

Click to edit Master title style OMRONrsquos Reuse gains

Sakamoto et al (1996)

0

5 0 0

1 0 0 0

1 5 0 0

2 0 0 0

2 5 0 0

3 0 0 0

1 9 8 7 1 9 8 8 1 9 8 9 1 9 9 0 1 9 9 1 1 9 9 2 1 9 9 3 1 9 9 4

Steps

100

75

50

25

0

of reuse

Click to edit Master title style Section 4 Section 4 Measuring Quality and

Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

89

Click to edit Master title style What about the Product CMMIrsquos primary assumption

ldquoThe quality of a software system is governed by quality of the process used to develop itrdquo

- Watts Humphrey

CMMIrsquos product quality problem

1 Assessments do not verify product quality

2 Compliance focus undermines CMMIrsquos primary assumption

3 Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI

4 The assumptions underlying control charts are violated by software data

Click to edit Master title style Six Sigmarsquos Challenge

Process focus ndash process improvement ndash Six Sigma Product focus ndash product improvement ndash Design for 6σ

Product focus supplements product focus to unlock even more value from software

Six Sigma projects must have significant benefits Huge benefits

Large benefits

Good benefits

Okay benefits

Small benefits

Now what

Ultimately we run out of projects with significant enough benefits to continue the programhellip

so what is the solution that allows continual improvement

Testing is Not Enough

"As higher levels of assurance are demanded...testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

Jackson, D. (2009); Spinellis, D. (2006)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic...non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability...The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules; application meta-data.

Quality Measurements (Health Factors): Transferability, Changeability, Robustness, Performance, Security.

Detected Violations (examples):

Performance: expensive operation in loop, static vs. pooled connections, complex query on big table, large indices on big table.

Robustness: empty CATCH block, uncontrolled data access, poor memory management, opened resource not closed.

Security: SQL injection, cross-site scripting, buffer overflow, uncontrolled format string.

Transferability: unstructured code, misuse of inheritance, lack of comments, violated naming convention.

Changeability: highly coupled component, duplicated code, index modified in loop, high cyclomatic complexity.

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, Netweaver, Tibco, Business Objects, plus a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: transactions flow through a stack of technologies (JSP, ASP.NET, APIs, Java, web services, EJB, Hibernate, Spring, Struts, .NET, COBOL, PL/SQL, T-SQL, messaging; databases such as Oracle, SQL Server, DB2, Sybase, and IMS), with architectural compliance, data flow, and transaction risk cutting across the layers.]

Level 1, Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2, Technology (development team level): single language/technology layer; intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function points, effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct.

Architecturally complex defects account for 8% of total app defects but 52% of total repair effort; code unit-level violations account for 92% of defects but 48% of repair effort.

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk.

The impact of violation 342 in Component A is only narrowly propagated in the system: it presents low risk.

Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
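The idea can be sketched as severity weighted by how far a violation's impact propagates through the dependency graph. A minimal illustration in Python; the graph, the scoring formula, and the function names are hypothetical, not CAST's actual algorithm:

```python
from collections import deque

def impact_set(deps, component):
    """All components that directly or transitively depend on `component`."""
    # deps maps each component to the components that call it
    seen, queue = set(), deque(deps.get(component, []))
    while queue:
        c = queue.popleft()
        if c not in seen:
            seen.add(c)
            queue.extend(deps.get(c, []))
    return seen

def propagated_risk_index(severity, deps, location):
    """Hypothetical PRI: violation severity scaled by breadth of impact."""
    return severity * (1 + len(impact_set(deps, location)))

# Component A is called only by X; B is called by X, Y, and Z (and Y by W)
deps = {"A": ["X"], "B": ["X", "Y", "Z"], "Y": ["W"]}
low = propagated_risk_index(7, deps, "A")    # like violation 342 in A
high = propagated_risk_index(7, deps, "B")   # like violation 284 in B
assert high > low  # B's violation is more widely propagated -> higher risk
```

Two violations of equal severity thus rank differently once breadth of impact is counted, which is the slide's point about fixing Component B first.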

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions.

The risk of a transaction is determined by the number and severity of violations along its path: from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.

Transactions can be ranked by risk to prioritize their remediation; in the example, transaction 1 poses the greatest risk.

The Transaction Risk Index can be further tuned using the operational frequency of each transaction.
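Ranking transactions this way takes only a few lines. The scoring below (sum of violation severities along the path, weighted by operational frequency) is a hypothetical sketch, not the product's formula:

```python
def transaction_risk_index(path_violation_severities, frequency=1.0):
    """Hypothetical TRI: sum of the severities of violations along the
    transaction path, optionally weighted by operational frequency."""
    return frequency * sum(path_violation_severities)

# Severities met from UI entry, through business logic, to data storage
t1 = transaction_risk_index([9, 8, 7, 5])           # many severe violations
t2 = transaction_risk_index([3, 2], frequency=2.0)  # cleaner path, runs twice as often
assert t1 > t2  # transaction 1 poses the greatest risk
```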

Risk-Based Prioritization

Violations ranked by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor    Severity
148         Security         9
371         Robustness       9
062         Performance      8
284         Robustness       7
016         Changeability    7
241         Security         5
173         Transferability  4
359         Transferability  3

Static Analysis in US DoD

John Keane (2013):

Testing finds less than one-third of structural defects.

Static analysis before testing is 85% efficient.

With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator.

A Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

Large international investment bank; business-critical applications.

[Scatter chart: production incidents (0-180) against Technical Quality Index (1.00-4.00); annual losses per application of $5,000, $50,000, and over $2,000,000 are highlighted.]

Factors Affecting Structural Quality

2017 CRASH Report; request from info@castsoftware.com

Percentage of variation in structural quality scores accounted for by each factor (number of apps in parentheses; columns: Robustness, Security, Performance, Changeability, Transferability; blank cells were not reported):

- Technology (1,606; e.g. Java-EE): 20, 3, 16, 26, 7
- Industry (677): 5, 5, 7, 4, 6
- App type (510): 1, 1, 1, 1, 1
- Source (580): 1, 1, 4
- Shore (540): (blank)
- Maturity (72): 28, 25, 15, 24, 12
- Method (208): 10, 6, 14, 6
- Team size (186): 7, 5, 8, 8
- # of users (262): 4, 3, 5, 4

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Structural quality problems in production code create Technical Debt (principal borrowed, interest on the debt) and Business Risk (liability from debt, opportunity cost).

Principal: the cost of fixing problems remaining in the code after release that must be remediated.

Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Liability: business costs related to outages, breaches, corrupted data, etc.

Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
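The principal can be estimated as a sum over severity tiers of violation counts, the share expected to be fixed, and repair effort priced at a labor rate. A sketch in that spirit; every number below is illustrative, not the calibration published by Curtis et al. (2012):

```python
def technical_debt_principal(tiers, cost_per_hour=75.0):
    """Principal: sum over severity tiers of violations x share-to-fix x
    hours-per-fix, priced at a labor rate. All values are illustrative."""
    return sum(count * pct_to_fix * hours
               for count, pct_to_fix, hours in tiers) * cost_per_hour

tiers = [
    (500, 0.50, 2.5),   # high-severity violations (assumed values)
    (2000, 0.25, 1.0),  # medium severity (assumed)
    (8000, 0.10, 0.5),  # low severity (assumed)
]
principal = technical_debt_principal(tiers)
print(principal)  # 114375.0 -> roughly $114K of debt principal
```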

Tech Debt by Quality Factor

Transferability: 40%. Changeability: 30%. Robustness: 18%. Security: 7%.

70% of Technical Debt is in IT Cost (Transferability, Changeability).

30% of Technical Debt is in Business Risk (Robustness, Performance, Security).

Health Factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J., & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20(9), 43-57.

Boehm, B., et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M., & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J., & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29(6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10(4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M., & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Ebert, C., & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E., & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D., & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D., & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J., & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52(4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16(4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J., et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B., & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M., & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K., & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A., & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K., & Beedle, M. (2002). Agile Software Development Using Scrum.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S., & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort - Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • References 1
  • References 2
  • References 3


7. Caution for External Data Sets

[Chart: productivity vs. size, DACS data compared with our own.]

Cautions: data definitions, data collection, data validity, data age, demographics.

Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects

2) Iterative and Agile measurement

3) Team Software Process measurement

Measuring Project Progress

[Chart: planned rate of code growth vs. actual code growth (0K-300K), Jan-Jun, with completion of coding marked.] "The measures say we are on schedule."

[Chart: the same, with actual growth continuing past the planned completion of coding.] "But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (total, open, closed; 0-150), Jan-Jun.] "And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements affected and person-hours of work or rework from requirements added, changed, and deleted, by month.]

Month   Added   Changed   Deleted
Feb     2       20        1
Mar     2       11        1
Apr     45      35        3
May     15      61        1
Jun     1       20        4
Jul     15      15        0
Aug     18      05        2

Phase Defect Removal Model

PHASE        Escapes from     Defects        Subtotal   Removal         Defects      Escapes at
             previous phase   injected       (/KSLOC)   effectiveness   removed      phase exit
             (/KSLOC)         (/KSLOC)                                  (/KSLOC)     (/KSLOC)
Rqmts        0                1.2            1.2        0%              0            1.2
Design       1.2              18.0           19.2       76%             14.6         4.6
Code & UT    4.6              15.4           20.0       71%             14.2         5.8
Integration  5.8              0              5.8        67%             3.9          1.9
Sys Test     1.9              0              1.9        58%             1.1          0.8
Operation    0.8

Kan (1995), Metrics and Models in Software Quality Engineering
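The table's arithmetic propagates defects per KSLOC phase by phase: escapes from the previous phase plus newly injected defects, reduced by the phase's removal effectiveness. A small sketch reproducing the numbers:

```python
def phase_escapes(escapes_in, injected, effectiveness):
    """Defects/KSLOC escaping a phase: (escapes in + injected) minus the
    share caught at the phase's removal effectiveness."""
    subtotal = escapes_in + injected
    return subtotal - subtotal * effectiveness

# (defects injected /KSLOC, removal effectiveness) for Rqmts, Design,
# Code & UT, Integration, and Sys Test, as in the table
phases = [(1.2, 0.0), (18.0, 0.76), (15.4, 0.71), (0.0, 0.67), (0.0, 0.58)]
escapes = 0.0
for injected, effectiveness in phases:
    escapes = phase_escapes(escapes, injected, effectiveness)
print(round(escapes, 1))  # 0.8 defects/KSLOC escape into operation
```

Running the chain reproduces each phase-exit column of the table, ending with roughly 0.8 defects per KSLOC reaching operation.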

SCRUM Board

The scrum board displays the path to completion for each story by showing task status.

Story points are an estimate of days for each story.

Agile estimating is by feature (story) rather than by task.

'Done' does not always mean all tasks are finished.

Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

The burndown chart displays story completion by day and indicates actual versus estimated progress.

Velocity is the rate at which stories are completed; it indicates sustainable progress.

Velocity results provide one source of historical data for improving estimates.

Story points without the context of productivity factors are dangerous for estimating.
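Velocity and burndown are simple to compute from sprint history. A minimal sketch with hypothetical numbers:

```python
def velocity(points_per_sprint):
    """Mean story points completed per sprint: the team's velocity."""
    return sum(points_per_sprint) / len(points_per_sprint)

def burndown(total_points, points_per_sprint):
    """Points remaining after each sprint, as plotted on a burndown chart."""
    remaining, series = total_points, []
    for done in points_per_sprint:
        remaining -= done
        series.append(remaining)
    return series

completed = [12, 15, 11, 14]  # hypothetical sprint history
assert velocity(completed) == 13.0
assert burndown(60, completed) == [48, 33, 22, 8]
```

The velocity figure only becomes an estimating input when paired with the productivity factors behind it, per the caution above.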

Analyzing Team Velocity

[Charts over a 20-day iteration: story points per hour; hours per story point; item cycle times; stories, hours, and hours per story point together; and a burndown chart.]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in-progress, and released items over time; the horizontal distance between the curves shows days to delivery, with outliers flagged.]

Anderson (2010), Kanban
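A cumulative flow diagram plots, for each day, how many items have reached each state; the vertical gap between curves is work in progress and the horizontal gap approximates days to delivery. A sketch with hypothetical work items and state names:

```python
def cumulative_flow(items, days):
    """For each state, count the items that have reached it by each day.
    Bands between the curves are WIP; horizontal gaps approximate
    days to delivery. States here are hypothetical."""
    states = ["backlogged", "in_progress", "released"]
    return {
        state: [sum(1 for entered in items.values()
                    if entered.get(state, float("inf")) <= day)
                for day in days]
        for state in states
    }

items = {  # hypothetical work items: day each state was entered
    "A": {"backlogged": 1, "in_progress": 2, "released": 5},
    "B": {"backlogged": 1, "in_progress": 3, "released": 7},
    "C": {"backlogged": 2, "in_progress": 5},
}
cfd = cumulative_flow(items, days=range(1, 8))
assert cfd["released"] == [0, 0, 0, 0, 1, 1, 2]
# WIP on day 5: started but not yet released
assert cfd["in_progress"][4] - cfd["released"][4] == 2
```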

Trends Over Sprints

[Charts: story points delivered (0-80) and technical debt (0-4) across 12 sprints.]

Track and correlate multiple measures across sprints.

Predictive models.

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects.

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring.

Recommended Books

Personal Software Process (PSP)

PSP 0 - Baseline Personal Process: current process, time recording, defect recording, defect type standards.

PSP 0.1: coding standard, size measurement, process improvement proposal.

PSP 1 - Personal Planning: size estimating, test report.

PSP 1.1: task planning, schedule planning.

PSP 2 - Personal Quality Management: code reviews, design reviews.

PSP 2.1: design templates.

PSP 3 - Cyclic Personal Process: cyclic development.

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members

• Well-defined team roles

• Project launch workshops for planning

• Team measurement and tracking

• Post-mortems for improvement

• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

Combined individual estimates: R²s range from .70 to .87. Single team estimate: R² = .53.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%.

Consequently, test changes from defect detection into correctness verification.

Fagan (2005)

TSP Benefits

Dramatic reductions in:

- Delivered defects

- Days per KLOC

- Schedule deviation

- Test defects/KLOC

- Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods

2) Maturity-based measurement

3) Improvement program results

CMMI Maturity Transitions

Level 5, Optimizing: innovation management

Level 4, Quantitatively Managed: capability management

Level 3, Defined: process management

Level 2, Managed: project management

Level 1, Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5, Innovating: innovation management

Level 4, Optimized: capability management

Level 3, Standardized: organizational management

Level 2, Stabilized: work unit management

Level 1, Initial: inconsistent management

Maturity Level Transformation

Level 5: opportunistic improvements - empowered culture; proactive improvements - agile culture

Level 4: process and results managed statistically - precision culture

Level 3: organization develops standard processes - common culture

Level 2: project managers stabilize projects - local culture

Level 1: ad hoc, inconsistent development practices - relationship culture

(Scope widens from individuals to projects to the organization, and trust grows, as maturity increases.)

Measurement by Level

Level 5 - exploratory data: Δt = N x ROI

Level 4 - predictive data: X ± (1.96σ / √n)

Level 3 - process data: 8.3 defects per review

Level 2 - management data: Effort = α LOC^β

Level 1 - haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
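The Level 2 and Level 4 formulas on this slide can be computed directly. A sketch; the α and β calibration constants and the sample numbers are illustrative, not from any published model:

```python
import math

def effort_estimate(loc, alpha=3.2, beta=1.05):
    """Level 2 management data: Effort = alpha * LOC^beta.
    alpha and beta are illustrative calibration constants."""
    return alpha * loc ** beta

def interval_95(mean, sigma, n):
    """Level 4 predictive data: X +/- 1.96 * sigma / sqrt(n)."""
    half = 1.96 * sigma / math.sqrt(n)
    return mean - half, mean + half

lo, hi = interval_95(mean=8.3, sigma=2.0, n=16)  # e.g. defects per review
assert (round(lo, 2), round(hi, 2)) == (7.32, 9.28)
assert effort_estimate(1000) > effort_estimate(500)  # effort grows with size
```

The point of the ladder is that an organization needs stable Level 2/3 data collection before intervals like this one mean anything.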

Average Improvement Results

Improvement benefit           Orgs   Annual median   Annual range
Productivity growth           4      35%             9-67%
Pre-test defect detection     3      22%             6-25%
Time to market                2      ↓19%            15-23%
Field error reports           5      ↓39%            10-94%
Return on investment          5      5.0:1           4:1-8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed / budgeted cost of work scheduled, 0-1.6; below 1 = behind, above 1 = ahead) at CMM maturity levels 1-3.]

Air Force study of contractors. Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (budgeted cost of work performed / actual cost of work performed, 0-2.5; below 1 = overrun, above 1 = underrun) at CMM maturity levels 1-3.]

Air Force study of contractors. Flowe & Thordahl (1994)
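Both charts are built on earned-value indices: SPI = BCWP/BCWS and CPI = BCWP/ACWP, with values below 1.0 meaning behind schedule or over budget. A direct sketch with hypothetical figures:

```python
def spi(bcwp, bcws):
    """Schedule Performance Index: budgeted cost of work performed over
    budgeted cost of work scheduled (<1.0 means behind schedule)."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost Performance Index: budgeted cost of work performed over
    actual cost of work performed (<1.0 means cost overrun)."""
    return bcwp / acwp

# hypothetical project: $90K of planned work done, $100K scheduled, $120K spent
assert spi(90_000, 100_000) == 0.9   # behind schedule
assert cpi(90_000, 120_000) == 0.75  # overrunning cost
```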

Crosby's Cost of Quality Model

Project cost = performance + cost of quality (conformance + nonconformance).

Performance: generating plans and documentation; development (requirements, design, code, integration).

Conformance - Prevention: training; policies, procedures, and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Conformance - Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free

Raytheon's Cost of Quality

• Performance - cost of building it right the first time
• Nonconformance - cost of rework
• Appraisal - cost of testing
• Prevention - cost of preventing nonconformance

Year   Level   Performance   Nonconformance   Appraisal   Prevention
1988   1       34%           41%              15%         7%
1990   2       55%           18%              15%         12%
1992   3       66%           11%
1994   4       76%           6%

Dion (1993); Haley (1995)
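Crosby's split can be totaled directly: conformance = prevention + appraisal, and cost of quality = conformance + nonconformance. A sketch using the Raytheon percentages from the table above:

```python
def cost_of_quality(performance, prevention, appraisal, nonconformance):
    """Crosby-style shares of total project cost (percent). Conformance =
    prevention + appraisal; cost of quality = conformance + nonconformance."""
    conformance = prevention + appraisal
    return {
        "performance": performance,
        "conformance": conformance,
        "nonconformance": nonconformance,
        "cost_of_quality": conformance + nonconformance,
    }

# Raytheon 1988 (Level 1): 34% performance, 7% prevention, 15% appraisal, 41% rework
coq_1988 = cost_of_quality(34, 7, 15, 41)
# Raytheon 1990 (Level 2): 55% performance, 12% prevention, 15% appraisal, 18% rework
coq_1990 = cost_of_quality(55, 12, 15, 18)
assert coq_1988["cost_of_quality"] == 63
assert coq_1990["nonconformance"] < coq_1988["nonconformance"]  # rework fell
```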

Raytheon's Productivity

[Chart: growth relative to the 1989 baseline (scale 0-4), 1988-1994.]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: reduction relative to the 1989 baseline (scale 0-1.2), 1988-1994.]

Lydon (1995)

Click to edit Master title style

Haley (1995)

1993 1992 1991 1990 1989 1988 1994

50

40

30

20

10

0

-10

Actual cost

Budgeted cost

over-run under-run

4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1

Raytheonrsquos Cost Predictability

Click to edit Master title style OMRONrsquos Reuse gains

Sakamoto et al (1996)

0

5 0 0

1 0 0 0

1 5 0 0

2 0 0 0

2 5 0 0

3 0 0 0

1 9 8 7 1 9 8 8 1 9 8 9 1 9 9 0 1 9 9 1 1 9 9 2 1 9 9 3 1 9 9 4

Steps

100

75

50

25

0

of reuse

Click to edit Master title style Section 4 Section 4 Measuring Quality and

Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

89

Click to edit Master title style What about the Product CMMIrsquos primary assumption

ldquoThe quality of a software system is governed by quality of the process used to develop itrdquo

- Watts Humphrey

CMMIrsquos product quality problem

1 Assessments do not verify product quality

2 Compliance focus undermines CMMIrsquos primary assumption

3 Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI

4 The assumptions underlying control charts are violated by software data

Click to edit Master title style Six Sigmarsquos Challenge

Process focus ndash process improvement ndash Six Sigma Product focus ndash product improvement ndash Design for 6σ

Product focus supplements product focus to unlock even more value from software

Six Sigma projects must have significant benefits Huge benefits

Large benefits

Good benefits

Okay benefits

Small benefits

Now what

Ultimately we run out of projects with significant enough benefits to continue the programhellip

so what is the solution that allows continual improvement

Click to edit Master title style Testing is Not Enough ldquoAs higher levels of assurance are demandedhelliptesting cannot deliver the level of confidence required at a reasonable costrdquo

ldquoThe correctness of the code is rarely the weakest linkrdquo

Jackson D (2009) Spinellis D (2006)

ldquohellipa failure to satisfy a non-functional requirement can be critical even catastrophichellipnon-functional requirements are sometimes difficult to verify We cannot write a test case to verify a systemrsquos reliabilityhellipThe ability to associate code to non-functional properties can be a powerful weapon in a software engineerrsquos arsenalrdquo

Click to edit Master title style CASTrsquos Application Intelligence Platform

Application Analysis

Evaluation of 1200+ coding amp

architectural rules

Application meta-data

Transferability

Changeability

Robustness

Performance

Security

Quality Measurements

Detected Violations

Expensive operation in loop Static vs pooled connections Complex query on big table Large indices on big table

Empty CATCH block Uncontrolled data access Poor memory management Opened resource not closed

SQL injection Cross-site scripting Buffer overflow Uncontrolled format string

Unstructured code Misuse of inheritance Lack of comments Violated naming convention

Highly coupled component Duplicated code Index modified in loop High cyclomatic complexity

Language Parsers

Oracle PLSQL Sybase T-SQL SQL Server T-SQL IBM SQLPSM C C++ C Pro C Cobol CICS Visual Basic VBNet ASPNet Java J2EE JSP XML HTML Javascript VBScript PHP PowerBuilder Oracle Forms PeopleSoft SAP ABAP Netweaver Tibco Business Objects Universal Analyzer for other languages

Click to edit Master title style Modern Apps are a Technology Stack Ar

chite

ctur

al C

ompl

ianc

e

Data Flow Transaction Risk

EJB PLSQL

Oracle

SQL Server

DB2

TSQL

Hibernate

Spring

Struts NET

COBOL

Sybase IMS

Messaging

Integration quality Architectural

compliance Risk propagation Application security Resiliency checks Transaction integrity

Function point Effort estimation Data access control SDK versioning Calibration across

technologies IT organization level

System Level 3

Code style amp layout Expression complexity Code documentation Class or program design Basic coding standards Developer level

Unit Level 1

Java Java Java

Java Java

Web Services Java Java Single languagetechnology layer

Intra-technology architecture Intra-layer dependencies Inter-program invocation Security vulnerabilities Development team level

Technology Level 2

JSP ASPNET APIs

Click to edit Master title style Architecturally Complex Defects

Primary cause of operational

problems

20x as many

fixes to correct

Architecturally Complex Defect

A structural flaw involving interactions among multiple components that reside in different application layers

48

52 92

8 Architecturally

Complex Defects

Code unit-level violations

of total app defects

of total repair effort

Most architecturally complex defects involve an architectural hotspot ndash a badly-built component that should be replaced

Click to edit Master title style Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on the its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Factors Affecting Structural Quality

2017 CRASH Report. Request from info@castsoftware.com

Percentage of variation in structural quality scores accounted for by various factors:

Factor       # apps   Robust   Security   Perform   Change   Transfer
Technology    1606      20        3         16        26        7
Industry       677       5        5          7         4        6
App type       510       1        1          1         1        1
Source         580       1        1          4   (other cells blank in source)
Shore          540       (no cells reported)
Maturity        72      28       25         15        24       12
Method         208      10        6         14         6   (one cell blank in source)
Team size      186       7        5          8         8   (one cell blank in source)
# of users     262       4        3          5         4   (one cell blank in source)

The Technical Debt Metaphor

Structural quality problems in production code constitute Technical Debt: the future cost of fixing defects remaining in code at release, a component of the cost of ownership.

- Principal: the cost of fixing problems remaining in the code after release that must be remediated.
- Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
- Business risk (liability from debt):
  - Liability: business costs related to outages, breaches, corrupted data, etc.
  - Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
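The principal can be approximated with a parametric model in the spirit of Curtis et al. (2012); the violation counts, triage percentages, fix times, and labor rate below are hypothetical placeholders, not figures from the paper:

```python
# Hedged sketch of a parametric technical-debt principal estimate:
# principal = sum over severity tiers of
#   (violations * fraction actually fixed * hours per fix * labor rate).
def debt_principal(violations_by_severity, pct_to_fix, hours_per_fix, rate_per_hour):
    return sum(
        count * pct_to_fix[sev] * hours_per_fix[sev] * rate_per_hour
        for sev, count in violations_by_severity.items()
    )

# Hypothetical inputs for one application.
violations = {"high": 100, "medium": 500, "low": 2000}
pct_to_fix = {"high": 0.50, "medium": 0.25, "low": 0.10}   # triage policy
hours_per_fix = {"high": 2.5, "medium": 1.0, "low": 0.5}
principal = debt_principal(violations, pct_to_fix, hours_per_fix, rate_per_hour=75)
# 125h + 125h + 100h = 350 hours, at $75/hour -> $26,250 of principal
```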

Tech Debt by Quality Factor

[Chart: distribution of technical debt across health factors: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%.]

70% of Technical Debt is in IT Cost (Transferability, Changeability).

30% of Technical Debt is in Business Risk (Robustness, Performance, Security).

Health Factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice Hall.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort – Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3


Section 2: Measuring to Manage Software Projects

1) Tracking traditional projects
2) Iterative and Agile measurement
3) Team Software Process measurement

Measuring Project Progress

[Chart: planned rate of code growth versus actual code growth, 0-300 KLOC, Jan-Jun, with the planned completion of coding marked.]

"The measures say we are on schedule."

[Chart: the same curves, with actual code growth continuing past the planned completion of coding.]

"But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (total, open, closed), 0-150, Jan-Jun, against the completion of coding.]

"And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements affected, and person-hours of work or rework, per month.]

Month   Added   Changed   Deleted
Feb       2       20        1
Mar       2       11        1
Apr      45       35        3
May      15       61        1
Jun       1       20        4
Jul      15       15        0
Aug      18      0.5        2

Phase Defect Removal Model

Defects per KSLOC:

PHASE         Escapes from     Defects     Subtotal   Removal          Defects    Escapes at
              previous phase   injected               effectiveness    removed    phase exit
Rqmts                0            1.2         1.2          0%             0           1.2
Design             1.2           18.0        19.2         76%           14.6          4.6
Code & UT          4.6           15.4        20.0         71%           14.2          5.8
Integration        5.8             0          5.8         67%            3.9          1.9
Sys Test           1.9             0          1.9         58%            1.1          0.8
Operation          0.8

Kan (1995), Metrics and Models in Software Quality Engineering
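The model is simple enough to execute directly; this sketch reproduces the table's arithmetic, with defects per KSLOC flowing forward and each phase removing a fraction of everything present:

```python
# Phase defect-removal model: each phase receives escapes from the previous
# phase plus newly injected defects, removes a fraction (its removal
# effectiveness), and passes the rest on. All rates are defects per KSLOC.
phases = [            # (name, injected per KSLOC, removal effectiveness)
    ("Rqmts",       1.2,  0.00),
    ("Design",     18.0,  0.76),
    ("Code & UT",  15.4,  0.71),
    ("Integration", 0.0,  0.67),
    ("Sys Test",    0.0,  0.58),
]

escapes = 0.0
for name, injected, effectiveness in phases:
    subtotal = escapes + injected
    removed = subtotal * effectiveness
    escapes = subtotal - removed
    print(f"{name:12s} subtotal={subtotal:5.2f} removed={removed:5.2f} escapes={escapes:5.2f}")
# Escapes into operation come out to roughly 0.8 defects/KSLOC, as in the table.
```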

SCRUM Board

- The Scrum board displays the path to completion for each story by showing task status.
- Story points are an estimate of days for each story.
- Agile estimating is by feature (story) rather than by task.
- 'Done' does not always mean all tasks are finished.
- Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

- A burndown chart displays story completion by day, indicating actual versus estimated progress.
- Velocity is the rate at which stories are completed; it indicates sustainable progress.
- Velocity results provide one source of historical data for improving estimates.
- Story points without the context of productivity factors are dangerous for estimating.
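A minimal sketch of the velocity arithmetic, using a hypothetical sprint history (and subject to the caveat above that points without productivity-factor context are risky for estimating):

```python
# Velocity = average story points completed per sprint; a forecast is then
# just backlog size divided by velocity, rounded up to whole sprints.
import math

completed_points_per_sprint = [21, 25, 19, 23]   # hypothetical history

velocity = sum(completed_points_per_sprint) / len(completed_points_per_sprint)

backlog_points = 110
sprints_needed = math.ceil(backlog_points / velocity)
# velocity = 22.0 points/sprint -> 5 sprints to burn down 110 points
```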

Analyzing Velocity

[Charts over a 20-day sprint: burndown chart of stories, hours, and hours per story point (0-60); team velocity in story points per hour (0-5); hours per story point (0-5); item cycle times (0-20 days).]

Measuring and Managing Flow

[Cumulative Flow Diagram: backlogged, in-progress, and released items over time; the horizontal gap between the bands gives days to delivery, with outliers visible.]

Anderson (2010), Kanban
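The bookkeeping behind a cumulative flow diagram can be sketched with daily state counts; average work-in-progress and throughput then give an approximate time to delivery via Little's Law (the counts below are hypothetical):

```python
# Cumulative-flow sketch: daily (backlogged, in_progress, released) counts.
# Throughput = released items per day; Little's Law (W = L / lambda) turns
# average WIP and throughput into an approximate days-to-delivery.
daily_counts = [   # hypothetical board snapshots, one per day
    (40, 5, 0),
    (38, 6, 1),
    (35, 7, 3),
    (33, 6, 6),
    (30, 6, 9),
]

days = len(daily_counts) - 1
throughput = (daily_counts[-1][2] - daily_counts[0][2]) / days     # items/day
avg_wip = sum(d[1] for d in daily_counts) / len(daily_counts)      # avg in progress
days_to_delivery = avg_wip / throughput
# throughput = 2.25 items/day, average WIP = 6.0 -> about 2.7 days in progress
```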

Trends Over Sprints

[Charts: story points delivered (0-80) and technical debt (0-4) across 12 sprints.]

- Track and correlate multiple measures across sprints.
- Build predictive models from the trends.

Sources and Measures

Measures:
- Stories submitted
- Stories pulled back
- Growth in size of stories
- Stories added mid-sprint
- Stories under review
- Failed tests
- Non-development tasks
- Assists to other projects

Data sources:
- Project tracking
- Code control
- Bug tracking
- Build system
- Test environment
- Deployment tools
- Operational monitoring

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process:
- PSP0: current process, time recording, defect recording, defect type standards
- PSP0.1: coding standard, size measurement, process improvement proposal

Personal Planning:
- PSP1: size estimating, test report
- PSP1.1: task planning, schedule planning

Personal Quality Management:
- PSP2: code reviews, design reviews
- PSP2.1: design templates

Cyclic Personal Process:
- PSP3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

- Built from personal processes of team members
- Well-defined team roles
- Project launch workshops for planning
- Team measurement and tracking
- Post-mortems for improvement
- Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals: R² values for aggregated individual estimates range from .70 to .87, versus R² = .53 for the team-level estimate.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

[Chart: where defects are found: personal reviews 33, compile 33, team reviews 19, unit test 15, test 14.]

Fagan & Davis (2005)

TSP Benefits

Dramatic reductions in:
- Delivered defects
- Days per KLOC
- Schedule deviation
- Test defects/KLOC
- Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

- Level 5, Optimizing: innovation management
- Level 4, Quantitatively Managed: capability management
- Level 3, Defined: process management
- Level 2, Managed: project management
- Level 1, Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

- Level 5, Innovating: innovation management
- Level 4, Optimized: capability management
- Level 3, Standardized: organizational management
- Level 2, Stabilized: work unit management
- Level 1, Initial: inconsistent management

Maturity Level Transformation

- Level 1: Ad hoc, inconsistent development practices (relationship culture, built on trust)
- Level 2: Project managers stabilize projects (local culture)
- Level 3: Organization develops standard processes (common culture)
- Level 4: Process and results managed statistically (precision culture)
- Level 5: Proactive, opportunistic improvements (empowered, agile culture)

[Diagram: the transformation proceeds across individual, project, and organization levels.]

Measurement by Level

- Level 5 (exploratory data): Δt = N × ROI
- Level 4 (predictive data): X̄ ± 1.96σ/√n
- Level 3 (process data): 8.3 defects per review
- Level 2 (management data): Effort = α·LOC^β
- Level 1 (haphazard data): "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
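The Level 4 entry is a 95% confidence interval on a process measure; a small worked example with hypothetical review data:

```python
# 95% confidence interval for a process measure: mean +/- 1.96 * sigma / sqrt(n).
import math

data = [8.1, 7.4, 9.0, 8.6, 7.9, 8.4, 8.8, 7.8]   # defects per review (hypothetical)
n = len(data)
mean = sum(data) / n
sigma = math.sqrt(sum((x - mean) ** 2 for x in data) / n)   # population std dev
half_width = 1.96 * sigma / math.sqrt(n)
interval = (mean - half_width, mean + half_width)
# mean = 8.25 defects per review, with a CI of roughly +/- 0.35
```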

Average Improvement Results

Improvement                 # Orgs   Annual median   Annual range
Productivity growth            4          35%            9-67%
Pre-test defect detection      3          22%            6-25%
Time to market                 2         ↓19%           15-23%
Field error reports            5         ↓39%           10-94%
Return on investment           5         5.0:1        4.1:1-8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed / budgeted cost of work scheduled; behind vs. ahead around 1.0) rising across CMM maturity levels 1-3.]

Air Force study of contractors. Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (budgeted cost of work performed / actual cost of work performed; overrun vs. underrun around 1.0) rising across CMM maturity levels 1-3.]

Air Force study of contractors. Flowe & Thordahl (1994)
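The two indices charted here are standard earned-value ratios; a minimal sketch with hypothetical project figures:

```python
# Earned-value ratios: SPI = BCWP / BCWS (schedule), CPI = BCWP / ACWP (cost).
# A value of 1.0 means on plan; below 1.0 means behind schedule / over budget.
def spi(bcwp, bcws):
    """Schedule Performance Index: work performed vs. work scheduled, in budget dollars."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost Performance Index: budgeted vs. actual cost of the work performed."""
    return bcwp / acwp

# A hypothetical low-maturity project: behind schedule and over budget.
bcwp, bcws, acwp = 400_000, 500_000, 640_000
schedule_index = spi(bcwp, bcws)   # 0.8   -> behind schedule
cost_index = cpi(bcwp, acwp)       # 0.625 -> significant cost overrun
```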

Crosby's Cost of Quality Model

Project cost splits into performance cost and cost of quality; cost of quality splits into conformance (prevention + appraisal) and nonconformance:

- Performance: generating plans and documentation; development (requirements, design, code, integration).
- Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits.
- Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.
- Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free

Raytheon's Cost of Quality

- Performance: cost of building it right the first time
- Nonconformance: cost of rework
- Appraisal: cost of testing
- Prevention: cost of preventing nonconformance

Year   CMM Level   Performance   Nonconformance   Appraisal + Prevention
1988       1           34%             41%           15% + 7%
1990       2           55%             18%           15% + 12%
1992       3           66%             11%           23% combined
1994       4           76%              6%           18% combined (7.5% appraisal)

Dion (1993); Haley et al. (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988-1994, on a scale of 0-4x.]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: software cost reduction relative to the 1989 baseline, 1988-1994, on a scale of 0-1.2.]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: quarterly actual versus budgeted cost, 1988-1994; over/under-run (+50% to -10%) converging toward zero as maturity increased.]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0-3,000) and % of reuse (0-100%), 1987-1994.]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

- Process focus: process improvement (Six Sigma)
- Product focus: product improvement (Design for 6σ)

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits. [Diagram: a pyramid of projects, from huge benefits down through large, good, okay, and small benefits; "Now what?"] Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." (Jackson, 2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." (Spinellis, 2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data, producing quality measurements on five health factors: Robustness, Performance, Security, Changeability, Transferability.

Examples of detected violations:
- Expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
- Empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
- SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
- Unstructured code; misuse of inheritance; lack of comments; violated naming convention
- Highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, Javascript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: data flow and transaction risk traced across the stack: JSP, ASP.NET, APIs, Java, Web services, .NET, Struts, Spring, Hibernate, EJB, messaging, COBOL, PL/SQL, T-SQL, Oracle, SQL Server, DB2, Sybase, IMS.]

Quality must be measured at three levels:

- Unit Level 1 (developer level), within a single language/technology layer: code style and layout, expression complexity, code documentation, class or program design, basic coding standards.
- Technology Level 2 (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.
- System Level 3 (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function points, effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

- Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct.
- [Chart: architecturally complex defects are 8% of total application defects (code unit-level violations are 92%) but account for 52% of total repair effort (unit-level violations account for 48%).]
- Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 342 in Component A is only narrowly propagated in the system; it presents low risk. The impact of violation 284 in Component B is very widely propagated throughout the system; it presents very high risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
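A propagated-risk calculation of this kind can be sketched as graph reachability; this is an illustration rather than CAST's algorithm, and the component graph is hypothetical (the severity echoes violation 284's entry in the prioritization table):

```python
# Illustrative Propagated Risk Index: weight a violation's severity by how
# many components its host component can transitively impact.
from collections import deque

# component -> components that depend on it (hypothetical system)
depends_on = {
    "A": [],
    "B": ["C", "D"],
    "C": ["E", "F"],
    "D": ["G"],
    "E": [], "F": [], "G": [],
}

def reach(component):
    """Breadth-first count of components transitively impacted."""
    seen, queue = set(), deque([component])
    while queue:
        node = queue.popleft()
        for dep in depends_on.get(node, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return len(seen)

def propagated_risk_index(severity, component):
    # the host component itself counts once, plus everything it can impact
    return severity * (1 + reach(component))

risk_b = propagated_risk_index(7, "B")   # widely propagated: 7 * (1 + 5) = 42
risk_a = propagated_risk_index(7, "A")   # narrowly propagated: 7 * (1 + 0) = 7
```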

  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2

PHASE

Escapes Previous Phase

KSLOC

Defect Injection

KSLOC

Subtotal

KSLOC

Removal

Effective-ness

Defect Removal

KSLOC

Escapes at phase exit KSLOC

Rqmts

0

12

12

0

0

12

Design

12

180

192

76

146

46

Code amp UT

46

154

200

71

142

58

Inte-gration

58

0

58

67

39

19

Sys Test

19

0

19

58

11

08

Opera-tion

08

Requirements Added Requirements Changed Requirements Deleted
Feb 2 20 1
Mar 2 11 1
Apr 45 35 3
May 15 61 1
Jun 1 20 4
Jul 15 15 0
Aug 18 05 2
Feb Feb Feb
Mar Mar Mar
Apr Apr Apr
May May May
Jun Jun Jun
Jul Jul Jul
Aug Aug Aug
Page 28: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Measuring Project Progress

[Chart: planned rate of code growth vs. actual code growth (0K-300K, Jan-Jun), with the planned completion of coding marked]

"The measures say we are on schedule."

[Chart: the same axes, but actual code growth keeps climbing past the planned completion of coding]

"But the code keeps growing. We should have measured the requirements growth."

[Chart: defect reports (total, open, closed; 0-150, Jan-Jun) against the planned completion of coding]

"And all the defect reports remaining to be fixed will keep us far behind schedule."

Requirements Change Impact

[Chart: number of requirements affected and person-hours of work or rework, by month]

Month   Requirements Added   Requirements Changed   Requirements Deleted
Feb              2                    20                     1
Mar              2                    11                     1
Apr             45                    35                     3
May             15                    61                     1
Jun              1                    20                     4
Jul             15                    15                     0
Aug             18                    05                     2

Phase Defect Removal Model

PHASE         Escapes from      Defects       Subtotal/   Removal          Defects      Escapes at
              previous phase/   injected/     KSLOC       effectiveness    removed/     phase exit/
              KSLOC             KSLOC                                      KSLOC        KSLOC
Rqmts              0               1.2           1.2          0%              0             1.2
Design             1.2            18.0          19.2         76%             14.6           4.6
Code & UT          4.6            15.4          20.0         71%             14.2           5.8
Integration        5.8             0             5.8         67%              3.9           1.9
Sys Test           1.9             0             1.9         58%              1.1           0.8
Operation          0.8

Kan (1995), Metrics and Models in Software Quality Engineering.
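The table's arithmetic composes simply: each phase's subtotal is prior escapes plus newly injected defects, removal effectiveness takes its share, and the rest escapes to the next phase. A minimal sketch (defects/KSLOC values taken from the table; the function name and structure are ours):

```python
# Phase defect-removal arithmetic from the table above.
phases = [
    # (name, injected defects per KSLOC, removal effectiveness)
    ("Rqmts",       1.2,  0.00),
    ("Design",      18.0, 0.76),
    ("Code & UT",   15.4, 0.71),
    ("Integration", 0.0,  0.67),
    ("Sys Test",    0.0,  0.58),
]

def phase_defect_model(phases):
    rows, escapes = [], 0.0
    for name, injected, eff in phases:
        subtotal = escapes + injected        # escapes in + newly injected
        removed = subtotal * eff             # removed during this phase
        escapes = subtotal - removed         # carried into the next phase
        rows.append((name, round(subtotal, 1), round(removed, 1), round(escapes, 1)))
    return rows, escapes

rows, delivered = phase_defect_model(phases)
# 'delivered' is the defect density escaping into operation (~0.8/KSLOC here)
```

Running it reproduces the table's rows, which is a useful sanity check when you substitute your own injection and effectiveness numbers.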

SCRUM Board

The Scrum board displays the path to completion for each story by showing task status. Story points are an estimate of days for each story. Agile estimating is by feature (story) rather than by task. 'Done' does not always mean all tasks are finished. Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating
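The velocity arithmetic behind these points is small enough to sketch; all numbers below are hypothetical, and per the caveat above they only forecast well when the productivity context stays stable:

```python
# Velocity and a naive completion forecast from sprint history.
import math

completed = [23, 31, 27, 30, 29]   # story points delivered per sprint (hypothetical)
velocity = sum(completed) / len(completed)     # average points per sprint
backlog = 140                                  # estimated points remaining (hypothetical)
sprints_needed = math.ceil(backlog / velocity) # round up: partial sprints still cost a sprint
```

With this history velocity is 28 points/sprint, so a 140-point backlog forecasts five more sprints.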

Team Velocity

[Charts: story points per hour by day (days 1-20, scale 0-5) and hours per story point (Hrs/StryPt) by day (days 1-20, scale 0-5)]

Analyzing Velocity

[Charts: item cycle times (scale 0-20); stories, hours, and hours per story point (scale 0-60); and a burndown chart, each over days 1-20]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in-progress, and released items over time; the horizontal gap between the lines shows days to delivery, with outliers visible]

Anderson (2010), Kanban.
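Flow measures relate through Little's Law: average work in progress equals throughput times cycle time, so measuring any two gives the third. A sketch with illustrative numbers:

```python
# Little's Law: WIP = throughput * cycle time (steady-state averages).
throughput = 4.0    # items released per day (hypothetical)
cycle_time = 6.0    # average days from 'in progress' to 'released' (hypothetical)

wip = throughput * cycle_time          # average items in progress
days_to_delivery = wip / throughput    # recover cycle time from observed WIP
```

This is why a widening in-progress band on a cumulative flow diagram predicts longer days to delivery even before any individual item is late.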

Trends Over Sprints

[Charts: story points delivered per sprint (sprints 1-12, scale 0-80) and technical debt per sprint (sprints 1-12, scale 0-4)]

Track and correlate multiple measures across sprints.

Predictive models
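The "track and correlate" step can start as simply as a Pearson correlation between two per-sprint series; the values below are hypothetical, shaped like the charts (delivery falling as debt grows):

```python
# Pearson correlation between two per-sprint series.
from math import sqrt

points = [70, 66, 60, 55, 48, 44, 40, 37]        # story points delivered per sprint
debt   = [1.0, 1.3, 1.7, 2.0, 2.4, 2.8, 3.1, 3.5] # technical debt measure per sprint

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(points, debt)   # strongly negative here: delivery drops as debt rises
```

A strong negative correlation like this is the kind of cross-measure signal that feeds the predictive models the slide mentions.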

Sources and Measures

Measures: stories submitted; stories pulled back; growth in size of stories; stories added mid-sprint; stories under review; failed tests; non-development tasks; assists to other projects.

Data sources: project tracking; code control; bug tracking; build system; test environment; deployment tools; operational monitoring.

Recommended Books

Personal Software Process (PSP)

PSP 0 (Baseline Personal Process): current process, time recording, defect recording, defect type standards.
PSP 0.1: coding standard, size measurement, process improvement proposal.
PSP 1 (Personal Planning): size estimating, test report.
PSP 1.1: task planning, schedule planning.
PSP 2 (Personal Quality Management): code reviews, design reviews.
PSP 2.1: design templates.
PSP 3 (Cyclic Personal Process): cyclic development.

Humphrey (1997), Introduction to the Personal Software Process.

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process.

Aggregated Personal Data Best

R² values for aggregated individual estimates range from .70 to .87, versus R² = .53 for a single team-level estimate.

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.
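One way to see why summing individual estimates can beat one pooled team estimate: a single average rate misprices any cycle whose workload mix differs from the average developer. A toy illustration, with invented per-developer rates and task loads:

```python
# Toy comparison: per-developer estimates summed vs. one pooled team rate.
rates = {"ann": 2.0, "bo": 3.5, "cy": 5.0}   # hours per unit of work (hypothetical)
tasks = {"ann": 40, "bo": 10, "cy": 30}      # units assigned this cycle (hypothetical)

# Each developer's own historical rate prices their own workload.
individual_est = sum(rates[d] * tasks[d] for d in tasks)   # 80 + 35 + 150 hours

# A single pooled rate ignores who is doing how much.
pooled_rate = sum(rates.values()) / len(rates)             # 3.5 hours/unit
team_est = pooled_rate * sum(tasks.values())

actual = individual_est            # in this toy, individual rates are exact
bias = abs(team_est - actual)      # hours of error introduced by averaging
```

Here the pooled estimate is off by 15 hours purely because the fast developer carries more of the load than the average assumes; real data adds noise on top of this structural bias.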

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development: personal reviews 33, compile 33, team reviews 19, unit test 15, test 14. Consequently, test changes from defect detection into correctness verification.

Fagan & Davis (2005).

TSP Benefits

Dramatic reductions:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006), TSP: Leading a Development Team.

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

61

CMMI Maturity Transitions

Level 5 Optimizing: innovation management
Level 4 Quantitatively Managed: capability management
Level 3 Defined: process management
Level 2 Managed: project management
Level 1 Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI.

Enhanced Maturity Framework

Level 5 Innovating: innovation management
Level 4 Optimized: capability management
Level 3 Standardized: organizational management
Level 2 Stabilized: work unit management
Level 1 Initial: inconsistent management

Maturity Level Transformation

Level 5: Proactive, opportunistic improvements (empowered, agile culture)
Level 4: Process and results managed statistically (precision culture)
Level 3: Organization develops standard processes (common culture)
Level 2: Project managers stabilize projects (local culture)
Level 1: Ad hoc, inconsistent development practices (relationship culture)

[The diagram also marks the widening span of change (individual, projects, organization) and rising trust as maturity increases.]

Measurement by Level

Level 5 — Exploratory data: Δt = N × ROI
Level 4 — Predictive data: x̄ ± 1.96σ/√n
Level 3 — Process data: 8.3 defects per review
Level 2 — Management data: Effort = α(LOC)^β
Level 1 — Haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
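The Level 4 entry is the standard 95% interval for a process mean, x̄ ± 1.96σ/√n. With illustrative numbers (say, defects found per review):

```python
# 95% confidence interval for a process mean: x̄ ± 1.96·σ/√n.
import math

n, mean, sigma = 25, 8.3, 2.0              # sample size, mean, std dev (hypothetical)
half_width = 1.96 * sigma / math.sqrt(n)   # 1.96 * 2.0 / 5 = 0.784
interval = (mean - half_width, mean + half_width)
```

With n = 25 the interval is roughly 7.5 to 9.1 defects per review; quadrupling the sample would halve the half-width, which is the sense in which Level 4 data becomes "predictive."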

Average Improvement Results

Improvement                  Orgs   Annual median   Annual range
Productivity growth            4        35%            9-67%
Pre-test defect detection      3        22%            6-25%
Time to market                 2       ↓19%           15-23%
Field error reports            5       ↓39%           10-94%
Return on investment           5       5.0:1          4.1-8.8:1

Schedule Performance

[Chart: schedule performance index = budgeted cost of work performed ÷ budgeted cost of work scheduled (behind < 1.0 < ahead), by CMM maturity level 1-3, scale 0-1.6]

Air Force study of contractors. Flowe & Thordahl (1994).

Cost Performance

[Chart: cost performance index = budgeted cost of work performed ÷ actual cost of work performed (overrun < 1.0 < underrun), by CMM maturity level 1-3, scale 0-2.5]

Air Force study of contractors. Flowe & Thordahl (1994).
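Both charts use standard earned-value indices: SPI = BCWP/BCWS for schedule and CPI = BCWP/ACWP for cost. A sketch with hypothetical dollar figures:

```python
# Earned-value indices behind the two charts:
#   SPI = BCWP / BCWS (schedule), CPI = BCWP / ACWP (cost); > 1.0 is good.
bcwp = 900_000    # budgeted cost of work performed (earned value), hypothetical
bcws = 1_000_000  # budgeted cost of work scheduled
acwp = 1_200_000  # actual cost of work performed

spi = bcwp / bcws   # 0.9  -> behind schedule
cpi = bcwp / acwp   # 0.75 -> over budget
```

This project has earned 90% of the work it planned by now (SPI 0.9) while spending a third more than that work was budgeted to cost (CPI 0.75).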

Crosby's Cost of Quality Model

Project cost = performance cost + cost of quality (conformance + nonconformance).

Performance: generating plans and documentation; development (requirements, design, code, integration).

Conformance, appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits.

Conformance, prevention: training; policies, procedures, and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free.

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   Level   Performance   Nonconformance   Appraisal   Prevention
1988     1         34              41             15           7
1990     2         55              18             15          12
1992     3         66              11             (appraisal + prevention: 23)
1994     4         76               6             (appraisal + prevention: 18)

Dion (1993); Haley (1995).
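Crosby's categories make the table checkable: conformance cost is appraisal plus prevention, and total cost of quality adds nonconformance (rework). Applying that to the first two rows above:

```python
# Cost-of-quality arithmetic (percent of total project cost).
rows = {
    1988: {"perform": 34, "nonconf": 41, "appraise": 15, "prevent": 7},
    1990: {"perform": 55, "nonconf": 18, "appraise": 15, "prevent": 12},
}

def cost_of_quality(r):
    conformance = r["appraise"] + r["prevent"]
    return conformance + r["nonconf"]   # total CoQ = conformance + nonconformance

coq_1988 = cost_of_quality(rows[1988])  # quality consumed 63% of project cost
coq_1990 = cost_of_quality(rows[1990])  # down to 45% two years later
```

The drop from 63% to 45% between 1988 and 1990 is the cost-of-quality reduction the Raytheon story is known for.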

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988-1994, scale 0-4]

Lydon (1995).

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988-1994, scale 0-1.2]

Lydon (1995).

Raytheon's Cost Predictability

[Chart: actual vs. budgeted cost by quarter, 1988-1994, with over-runs and under-runs spanning -10% to +50% early on and converging toward budget over time]

Haley (1995).

OMRON's Reuse Gains

[Chart: 1987-1994, program size in steps (0-3,000) and percent reuse (0-100%)]

Sakamoto et al. (1996).

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

89

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." – Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. Compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in the CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus: process improvement (Six Sigma). Product focus: product improvement (Design for 6σ). A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, large benefits, good benefits, okay benefits, small benefits... now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

Jackson, D. (2009); Spinellis, D. (2006).

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality measurements: transferability, changeability, robustness, performance, security.

Detected violations (examples): expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table; empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed; SQL injection; cross-site scripting; buffer overflow; uncontrolled format string; unstructured code; misuse of inheritance; lack of comments; violated naming convention; highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity.

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: a transaction's data flow runs from user-interface technologies (Java, JSP, ASP.NET, APIs, web services) through frameworks (Hibernate, Spring, Struts, .NET) and business logic (EJB, PL/SQL, T-SQL, COBOL, messaging) to databases (Oracle, SQL Server, DB2, Sybase, IMS); architectural compliance and transaction risk cut across all layers]

Unit Level 1 (developer level): code style and layout, expression complexity, code documentation, class or program design, basic coding standards. Single language/technology layer.

Technology Level 2 (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

System Level 3 (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function points, effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

An architecturally complex defect is a structural flaw involving interactions among multiple components that reside in different application layers. These defects are a primary cause of operational problems and take 20x as many fixes to correct.

Architecturally complex defects account for 8% of total application defects but 52% of total repair effort; code unit-level violations account for the remaining 92% of defects and 48% of repair effort.

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 342 in Component A is only narrowly propagated in the system; it presents low risk. The impact of violation 284 in Component B is very widely propagated throughout the system; it presents very high risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
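One simple way to realize such a ranking is to score each violation by severity times the number of components its impact reaches. A hypothetical sketch (the violation numbers echo the slide; severities and impact counts are invented):

```python
# Toy propagated-risk ranking: score = severity * breadth of impact.
violations = [
    # (violation id, severity 1-9, components reached by its impact)
    (342, 7, 3),    # severe but narrowly propagated
    (284, 7, 40),   # same severity, widely propagated
    (116, 9, 5),
]

ranked = sorted(violations, key=lambda v: v[1] * v[2], reverse=True)
top = ranked[0][0]   # violation 284: 7 * 40 = 280 dominates despite equal severity
```

Severity alone would tie 342 and 284; weighting by propagation is what pushes 284 to the top of the remediation list.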

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path, from entry/exit in the user-interface layer, through processing in the business-logic layer, to fulfillment in the data-storage layer. Transactions can be ranked by risk to prioritize their remediation (in the slide's example, transaction 1 poses the greatest risk). The Transaction Risk Index can be further tuned using the operational frequency of each transaction.

Risk-Based Prioritization

Violations are prioritized by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor      Severity
  148       Security              9
  371       Robustness            9
  062       Performance           8
  284       Robustness            7
  016       Changeability         7
  241       Security              5
  173       Transferability       4
  359       Transferability       3

Static Analysis in US DoD

Testing finds less than 1/3 of structural defects. Static analysis before testing is 85% efficient. With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013).

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter chart: production incidents (0-180) against Technical Quality Index (1.00-4.00) for business-critical applications at a large international investment bank; annotated losses range from more than $2,000,000 per year at low quality down to $50,000 and $5,000 per year at higher quality]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com). Columns, where complete: Robustness, Security, Performance, Changeability, Transferability.

Technology (1,606 apps; Java-EE): 20, 3, 16, 26, 7
Industry (677): 5, 5, 7, 4, 6
App type (510): 1, 1, 1, 1, 1
Source (580): 1, 1, 4
Shore (540):
Maturity (72): 28, 25, 15, 24, 12
Method (208): 10, 6, 14, 6
Team size (186): 7, 5, 8, 8
# of users (262): 4, 3, 5, 4

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code constitute the debt.

Principal: the cost of fixing problems remaining in the code after release that must be remediated.
Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
Business risk (liability from debt): business costs related to outages, breaches, corrupted data, etc.
Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software.
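A principal estimate in the spirit of Curtis et al. (2012) multiplies, for each severity band, the violations expected to be fixed by hours per fix and cost per hour. Every parameter below is invented for illustration; real estimates calibrate them from remediation history:

```python
# Technical-debt principal: sum over severity bands of
#   count * share expected to be fixed * hours per fix * labor rate.
violations = {"high": 120, "medium": 480, "low": 900}     # counts at release
fix_share = {"high": 0.5, "medium": 0.25, "low": 0.1}     # fraction management will fix
hours_per_fix = 1.0                                       # average remediation effort
rate = 75.0                                               # dollars per hour

principal = sum(violations[s] * fix_share[s] * hours_per_fix * rate
                for s in violations)
```

The severity-dependent fix shares encode the usual triage: most high-severity violations get fixed, most low-severity ones never do, so they contribute little principal.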

Tech Debt by Quality Factor

Transferability 40%, changeability 30%, robustness 18%, security 7%, performance 5%.

70% of technical debt is in IT cost (transferability, changeability); 30% of technical debt is in business risk (robustness, performance, security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software.

Join CISQ: www.it-cisq.org

References

115

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

116

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

117

References 2

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort — Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 2

Page 29: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Requirements Change Impact

Number of requirements

affected

Chart6

Requirements Added
Requirements Changed
Requirements Deleted
Person hours of work or rework
2
20
1
2
11
1
45
35
3
15
61
1
1
20
4
15
15
0
18
05
2

Sheet1

Sheet1

Requirements Added
Requirements Changed
Requirements Deleted
Person hours of work or rework

Sheet2

Sheet3

Click to edit Master title style Phase Defect Removal Model

PHASE

Escapes Previous Phase KSLOC

Defect Injection KSLOC

Subtotal KSLOC

Removal Effective-ness

Defect Removal KSLOC

Escapes at phase exit KSLOC

Rqmts 0 12 12 0 0 12

Design 12 180 192 76 146 46

Code amp UT 46 154 200 71 142 58

Inte-gration 58 0 58 67 39 19

Sys Test 19 0 19 58 11 08

Opera-tion 08

Kan (1999) Metrics and Model in Software Quality Engineering

Click to edit Master title style SCRUM Board Scrumboard displays

the path to completion for each story by showing task status

Story points are an estimate of days for each story

Agile estimating is by feature (story) rather than by task

lsquoDonersquo does not always mean all tasks are finished

Kanban methods restrict the work in progress to a limited number of paths

Click to edit Master title style Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating

Click to edit Master title style Team Velocity

0

1

2

3

4

5

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20Days

Story Points per Hour

0

1

2

3

4

5

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

HrsStryPt

Analyzing Velocity

Item Cycle Times

02468

101214161820

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Stories Hours HrsStryPt

0

10

20

30

40

50

60

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Burndown Chart

Days

Click to edit Master title style Measuring and Managing Flow

Released

In progress

Backlogged

Cumulative Flow Diagram Days to delivery

outliers

Anderson (2010) Kanban

Trends Over Sprints

[Charts: story points delivered per sprint and technical debt per sprint, tracked across 12 sprints]

Track and correlate multiple measures across sprints.

Predictive models.

Sources and Measures

Measures:

Stories submitted
Stories pulled back
Growth in size of stories
Stories added mid-sprint
Stories under review
Failed tests
Non-development tasks
Assists to other projects

Data sources:

Project tracking
Code control
Bug tracking
Build system
Test environment
Deployment tools
Operational monitoring

Recommended Books

Personal Software Process (PSP)

PSP 0 (Baseline Personal Process): current process; time recording; defect recording; defect type standards
PSP 0.1: coding standard; size measurement; process improvement proposal

PSP 1 (Personal Planning): size estimating; test report
PSP 1.1: task planning; schedule planning

PSP 2 (Personal Quality Management): code reviews; design reviews
PSP 2.1: design templates

PSP 3 (Cyclic Personal Process): cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

Aggregated individual estimates: R²s range from .70 to .87. A single team-level estimate: R² = .53.

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defects detected by activity: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%.

Fagan & Davis (2005)

TSP Benefits

Dramatic reductions in:
- Delivered defects
- Days per KLOC
- Schedule deviation
- Test defects/KLOC
- Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing: innovation management
Level 4 Quantitatively Managed: capability management
Level 3 Defined: process management
Level 2 Managed: project management
Level 1 Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5 Innovating: innovation management
Level 4 Optimized: capability management
Level 3 Standardized: organizational management
Level 2 Stabilized: work unit management
Level 1 Initial: inconsistent management

Maturity Level Transformation

Level 5: Opportunistic, proactive improvements / empowered, agile culture
Level 4: Process and results managed statistically / precision culture
Level 3: Organization develops standard processes / common culture
Level 2: Project managers stabilize projects / local culture
Level 1: Ad hoc, inconsistent development practices / relationship culture

The scope of change grows from individual trust, through projects, to the organization.

Measurement by Level

Level 5 (exploratory data): ∆t = N × ROI
Level 4 (predictive data): X̄ ± 1.96σ/√n
Level 3 (process data): 8.3 defects per review
Level 2 (management data): Effort = α · LOC^β
Level 1 (haphazard data): "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
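The Level 4 and Level 2 formulas can be made concrete; the parameter values below are illustrative assumptions, not calibrated coefficients:

```python
import math

# Level 4, predictive data: mean +/- 1.96*sigma/sqrt(n)
# (95% interval, normality assumed)
def ci95(mean, sigma, n):
    half_width = 1.96 * sigma / math.sqrt(n)
    return mean - half_width, mean + half_width

# Level 2, management data: Effort = alpha * LOC**beta
# alpha and beta here are invented, not calibrated values.
def effort(loc, alpha=3.2, beta=1.05):
    return alpha * loc ** beta

low, high = ci95(mean=8.3, sigma=2.0, n=16)
print(low, high)  # interval around a mean of 8.3 defects per review
```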

Average Improvement Results

Improvement benefit          # Orgs   Annual median   Annual range
Productivity growth             4          35%           9-67%
Pre-test defect detection       3          22%           6-25%
Time to market                  2         ↓19%          15-23%
Field error reports             5         ↓39%          10-94%
Return on investment            5         5.0:1       4.1:1-8.8:1

Schedule Performance

[Chart: schedule performance index (0 to 1.6) by CMM maturity level 1-3]

Schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled (below 1 is behind schedule, above 1 is ahead).

Air Force study of contractors. Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (0 to 2.5) by CMM maturity level 1-3]

Cost performance index = budgeted cost of work performed / actual cost of work performed (below 1 is an overrun, above 1 an underrun).

Air Force study of contractors. Flowe & Thordahl (1994)
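The two indices charted in these Air Force results are standard earned-value ratios; a minimal sketch with made-up project figures:

```python
# Earned-value indices:
#   SPI = BCWP / BCWS   (< 1 behind schedule, > 1 ahead)
#   CPI = BCWP / ACWP   (< 1 cost overrun,   > 1 underrun)
def spi(bcwp, bcws):
    return bcwp / bcws

def cpi(bcwp, acwp):
    return bcwp / acwp

# Illustrative project: 90 units of budgeted work performed,
# 100 units scheduled, 80 units actually spent.
print(spi(90, 100), cpi(90, 80))  # 0.9 1.125
```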

Crosby's Cost of Quality Model

Project cost splits into performance and cost of quality.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Cost of quality, conformance:
- Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting
- Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits

Cost of quality, nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks

Dion (1993); Crosby (1979), Quality Is Free
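Crosby's split can be sketched as bookkeeping over cost categories; the dollar figures below are invented for illustration:

```python
# Crosby-style cost-of-quality bookkeeping; figures are invented.
costs = {
    "performance":    670,  # building it right the first time
    "prevention":      40,  # training, methods, tools, planning
    "appraisal":      110,  # reviews, walkthroughs, first-time testing
    "nonconformance": 180,  # rework, re-tests, patches, failures
}

conformance = costs["prevention"] + costs["appraisal"]
cost_of_quality = conformance + costs["nonconformance"]
total = sum(costs.values())

print(round(100 * cost_of_quality / total))  # cost of quality as % of project cost
```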

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   Level   Perform.   Nonconf.   Appraise   Prevent
1988     1       34%        41%        15%        7%
1990     2       55%        18%        15%        12%
1992     3       66%        11%         —          —
1994     4       76%         6%        7.5%        —

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth, 1988-1994, relative to a 1989 baseline]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction, 1988-1994, relative to a 1989 baseline]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual versus budgeted cost by quarter, 1988-1994, with over-runs up to 50% early on shrinking toward budget]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0-3,000) and % of reuse (0-100%), 1987-1994]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus: process improvement via Six Sigma. Product focus: product improvement via Design for 6σ. Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first come huge benefits, then large, good, okay, and small benefits. Now what?

Ultimately we run out of projects with benefits significant enough to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded… testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." (Jackson, 2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic… non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability… The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." (Spinellis, 2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality measurements: transferability, changeability, robustness, performance, security.

Detected violations (examples):
- Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
- Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
- Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
- Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
- Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: transaction data flow and architectural compliance across a stack of JSP, ASP.NET, APIs, Java/EJB, web services, Hibernate, Spring, Struts, .NET, COBOL, messaging, and databases (Oracle, SQL Server, DB2, Sybase, IMS, PL/SQL, T-SQL)]

Level 1, Unit (developer level): code style and layout; expression complexity; code documentation; class or program design; basic coding standards. Single language/technology layer.

Level 2, Technology (development team level): intra-technology architecture; intra-layer dependencies; inter-program invocation; security vulnerabilities.

Level 3, System (IT organization level): integration quality; architectural compliance; risk propagation; application security; resiliency checks; transaction integrity; function point and effort estimation; data access control; SDK versioning; calibration across technologies.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct. They account for only 8% of total application defects (code unit-level violations make up the other 92%), yet consume 52% of total repair effort (versus 48% for everything else).

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

The impact of violation 342 in Component A is only narrowly propagated in the system; it presents low risk. The impact of violation 284 in Component B is very widely propagated throughout the system; it presents very high risk.

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

Remediating violation 284 in Component B will have the greater impact on reducing risk and cost.
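A Propagated Risk Index in this spirit can be sketched as severity times breadth of impact; the violation data below are assumptions for illustration, not CAST's actual calibration:

```python
# Propagated Risk Index sketch: a violation's severity times the breadth
# of its downstream impact. Data are invented.
violations = {
    "342 (Component A)": {"severity": 7, "impacted_components": 3},
    "284 (Component B)": {"severity": 7, "impacted_components": 40},
}

def propagated_risk_index(v):
    return v["severity"] * v["impacted_components"]

ranked = sorted(violations,
                key=lambda name: propagated_risk_index(violations[name]),
                reverse=True)
print(ranked[0])  # fix the widely propagated violation first
```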

Transaction Risk

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path: entry/exit in the user-interface layer, processing in the business-logic layer, and fulfillment in the data-storage layer.

Transactions can be ranked by risk to prioritize their remediation; of the five transactions in the example, transaction 1 poses the greatest risk.

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path. The index can be further tuned using the operational frequency of each transaction.
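A Transaction Risk Index sketch under the same reading: severities summed along each path, optionally weighted by operational frequency. All severities and frequencies below are invented:

```python
# Transaction Risk Index sketch: sum violation severities along each
# transaction path, optionally weighted by operational frequency.
transactions = {
    "txn 1": {"severities": [9, 8, 7], "frequency": 0.40},
    "txn 2": {"severities": [5, 4],    "frequency": 0.35},
    "txn 3": {"severities": [3],       "frequency": 0.25},
}

def transaction_risk_index(t, weight_by_frequency=True):
    score = sum(t["severities"])
    return score * t["frequency"] if weight_by_frequency else score

riskiest = max(transactions, key=lambda k: transaction_risk_index(transactions[k]))
print(riskiest)  # remediate this transaction's path first
```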

Risk-Based Prioritization

Prioritize violations by severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3

Static Analysis in US DoD

Testing finds less than 1/3 of structural defects. Static analysis before testing is 85% efficient. With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents and Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0-180) against Technical Quality Index (1.00-4.00) for business-critical applications at a large international investment bank; the lowest-quality applications show losses above $2,000,000 per year, mid-range applications about $50,000, and the highest-quality about $5,000]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (blank cells: effect not reported):

Factor        # apps   Robust.   Security   Perform.   Change.   Transfer.
Technology     1,606     20         3          16         26         7
  (Java-EE)
Industry         677      5         5           7          4         6
App type         510      1         1           1          1         1
Source           580      1         1           4
Shore            540
Maturity          72     28        25          15         24        12
Method           208     10         6          14          6
Team size        186      7         5           8          8
# of users       262      4         3           5          4

2017 CRASH Report: request from info@castsoftware.com

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code create both technical debt (principal and interest) and business risk (liability and opportunity cost).

Principal: the cost of fixing problems remaining in the code after release that must be remediated.

Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Liability: business costs related to outages, breaches, corrupted data, etc.

Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
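A debt-principal estimate in the spirit of Curtis et al. (2012) sums, per severity band, the violations expected to be fixed times effort per fix times labor rate; every parameter below is an assumption for illustration, not the published calibration:

```python
# Technical-debt principal sketch: per severity band,
# violations expected to be fixed * hours per fix * labor rate.
def debt_principal(counts, fraction_fixed, hours_per_fix, rate_per_hour=75.0):
    return sum(counts[band] * fraction_fixed[band] * hours_per_fix[band] * rate_per_hour
               for band in counts)

counts         = {"high": 120, "medium": 400, "low": 900}   # violations found
fraction_fixed = {"high": 0.50, "medium": 0.25, "low": 0.10}
hours_per_fix  = {"high": 2.0,  "medium": 1.0,  "low": 0.5}

print(debt_principal(counts, fraction_fixed, hours_per_fix))  # dollars of principal
```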

Tech Debt by Quality Factor

Transferability 40%, changeability 30%, robustness 18%, security 7% (performance accounts for the remainder).

70% of technical debt is in IT cost (transferability, changeability); 30% is in business risk (robustness, performance, security).

Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development Using Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: a business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.



Click to edit Master title style Phase Defect Removal Model

PHASE

Escapes Previous Phase KSLOC

Defect Injection KSLOC

Subtotal KSLOC

Removal Effective-ness

Defect Removal KSLOC

Escapes at phase exit KSLOC

Rqmts 0 12 12 0 0 12

Design 12 180 192 76 146 46

Code amp UT 46 154 200 71 142 58

Inte-gration 58 0 58 67 39 19

Sys Test 19 0 19 58 11 08

Opera-tion 08

Kan (1999) Metrics and Model in Software Quality Engineering

Click to edit Master title style SCRUM Board Scrumboard displays

the path to completion for each story by showing task status

Story points are an estimate of days for each story

Agile estimating is by feature (story) rather than by task

lsquoDonersquo does not always mean all tasks are finished

Kanban methods restrict the work in progress to a limited number of paths

Click to edit Master title style Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating

Click to edit Master title style Team Velocity

0

1

2

3

4

5

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20Days

Story Points per Hour

0

1

2

3

4

5

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

HrsStryPt

Analyzing Velocity

Item Cycle Times

02468

101214161820

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Stories Hours HrsStryPt

0

10

20

30

40

50

60

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Burndown Chart

Days

Click to edit Master title style Measuring and Managing Flow

Released

In progress

Backlogged

Cumulative Flow Diagram Days to delivery

outliers

Anderson (2010) Kanban

Click to edit Master title style

0

10

20

30

40

50

60

70

80

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Sto

ry p

oin

ts d

eliv

ered

Trends Over Sprints

0

10

20

30

40

50

60

70

80

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Sto

ry p

oin

ts d

eliv

ered

0

05

1

15

2

25

3

35

4

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Tech

nic

al d

ebt

Track and correlate multiple measures

across sprints

Predictive models

Click to edit Master title style Sources and Measures

Stories submitted

Stories pulled back

Growth in size of stories

Stories added mid-sprint

Stories under review

Failed tests

Non-development tasks

Assists to other projects

Data sources

Measures Project tracking

Code control

Bug tracking

Build system

Test environment

Deployment tools

Operational monitoring

Click to edit Master title style Recommended Books

Click to edit Master title style Personal Software Process (PSP)

PSP 0 Current process Time recording

Defect recording Defect type standards

PSP 01 Coding standard

Size measurement Process improvement

proposal

Baseline Personal Process

Humphrey (1997) Introduction to the Personal Software Process

PSP 1 Size estimating

Test report

Personal Planning

PSP 2 Code reviews

Design reviews

Personal Quality Management

PSP 21 Cyclic development

Cyclic Personal Process

PSP 11 Task planning

Schedule planning

PSP 21 Design templates

Click to edit Master title style Team Software Process (TSP) bull Built from personal processes of team members

bull Well defined team roles

bull Project launch workshops for planning

bull Team measurement and tracking

bull Post-mortems for improvement

bull Team application of lean principles

Humphrey (2000) Introduction to the Team Software Process

Click to edit Master title style Aggregated Personal Data Best

R2s range from 70 to 87

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates rather than developing a single team estimate averaged over individuals

R2 = 53

Click to edit Master title style Defect Detection at Intuit

Consequently test changes from defect

detection into correctness verification

PSPTSP shifts defect detection from the test phase back up into development

Fagan (2005)

Personal reviews

33

Compile 33

Team reviews 19

Unit test 15

Test 14

Click to edit Master title style

Dramatic reductions

ndash Delivered defects

ndash Days per KLOC

ndash Schedule deviation

ndash Test defectskloc

ndash Variation in results

Humphrey (2006) TSP Leading a Development Team

TSP Benefits

Click to edit Master title style Section 3 Section 3 Measuring Software

Improvement Programs

1) Improvement program methods

2) Maturity-based measurement

3) Improvement program results

61

Click to edit Master title style CMMI Maturity Transitions

Level 5 Optimizing

Innovation management

Level 4 Quantitatively

Managed

Capability management

Level 3 Defined

Process management

Level 2 Managed

Project management

Level 1 Initial

Inconsistent management

Chrisis Konrad amp Shrum (2005) CMMI

Click to edit Master title style Enhanced Maturity Framework

Level 5 Innovating

Innovation management

Level 4 Optimized

Capability management

Level 3 Standardized

Organizational management

Level 2 Stabilized

Work unit management

Level 1 Initial

Inconsistent management

Click to edit Master title style Maturity Level Transformation

Level 5 Opportunistic improvements

Empowered culture

Level 5 Proactive

improvements Agile culture

Organization

Projects

Individual

Level 4 Process and results

managed statistically Precision culture

Level 3 Organization develops

standard processes Common culture

Level 2 Project managers stabilize projects

Local culture

Level 1 Ad Hoc inconsistent

development practices Relationship culture

Trust

Click to edit Master title style Measurement by Level

Level 5 mdash Exploratory data ∆ t = N x ROI

Level 4 mdash Predictive data X plusmn (196σ radicn )

Level 3 mdash Process data 83 defects per review

Level 2 mdash Management data Effort = α LOCβ

Level 1 mdash Haphazard data ldquoabout a million linesrdquo

Measure-maturity mismatch slows improvement and creates

dysfunction

Click to edit Master title style Average Improvement Results

Productivity growth 4 35 9 - 67

Pre-test defect detection 3 22 6 - 25

Time to market 2 darr 19 15 - 23

Field error reports 5 darr 39 10 - 94

Return on investment 5 501 41 - 881

Annual Annual Improvement benefit Orgs median range

Click to edit Master title style Schedule Performance

0

02

04

06

08

1

12

14

16

1 2 3

Schedule performance

index

CMM maturity level

Budgeted cost of work performed

Budgeted cost of work scheduled

behind ahead

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Cost Performance

Cost performance

index

0

05

1

15

2

25

1 2 3 CMM Maturity Level

Budgeted cost of work performed Actual cost of

work performed

overrun underrun

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Crosbyrsquos Cost of Quality Model Project Cost

Performance Cost of Quality Generating plans documentation Development - requirements - design - code - integration Prevention Appraisal

Reviews - system - requirements - design - test plan - test scripts Walkthroughs Testing (first time) Independent Verification and Validation (first time) Audits

Training Policies Procedures and methods Tools Planning Quality Improvement Projects Data Gathering and Analysis Root Cause Analysis Quality Reporting

Nonconformance Conformance Fixing Defects Reworking any Document Updating Source Code Re-reviews Re-tests Lab costs for re-tests Patches(Internal and External) Engineering Changes Change Control Boards External Failures and fines Customer Support Help Desks

Dion (1993) Crosby (1979) Quality Is Free

Click to edit Master title style Raytheons Cost of Quality

bull Performance mdash cost of building it right first time bull Nonconformance mdash cost of rework bull Appraisal mdash cost of testing bull Prevention mdash cost of preventing nonconformance

Dion (1993) Haley (1995)

23

18

Year Level Perform Nonconf Appraise Prevent

1988 1 34 41 15 7

1990 2 55 18 15 12

1992 3 66 11

1994 4 76 6 75

Click to edit Master title style Raytheonrsquos Productivity

0

1

2

3

4

1988 1989 1990 1991 1992 1993 1994

Growth relative to 1989 baseline

Lydon (1995)

Raytheon's Cost Reduction

[Line chart: cost reduction relative to the 1989 baseline (scale 0–1.2), 1988–1994.]

Lydon (1995).

Raytheon's Cost Predictability

[Quarterly line chart, 1988–1994: actual cost vs. budgeted cost (scale +50% over-run to −10% under-run), with deviations shrinking toward budget over time.]

Haley (1995).

OMRON's Reuse Gains

[Chart, 1987–1994: program size in steps (0–3,000) against % of reuse (0–100%).]

Sakamoto et al. (1996).

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product? CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in the CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits:

[Pyramid of candidate projects: huge benefits, large benefits, good benefits, okay benefits, small benefits… now what?]

Ultimately we run out of projects with significant enough benefits to continue the program… so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded…testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." - Jackson, D. (2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic…non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability…The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." - Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality Measurements: Transferability, Changeability, Robustness, Performance, Security.

Detected Violations (examples):
• Expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: a single application spanning JSP, ASP.NET, APIs, Java, web services, Struts, Spring, Hibernate, .NET, COBOL, EJB, PL/SQL, T-SQL, and messaging over Oracle, SQL Server, DB2, Sybase, and IMS databases, with architectural compliance, data flow, and transaction risk cutting across the layers.]

Level 1 – Unit: single language/technology layer. Code style & layout, expression complexity, code documentation, class or program design, basic coding standards. (Developer level.)

Level 2 – Technology: intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities. (Development team level.)

Level 3 – System: integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function point and effort estimation, data access control, SDK versioning, calibration across technologies. (IT organization level.)

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct.

[Pie charts: architecturally complex defects are 8% of total application defects (code unit-level violations: 92%), yet account for 52% of total repair effort (unit-level violations: 48%).]

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 342 in Component A is only narrowly propagated in the system – it presents low risk. The impact of violation 284 in Component B is very widely propagated throughout the system – it presents very high risk.

Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
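An index of this kind can be sketched as severity weighted by how far a violation's impact reaches back through the dependency graph. This is a simplified reading of the slide, not CAST's actual algorithm; names and the weighting are illustrative:

```python
from collections import defaultdict

def propagated_risk_index(violations, dependencies):
    """Rank violations by severity x breadth of impact.

    violations:   {violation_id: (component, severity 1-9)}
    dependencies: iterable of (caller, callee) component pairs; a violation in
                  `callee` can affect every component that transitively calls it.
    """
    # Reverse dependency graph: callee -> set of direct callers.
    callers = defaultdict(set)
    for caller, callee in dependencies:
        callers[callee].add(caller)

    def impacted(component):
        # Transitive closure over callers = breadth of the violation's impact.
        seen, stack = set(), [component]
        while stack:
            for c in callers[stack.pop()]:
                if c not in seen:
                    seen.add(c)
                    stack.append(c)
        return len(seen)

    scores = {vid: sev * (1 + impacted(comp))
              for vid, (comp, sev) in violations.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Same severity, very different propagation (mirroring the slide's example):
deps = [("UI1", "B"), ("UI2", "B"), ("Batch", "B"), ("UI1", "A")]
ranked = propagated_risk_index({284: ("B", 7), 342: ("A", 7)}, deps)
# Violation 284 ranks first: its component is reachable from far more of the system.
```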

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path – from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.

Transactions can be ranked by risk to prioritize their remediation, and the Transaction Risk Index can be further tuned using the operational frequency of each transaction.

[Diagram: five transaction paths through the layers; transaction 1 poses the greatest risk.]
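A Transaction Risk Index along these lines can be sketched by summing violation severities over each transaction's path, optionally weighted by operational frequency. A simplified reading of the slide; names and data are illustrative:

```python
def transaction_risk_index(transactions, violations, frequency=None):
    """Rank transactions by summed violation severity along their paths.

    transactions: {txn_id: [component, ...]}   path through the layers
    violations:   {component: [severity, ...]} detected violation severities
    frequency:    optional {txn_id: executions} to weight by operational use
    """
    def risk(txn_id, path):
        base = sum(sev for comp in path for sev in violations.get(comp, []))
        return base * (frequency or {}).get(txn_id, 1)

    return sorted(transactions, key=lambda t: risk(t, transactions[t]), reverse=True)

# Two hypothetical transactions; the billing path crosses a component
# carrying two high-severity violations, so it ranks first.
paths = {"pay_invoice": ["UI", "Billing", "DB"], "view_profile": ["UI", "Profile"]}
viols = {"Billing": [9, 7], "UI": [3], "Profile": [4]}
ranking = transaction_risk_index(paths, viols)
```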

Risk-Based Prioritization

Violations are ranked for remediation using health factor, severity (1–9), Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
   148      Security             9
   371      Robustness           9
   062      Performance          8
   284      Robustness           7
   016      Changeability        7
   241      Security             5
   173      Transferability      4
   359      Transferability      3

Static Analysis in US DoD

Testing finds <1/3 of structural defects.

Static analysis before testing is 85% efficient.

With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013).

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter plot: production incidents (0–180) against Technical Quality Index (1.00–4.00) for business-critical applications at a large international investment bank. Low-scoring applications cluster near +$2,000,000 annual loss, mid-range near $50,000, and high-scoring near $5,000.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by each factor (2017 CRASH Report; request from info@castsoftware.com):

Factor                  # apps   Robust   Security   Perform   Change   Transfer
Technology (e.g. Java-EE) 1,606    20%       3%        16%       26%       7%
Industry                   677      5%       5%         7%        4%       6%
App type                   510      1%       1%         1%        1%       1%
Source                     580      1%       1%         4%
Shore                      540
Maturity                    72     28%      25%        15%       24%      12%
Method                     208     10%       6%        14%        6%
Team size                  186      7%       5%         8%        8%
# of users                 262      4%       3%         5%        4%

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Structural quality problems in production code create:
• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business risk (liability from debt):
• Liability: business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012) IEEE Software.
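Curtis et al. (2012) size the principal from violation counts, the share expected to be fixed, fix effort, and labor cost. A simplified sketch of that calculation; the tier values below are illustrative, not the paper's calibrated figures:

```python
def technical_debt_principal(tiers, cost_per_hour=75):
    """Estimate technical-debt principal in dollars.

    tiers: list of (violation_count, fraction_to_fix, hours_per_fix), one per
    severity tier: high-severity violations are mostly fixed, low mostly not.
    """
    return sum(count * frac * hours * cost_per_hour
               for count, frac, hours in tiers)

# Illustrative tiers for one application:
tiers = [
    (120, 0.50, 2.0),   # high severity
    (400, 0.25, 1.0),   # medium severity
    (900, 0.10, 0.5),   # low severity
]
principal = technical_debt_principal(tiers)   # dollars of debt principal
```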

Tech Debt by Quality Factor

[Pie chart: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%, Performance ~5%.]

70% of technical debt is in IT cost (Transferability, Changeability); 30% is in business risk (Robustness, Performance, Security).

Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012) IEEE Software.

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J., & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M., & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J., & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D., & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J., & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B., & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K., & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

• Best Practices for Software Process and Product Measurement
• Instructor: Dr. Bill Curtis
• Agenda
• Section 1
• Why Does Measurement Languish?
• Measurement Process Guidance
• Measurement Categories
• Productivity Analysis Objectives
• Productivity Analysis Measures
• Software Productivity
• Software Size Measures
• How Many Lines of Code?
• Function Point Estimation
• Effort – Weakest Measure
• How Quality Affects Productivity
• Carry-forward Rework
• Example of Quality Impact
• Quality-Adjusted Productivity
• Best Practices for Analysis
• 1. Evaluate Demographics
• 2. Segment Baselines
• 3. Beware Sampling Effects
• 4. Understand Variation
• 5. Investigate Distributions
• 6. Account for Maturity Effects
• 7. Caution for External Data Sets
• Slide Number 27
• Measuring Project Progress
• Requirements Change Impact
• Phase Defect Removal Model
• SCRUM Board
• Burndown Chart and Velocity
• Analyzing Velocity
• Measuring and Managing Flow
• Trends Over Sprints
• Sources and Measures
• Recommended Books
• Personal Software Process (PSP)
• Team Software Process (TSP)
• Aggregated Personal Data Best
• Defect Detection at Intuit
• TSP Benefits
• Section 3
• CMMI Maturity Transitions
• Enhanced Maturity Framework
• Maturity Level Transformation
• Measurement by Level
• Average Improvement Results
• Schedule Performance
• Cost Performance
• Crosby's Cost of Quality Model
• Raytheon's Cost of Quality
• Raytheon's Productivity
• Raytheon's Cost Reduction
• Raytheon's Cost Predictability
• OMRON's Reuse Gains
• Section 4
• What about the Product?
• Six Sigma's Challenge
• Testing is Not Enough
• CAST's Application Intelligence Platform
• Modern Apps are a Technology Stack
• Architecturally Complex Defects
• Propagated Risk
• Transaction Risk
• Risk-Based Prioritization
• Static Analysis in US DoD
• Reducing Operational Incidents & Costs
• Reducing Operational Losses
• Slide Number 70
• The Technical Debt Metaphor
• Tech Debt by Quality Factor
• Join CISQ: www.it-cisq.org
• Section
• References 1
• References 2
• References 3


[Embedded chart data for the Requirements Change Impact slide: monthly figures for requirements added, changed, and deleted, with person-hours of work or rework, February through August.]

Phase Defect Removal Model

PHASE         Escapes in   Injected   Subtotal   Removal    Removed   Escapes out
              (/KSLOC)     (/KSLOC)   (/KSLOC)   effect.    (/KSLOC)  (/KSLOC)
Rqmts            0           1.2        1.2        0%         0         1.2
Design           1.2        18.0       19.2       76%        14.6       4.6
Code & UT        4.6        15.4       20.0       71%        14.2       5.8
Integration      5.8         0          5.8       67%         3.9       1.9
Sys Test         1.9         0          1.9       58%         1.1       0.8
Operation        0.8

Kan (1995), Metrics and Models in Software Quality Engineering.
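The model chains simple arithmetic, with each phase's escaped defects feeding the next phase's subtotal. A sketch that reproduces the table's numbers to rounding:

```python
def phase_defect_model(phases):
    """phases: list of (name, injected_per_ksloc, removal_effectiveness).
    Returns rows of (name, escapes_in, injected, subtotal, removed, escapes_out)."""
    rows, escapes = [], 0.0
    for name, injected, effectiveness in phases:
        subtotal = escapes + injected          # carried-over escapes + new injections
        removed = subtotal * effectiveness     # defects caught in this phase
        escapes = subtotal - removed           # defects leaking to the next phase
        rows.append((name, subtotal - injected, injected, subtotal, removed, escapes))
    return rows

rows = phase_defect_model([
    ("Rqmts",        1.2, 0.00),
    ("Design",      18.0, 0.76),
    ("Code & UT",   15.4, 0.71),
    ("Integration",  0.0, 0.67),
    ("Sys Test",     0.0, 0.58),
])
residual = rows[-1][-1]   # defects/KSLOC escaping into operation (~0.8)
```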

SCRUM Board

• The Scrum board displays the path to completion for each story by showing task status.
• Story points are an estimate of days for each story.
• Agile estimating is by feature (story) rather than by task.
• 'Done' does not always mean all tasks are finished.
• Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

• The burndown chart displays story completion by day, indicating actual versus estimated progress.
• Velocity is the rate at which stories are completed; it indicates sustainable progress.
• Velocity results provide one source of historical data for improving estimates.
• Story points without the context of productivity factors are dangerous for estimating.
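Velocity-based forecasting is just division over the historical average. A minimal sketch (the sprint figures are illustrative):

```python
import math

def velocity(points_per_sprint):
    """Velocity = average story points completed per sprint."""
    return sum(points_per_sprint) / len(points_per_sprint)

def sprints_remaining(backlog_points, history):
    """Forecast sprints left from historical velocity, rounded up."""
    return math.ceil(backlog_points / velocity(history))

# Three completed sprints of 28, 34, and 31 points; 120 points remain.
forecast = sprints_remaining(120, [28, 34, 31])  # -> 4 sprints
```

This is exactly the kind of estimate the slide warns about: it is only as good as the stability of the productivity factors behind the history.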

Analyzing Velocity

[Four panels over a 20-day window: team velocity (story points per hour), hours per story point, item cycle times, and a burndown chart tracking stories, hours, and hours per story point by day.]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in-progress, and released work over time; the horizontal gap between bands gives days to delivery, with outliers visible.]

Anderson (2010) Kanban.
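Flow metrics fall out of the daily cumulative counts behind such a diagram. A minimal sketch; the WIP = throughput × lead-time step is Little's Law, which the slide does not state explicitly:

```python
def flow_metrics(started, released):
    """started, released: daily cumulative counts of items started and released.
    Returns (wip_by_day, avg_throughput, approx_lead_time_days)."""
    wip = [s - r for s, r in zip(started, released)]   # vertical band width
    days = len(released) - 1
    throughput = (released[-1] - released[0]) / days   # items released per day
    lead_time = (sum(wip) / len(wip)) / throughput     # Little's Law: W = L / lambda
    return wip, throughput, lead_time

# Ten days of cumulative counts from a hypothetical board:
started  = [4, 6, 9, 11, 14, 16, 19, 21, 24, 26]
released = [0, 2, 4,  6,  8, 10, 12, 14, 16, 18]
wip, tput, lead = flow_metrics(started, released)
```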

Trends Over Sprints

Track and correlate multiple measures across sprints; feed predictive models.

[Paired charts over 12 sprints: story points delivered (0–80) alongside technical debt (0–4).]

Sources and Measures

Measures: stories submitted; stories pulled back; growth in size of stories; stories added mid-sprint; stories under review; failed tests; non-development tasks; assists to other projects.

Data sources: project tracking; code control; bug tracking; build system; test environment; deployment tools; operational monitoring.

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP0 – current process, time recording, defect recording, defect type standards
• PSP0.1 – coding standard, size measurement, process improvement proposal

Personal Planning
• PSP1 – size estimating, test report
• PSP1.1 – task planning, schedule planning

Personal Quality Management
• PSP2 – code reviews, design reviews
• PSP2.1 – design templates

Cyclic Personal Process
• PSP3 – cyclic development

Humphrey (1997) Introduction to the Personal Software Process.

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000) Introduction to the Team Software Process.

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates (R²s range from .70 to .87), rather than developing a single team estimate averaged over individuals (R² = .53).

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

[Defect detection by activity: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%.]

Fagan (2005).

TSP Benefits

Dramatic reductions in:
• Delivered defects
• Days per KLOC
• Schedule deviation
• Test defects/KLOC
• Variation in results

Humphrey (2006) TSP: Leading a Development Team.

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing: innovation management
Level 4 Quantitatively Managed: capability management
Level 3 Defined: process management
Level 2 Managed: project management
Level 1 Initial: inconsistent management

Chrissis, Konrad & Shrum (2011) CMMI.

Enhanced Maturity Framework

Level 5 Innovating: innovation management
Level 4 Optimized: capability management
Level 3 Standardized: organizational management
Level 2 Stabilized: work unit management
Level 1 Initial: inconsistent management

Maturity Level Transformation

Level 5: Proactive, opportunistic improvements – empowered, agile culture
Level 4: Process and results managed statistically – precision culture
Level 3: Organization develops standard processes – common culture
Level 2: Project managers stabilize projects – local culture
Level 1: Ad hoc, inconsistent development practices – relationship culture

[Diagram: transformations driven at the individual, project, and organization levels, built on trust.]

Measurement by Level

Level 5 – Exploratory data: Δt = N × ROI
Level 4 – Predictive data: X̄ ± 1.96σ/√n
Level 3 – Process data: 8.3 defects per review
Level 2 – Management data: Effort = α·LOC^β
Level 1 – Haphazard data: "about a million lines"

A measure–maturity mismatch slows improvement and creates dysfunction.
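The Level 4 entry is the familiar 95% confidence interval for a mean. A quick sketch using the normal approximation; the sample data are illustrative (chosen so the mean lands near the Level 3 figure of 8.3 defects per review):

```python
import math

def confidence_interval_95(samples):
    """Mean ± 1.96·s/√n: the slide's Level 4 'predictive data' formula,
    using the normal approximation with the sample standard deviation."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)   # sample variance
    half = 1.96 * math.sqrt(var) / math.sqrt(n)
    return mean - half, mean + half

# Defects found in each of 25 hypothetical inspections (mean 8.32):
yields = [7, 9, 8, 10, 8, 7, 9, 8, 8, 9, 7, 10, 8, 9, 8,
          7, 8, 9, 10, 8, 7, 9, 8, 8, 9]
low, high = confidence_interval_95(yields)
```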

Average Improvement Results

Improvement benefit         # Orgs   Annual median   Annual range
Productivity growth            4          35%           9–67%
Pre-test defect detection      3          22%           6–25%
Time to market                 2         ↓19%          15–23%
Field error reports            5         ↓39%          10–94%
Return on investment           5         5.0:1       4.1:1–8.8:1


Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2

Requirements Change Impact (monthly counts; chart y-axis: person-hours of work or rework)

Month  Added  Changed  Deleted
Feb    2      20       1
Mar    2      11       1
Apr    45     35       3
May    15     61       1
Jun    1      20       4
Jul    15     15       0
Aug    18     05       2

Phase Defect Removal Model

Columns: escapes from previous phase, defects injected, subtotal, removal effectiveness, defects removed, escapes at phase exit (all defect figures per KSLOC).

PHASE        Escapes  Injected  Subtotal  Removal eff.  Removed  Escapes out
Rqmts        0        1.2       1.2       0%            0        1.2
Design       1.2      18.0      19.2      76%           14.6     4.6
Code & UT    4.6      15.4      20.0      71%           14.2     5.8
Integration  5.8      0         5.8       67%           3.9      1.9
Sys Test     1.9      0         1.9       58%           1.1      0.8
Operation    0.8

Kan (1995), Metrics and Models in Software Quality Engineering
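The table's arithmetic can be replayed in a few lines. This is a sketch of the standard phase-containment calculation, not code from the talk: each phase adds newly injected defects to the escapes it inherits, removes a fraction equal to its removal effectiveness, and passes the remainder on.

```python
def run_phase_model(phases):
    """Propagate defects/KSLOC through development phases.

    phases: list of (name, injected_per_ksloc, removal_effectiveness).
    Returns {name: (subtotal, removed, escapes)} rounded to one decimal,
    matching the table above.
    """
    escapes = 0.0
    results = {}
    for name, injected, effectiveness in phases:
        subtotal = escapes + injected            # escapes in + newly injected
        removed = round(subtotal * effectiveness, 1)
        escapes = round(subtotal - removed, 1)   # passed on to the next phase
        results[name] = (round(subtotal, 1), removed, escapes)
    return results

model = run_phase_model([
    ("Rqmts", 1.2, 0.0),
    ("Design", 18.0, 0.76),
    ("Code & UT", 15.4, 0.71),
    ("Integration", 0.0, 0.67),
    ("Sys Test", 0.0, 0.58),
])
```

Running the model reproduces every row of the table, which is a useful check that the reconstructed decimals are self-consistent.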

SCRUM Board

The Scrum board displays the path to completion for each story by showing task status.

Story points are an estimate of days for each story; agile estimating is by feature (story) rather than by task.

'Done' does not always mean all tasks are finished.

Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

The burndown chart displays story completion by day and indicates actual versus estimated progress.

Velocity is the rate at which stories are completed; it indicates sustainable progress, and velocity results provide one source of historical data for improving estimates.

Story points without the context of productivity factors are dangerous for estimating.
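Velocity and the naive forecast built on it are each a one-liner. A sketch with hypothetical sprint history (subject to the slide's caveat that points without productivity context mislead):

```python
def velocity(completed_points, window=3):
    """Average story points completed per sprint over a trailing window."""
    recent = completed_points[-window:]
    return sum(recent) / len(recent)

def sprints_remaining(backlog_points, completed_points):
    """Naive forecast: remaining backlog divided by trailing velocity."""
    return backlog_points / velocity(completed_points)

history = [18, 22, 20, 24, 22]          # hypothetical points completed per sprint
v = velocity(history)                   # (20 + 24 + 22) / 3 = 22.0
left = sprints_remaining(66, history)   # 66 / 22.0 = 3.0 sprints
```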

Analyzing Velocity

[Charts over a 20-day period: burndown of stories and hours; team velocity in story points per hour; hours per story point; item cycle times]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in-progress, and released items over time; days to delivery appear as the horizontal gap between bands; outliers flagged]

Anderson (2010), Kanban
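The cumulative flow diagram's bands obey Little's Law: average work in progress equals throughput times average time to delivery. A minimal sketch with hypothetical numbers:

```python
def avg_cycle_time(avg_wip, throughput_per_day):
    """Little's Law rearranged: average days to delivery = WIP / throughput."""
    return avg_wip / throughput_per_day

# Hypothetical flow: 24 items in progress on average, 3 items released per day.
days_to_delivery = avg_cycle_time(avg_wip=24, throughput_per_day=3)  # 8.0 days
```

This is why Kanban's WIP limits shorten delivery times: with throughput held constant, cutting the in-progress band directly cuts the horizontal gap in the diagram.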

Trends Over Sprints

[Charts: story points delivered per sprint and technical debt, each tracked across 12 sprints]

Track and correlate multiple measures across sprints; use them to build predictive models.

Sources and Measures

Measures: stories submitted; stories pulled back; growth in size of stories; stories added mid-sprint; stories under review; failed tests; non-development tasks; assists to other projects.

Data sources: project tracking; code control; bug tracking; build system; test environment; deployment tools; operational monitoring.

Recommended Books

Personal Software Process (PSP)

PSP 0 (Baseline Personal Process): current process; time recording; defect recording; defect type standards.
PSP 0.1: coding standard; size measurement; process improvement proposal.
PSP 1 (Personal Planning): size estimating; test report.
PSP 1.1: task planning; schedule planning.
PSP 2 (Personal Quality Management): code reviews; design reviews.
PSP 2.1: design templates.
PSP 3 (Cyclic Personal Process): cyclic development.

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

Single team-level estimates: R² = .53. Aggregated individual estimates: R²s range from .70 to .87.

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.
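The R² values quoted above can be reproduced for any set of estimates with the standard coefficient of determination; the task data below is hypothetical, illustrating the calculation rather than the study's data:

```python
def r_squared(actual, predicted):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean = sum(actual) / len(actual)
    ss_tot = sum((a - mean) ** 2 for a in actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    return 1 - ss_res / ss_tot

# Hypothetical actual vs. estimated hours for five tasks.
actual = [10.0, 14.0, 8.0, 20.0, 12.0]
predicted = [11.0, 13.0, 9.0, 18.0, 13.0]
fit = r_squared(actual, predicted)   # close to 1.0 = accurate estimates
```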

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defects found by activity: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%.

Fagan (2005)

TSP Benefits

Dramatic reductions in:
- Delivered defects
- Days per KLOC
- Schedule deviation
- Test defects/KLOC
- Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing: innovation management
Level 4 Quantitatively Managed: capability management
Level 3 Defined: process management
Level 2 Managed: project management
Level 1 Initial: inconsistent management

Chrissis, Konrad & Shrum (2011), CMMI

Enhanced Maturity Framework

Level 5 Innovating: innovation management
Level 4 Optimized: capability management
Level 3 Standardized: organizational management
Level 2 Stabilized: work unit management
Level 1 Initial: inconsistent management

Maturity Level Transformation

Level 1 Initial: ad hoc, inconsistent development practices (relationship culture, built on trust)
Level 2: project managers stabilize projects (local culture)
Level 3: organization develops standard processes (common culture)
Level 4: process and results managed statistically (precision culture)
Level 5: proactive improvements (agile culture); opportunistic improvements (empowered culture)

[Chart: transformation progresses across individual, project, and organization levels]

Measurement by Level

Level 5 - Exploratory data: Δt = N × ROI
Level 4 - Predictive data: X̄ ± 1.96σ/√n
Level 3 - Process data: e.g., 8.3 defects per review
Level 2 - Management data: Effort = α·LOC^β
Level 1 - Haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
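The Level 2 and Level 4 formulas can be written down directly; α, β, and the sample figures below are illustrative values, not calibrations from the talk:

```python
from math import sqrt

def confidence_interval_95(mean, sigma, n):
    """Level 4-style predictive interval: mean ± 1.96·sigma/sqrt(n)."""
    half_width = 1.96 * sigma / sqrt(n)
    return (mean - half_width, mean + half_width)

def effort_estimate(loc, alpha, beta):
    """Level 2-style power-law estimate: Effort = alpha · LOC^beta."""
    return alpha * loc ** beta

# 16 reviews averaging 8.3 defects with sigma 2.0 -> 95% interval 8.3 ± 0.98.
lo, hi = confidence_interval_95(mean=8.3, sigma=2.0, n=16)

# Hypothetical parameters for a 10 KLOC component (units: person-months).
effort = effort_estimate(loc=10_000, alpha=0.005, beta=1.1)
```

Note the diseconomy of scale: any β > 1 makes effort grow faster than size, which is why calibrating β from local history matters more than the choice of α.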

Average Improvement Results

Improvement                # Orgs  Annual median  Annual range
Productivity growth        4       35%            9-67%
Pre-test defect detection  3       22%            6-25%
Time to market             2       down 19%       15-23%
Field error reports        5       down 39%       10-94%
Return on investment       5       5.0:1          4:1 - 8.8:1

Schedule Performance

[Chart: schedule performance index by CMM maturity level (1-3); index = budgeted cost of work performed / budgeted cost of work scheduled; above 1.0 = ahead, below 1.0 = behind]

Air Force study of contractors; Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index by CMM maturity level (1-3); index = budgeted cost of work performed / actual cost of work performed; above 1.0 = underrun, below 1.0 = overrun]

Air Force study of contractors; Flowe & Thordahl (1994)
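Both indices are standard earned-value ratios and trivial to compute; the project figures below are hypothetical:

```python
def spi(bcwp, bcws):
    """Schedule Performance Index: earned value / planned value (>1 = ahead)."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost Performance Index: earned value / actual cost (>1 = underrun)."""
    return bcwp / acwp

# Hypothetical project: earned $90k of a planned $100k while spending $120k.
schedule_index = spi(90_000, 100_000)  # 0.9  -> behind schedule
cost_index = cpi(90_000, 120_000)      # 0.75 -> over budget
```

An index of 1.0 means exactly on plan; the hypothetical figures reproduce a project that is both behind schedule and over budget, the pattern the Air Force study found concentrated at lower maturity levels.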

Crosby's Cost of Quality Model

Project cost = performance cost + cost of quality; cost of quality = conformance (prevention + appraisal) + nonconformance.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Crosby (1979), Quality Is Free; Dion (1993)

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Percent of project cost by year and maturity level:

Year  Level  Perform  Nonconf  Appraise  Prevent
1988  1      34       41       15        7
1990  2      55       18       15        12
1992  3      66       11
1994  4      76       6        7.5

Dion (1993), Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988-1994]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988-1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual versus budgeted cost by quarter, 1988-1994; overruns of up to 40-50% early in the period converge toward budget]

Haley (1995)

OMRON's Reuse Gains

[Chart: 1987-1994; code size in steps (0-3,000) and percent reuse (0-100%) both rising]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus: process improvement (Six Sigma). Product focus: product improvement (Design for Six Sigma).

A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, then large, good, okay, small benefits... now what? Ultimately we run out of projects with benefits significant enough to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." - Jackson (2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." - Spinellis (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality measurements by health factor, with examples of detected violations:

- Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table.
- Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed.
- Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string.
- Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention.
- Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity.

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.NET, ASP.NET, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: a transaction's data flow crosses the stack: user interface (Java, JSP, ASP.NET, APIs, web services), business logic (EJB, Spring, Struts, Hibernate, .NET, COBOL, messaging), and data storage (Oracle, SQL Server, DB2, Sybase, IMS, PL/SQL, T-SQL); architectural compliance and transaction risk span the layers]

Quality must be assessed at three levels:

Level 1, Unit (developer level): code style and layout; expression complexity; code documentation; class or program design; basic coding standards.

Level 2, Technology (development team level; single language/technology layer): intra-technology architecture; intra-layer dependencies; inter-program invocation; security vulnerabilities.

Level 3, System (IT organization level): integration quality; architectural compliance; risk propagation; application security; resiliency checks; transaction integrity; function point and effort estimation; data access control; SDK versioning; calibration across technologies.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

They are the primary cause of operational problems and take 20x as many fixes to correct.

Architecturally complex defects are 8% of total application defects (code unit-level violations are the other 92%), yet they account for 52% of total repair effort (versus 48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated: it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
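The slides define the index only qualitatively. One plausible sketch (the function names, formula, and dependency data are illustrative, not CAST's published calculation) measures breadth of impact as the set of components that transitively depend on the flawed one:

```python
from collections import deque

def impacted_set(callers, component):
    """Components transitively affected by a violation in `component`,
    found by walking reverse dependencies (who calls/depends on whom)."""
    seen, queue = {component}, deque([component])
    while queue:
        for parent in callers.get(queue.popleft(), ()):
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return seen - {component}

def propagated_risk_index(severity, callers, component):
    """Hypothetical PRI: severity weighted by breadth of impact."""
    return severity * len(impacted_set(callers, component))

# Hypothetical reverse-dependency map: component -> components that use it.
callers = {"B": ["UI1", "UI2", "Batch"], "UI1": ["Portal"], "A": ["UI2"]}
pri_b = propagated_risk_index(7, callers, "B")  # widely propagated
pri_a = propagated_risk_index(9, callers, "A")  # narrowly propagated
```

Note how B's violation outranks A's despite a lower severity, which is the slide's point: breadth of propagation can dominate raw severity.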

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation; in the example, transaction 1 of the five poses the greatest risk. The index can be further tuned using the operational frequency of each transaction.
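A sketch of the ranking (the formula and the violation data are illustrative; the slide defines the index only as a function of count, severity, and optionally frequency):

```python
def transaction_risk_index(violations_on_path, frequency=1.0):
    """Hypothetical TRI: sum of violation severities along the transaction
    path, optionally weighted by how often the transaction runs."""
    return frequency * sum(v["severity"] for v in violations_on_path)

# Hypothetical transactions and the violations found along their paths.
checkout = [{"id": 148, "severity": 9}, {"id": 62, "severity": 8}]
lookup = [{"id": 359, "severity": 3}]

ranked = sorted(
    [("checkout", transaction_risk_index(checkout, frequency=2.0)),
     ("lookup", transaction_risk_index(lookup))],
    key=lambda pair: pair[1], reverse=True)
```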

Risk-Based Prioritization

Violations ranked for repair; the original chart also flagged each violation's propagated risk index, presence in a high-risk transaction, and frequency of path execution.

Violation  Health factor    Severity
148        Security         9
371        Robustness       9
062        Performance      8
284        Robustness       7
016        Changeability    7
241        Security         5
173        Transferability  4
359        Transferability  3

Static Analysis in US DoD

John Keane (2013):
- Testing finds less than 1/3 of structural defects.
- Static analysis before testing is 85% efficient.
- With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter chart: production incidents versus Technical Quality Index (100-400) for business-critical applications at a large international investment bank; annotated annual losses fall from over $2,000,000 at low quality to $50,000 and then $5,000 at higher quality]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors; technology (e.g., Java-EE versus others) is by far the strongest. From the 2017 CRASH Report (request from info@castsoftware.com).

Factor      # apps  Robust  Security  Perform  Change  Transfer
Technology  1606    20      3         16       26      7
Industry    677     5       5         7        4       6
App type    510     1       1         1        1       1
Source      580     1       1         4
Shore       540
Maturity    72      28      25        15       24      12
Method      208     10      6         14       6
Team size   186     7       5         8        8
# of users  262     4       3         5        4

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code constitute the debt.

- Principal: the cost of fixing problems remaining in the code after release that must be remediated.
- Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
- Business risk (liability from the debt): business costs related to outages, breaches, corrupted data, etc.
- Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
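Curtis et al. (2012) size the principal from violation counts, fix effort, and labor cost. The sketch below follows that shape, but the counts, hours, and $75/hour rate are illustrative assumptions, not the paper's calibration:

```python
def technical_debt_principal(violation_counts, fix_hours, rate_per_hour):
    """Principal = sum over severity bands of
    (violations to fix) x (hours to fix one) x (labor rate).
    All parameters are illustrative, not a published calibration."""
    return sum(violation_counts[s] * fix_hours[s] * rate_per_hour
               for s in violation_counts)

counts = {"high": 120, "medium": 540, "low": 2300}  # hypothetical application
hours = {"high": 2.0, "medium": 1.0, "low": 0.5}    # assumed fix effort per violation
principal = technical_debt_principal(counts, hours, rate_per_hour=75)
```

In practice the severity bands also carry a policy decision: what fraction of low-severity violations the organization actually intends to remediate.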

Tech Debt by Quality Factor

Distribution of technical debt: transferability 40%, changeability 30%, robustness 18%, security 7%.

70% of technical debt is in IT cost (transferability, changeability); 30% is in business risk (robustness, performance, security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software
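The 70/30 split is simple arithmetic over the health-factor shares; performance's share is left implicit on the slide and is recovered here as the remainder:

```python
# Health-factor shares of technical debt (%), from the slide.
shares = {"Transferability": 40, "Changeability": 30,
          "Robustness": 18, "Security": 7}
shares["Performance"] = 100 - sum(shares.values())  # remainder: 5%

# The slide's grouping: IT cost vs. business risk.
it_cost = shares["Transferability"] + shares["Changeability"]
business_risk = (shares["Robustness"] + shares["Performance"]
                 + shares["Security"])
```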

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI: Benefits from Continuous Software Improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development Using Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort – Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse gains
  • Section 4
  • What about the Product
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 2


Phase Defect Removal Model

Phase         Escapes from       Defects injected   Subtotal   Removal         Defects removed   Escapes at phase
              previous /KSLOC    /KSLOC             /KSLOC     effectiveness   /KSLOC            exit /KSLOC
Rqmts         0                  1.2                1.2        0%              0                 1.2
Design        1.2                18.0               19.2       76%             14.6              4.6
Code & UT     4.6                15.4               20.0       71%             14.2              5.8
Integration   5.8                0                  5.8        67%             3.9               1.9
Sys Test      1.9                0                  1.9        58%             1.1               0.8
Operation     0.8

Kan (1995), Metrics and Models in Software Quality Engineering
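The arithmetic behind the table can be sketched in a few lines: each phase receives the escapes from the prior phase, adds newly injected defects, and removes a fraction given by its removal effectiveness. The injection rates and effectiveness values below are the slide's values (per KSLOC):

```python
# Phase defect-removal model: escapes cascade from phase to phase.
phases = [
    # (name, defects injected /KSLOC, removal effectiveness)
    ("Rqmts",       1.2,  0.00),
    ("Design",     18.0,  0.76),
    ("Code & UT",  15.4,  0.71),
    ("Integration", 0.0,  0.67),
    ("Sys Test",    0.0,  0.58),
]

escapes = 0.0
for name, injected, effectiveness in phases:
    subtotal = escapes + injected          # escapes in + newly injected
    removed = subtotal * effectiveness     # caught in this phase
    escapes = subtotal - removed           # leaks to the next phase
    print(f"{name:12s} subtotal={subtotal:5.1f} removed={removed:5.1f} escapes={escapes:4.1f}")

print(f"Escapes into operation: {escapes:.1f} defects/KSLOC")
```

Running it reproduces the table's bottom line of roughly 0.8 defects/KSLOC escaping into operation.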

SCRUM Board

• Scrumboard displays the path to completion for each story by showing task status
• Story points are an estimate of days for each story
• Agile estimating is by feature (story) rather than by task
• 'Done' does not always mean all tasks are finished
• Kanban methods restrict the work in progress to a limited number of paths

Burndown Chart and Velocity

• Burndown chart displays story completion by day
• Burndown chart indicates actual versus estimated progress
• Velocity is the rate at which stories are completed
• Velocity indicates sustainable progress
• Velocity results provide one source of historical data for improving estimates
• Story points without the context of productivity factors are dangerous for estimating
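Velocity-based forecasting, as described above, amounts to dividing the remaining backlog by historical throughput. A minimal sketch; the sprint figures and backlog size are illustrative only:

```python
import math

# Story points completed in the last five sprints (made-up history).
sprint_points = [21, 25, 19, 24, 23]

velocity = sum(sprint_points) / len(sprint_points)   # mean points per sprint
backlog = 140                                        # points remaining

sprints_remaining = math.ceil(backlog / velocity)
print(f"velocity={velocity} -> about {sprints_remaining} sprints remaining")
```

This is exactly the kind of estimate that goes wrong without productivity context: if team composition or story-point calibration changes, the historical velocity no longer predicts the future.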

Analyzing Velocity

[Charts: Team Velocity as story points per hour across 20 days; Item Cycle Times as stories, hours, and hours per story point across 20 days; a Burndown Chart across days.]

Measuring and Managing Flow

[Cumulative Flow Diagram: stories backlogged, in progress, and released over time; days to delivery shown with outliers marked.]

Anderson (2010), Kanban
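The flow quantities a cumulative flow diagram visualizes are tied together by Little's Law: average cycle time equals work in progress divided by throughput. A minimal sketch with made-up numbers:

```python
# Little's Law: cycle time = WIP / throughput (long-run averages).
wip = 12.0          # average stories in progress (illustrative)
throughput = 3.0    # stories completed per day (illustrative)

cycle_time_days = wip / throughput
print(f"average days to delivery: {cycle_time_days}")
```

This is why Kanban's WIP limits shorten delivery times: with throughput held steady, lowering WIP lowers cycle time proportionally.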

Trends Over Sprints

[Charts: story points delivered per sprint and technical debt per sprint, across 12 sprints.]

Track and correlate multiple measures across sprints

Predictive models

Sources and Measures

Measures:
• Stories submitted
• Stories pulled back
• Growth in size of stories
• Stories added mid-sprint
• Stories under review
• Failed tests
• Non-development tasks
• Assists to other projects

Data sources:
• Project tracking
• Code control
• Bug tracking
• Build system
• Test environment
• Deployment tools
• Operational monitoring

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: current process, time recording, defect recording, defect type standards
• PSP 0.1: coding standard, size measurement, process improvement proposal

Personal Planning
• PSP 1: size estimating, test report
• PSP 1.1: task planning, schedule planning

Personal Quality Management
• PSP 2: code reviews, design reviews
• PSP 2.1: design templates

Cyclic Personal Process
• PSP 3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

Combined individual estimates: R²s range from .70 to .87
Single team estimate: R² = .53

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defects found by activity: personal reviews 33, compile 33, team reviews 19, unit test 15, test 14

Fagan (2005)

TSP Benefits

Dramatic reductions in:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing – Innovation management
Level 4 Quantitatively Managed – Capability management
Level 3 Defined – Process management
Level 2 Managed – Project management
Level 1 Initial – Inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5 Innovating – Innovation management
Level 4 Optimized – Capability management
Level 3 Standardized – Organizational management
Level 2 Stabilized – Work unit management
Level 1 Initial – Inconsistent management

Maturity Level Transformation

Level 5: Proactive, opportunistic improvements – Empowered, agile culture
Level 4: Process and results managed statistically – Precision culture
Level 3: Organization develops standard processes – Common culture
Level 2: Project managers stabilize projects – Local culture
Level 1: Ad hoc, inconsistent development practices – Relationship culture

[Diagram spans the individual, project, and organization levels, built on trust.]

Measurement by Level

Level 5 – Exploratory data: Δt = N × ROI
Level 4 – Predictive data: X̄ ± 1.96σ/√n
Level 3 – Process data: 8.3 defects per review
Level 2 – Management data: Effort = α × LOC^β
Level 1 – Haphazard data: "about a million lines"

A measure–maturity mismatch slows improvement and creates dysfunction.
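The Level 2 "management data" model Effort = α × LOC^β is typically fitted from project history by linear regression in log space, since log Effort = log α + β log LOC is linear. A sketch with illustrative data, not figures from the talk:

```python
import math

# (size in KLOC, effort in staff-months) -- illustrative history only
history = [(10, 24), (20, 55), (40, 120), (80, 270)]

xs = [math.log(loc) for loc, _ in history]
ys = [math.log(eff) for _, eff in history]
n = len(history)
xbar, ybar = sum(xs) / n, sum(ys) / n

# Ordinary least squares on the log-log points.
beta = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
       sum((x - xbar) ** 2 for x in xs)
alpha = math.exp(ybar - beta * xbar)

def estimate(loc):
    """Predicted effort (staff-months) for a project of `loc` KLOC."""
    return alpha * loc ** beta

print(f"Effort = {alpha:.2f} * LOC^{beta:.2f}; 30 KLOC -> {estimate(30):.0f} staff-months")
```

A β above 1, as in this toy fit, encodes the diseconomy of scale that cost models like COCOMO assume: doubling size more than doubles effort.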

Average Improvement Results

Improvement benefit         Orgs   Annual median   Annual range
Productivity growth           4        35%           9% – 67%
Pre-test defect detection     3        22%           6% – 25%
Time to market                2       ↓19%          15% – 23%
Field error reports           5       ↓39%          10% – 94%
Return on investment          5       5.0:1         4:1 – 8.8:1

Schedule Performance

Schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled (below 1 behind schedule, above 1 ahead)

[Chart: schedule performance index at CMM maturity levels 1–3.]

Air Force study of contractors: Flowe & Thordahl (1994)

Cost Performance

Cost performance index = budgeted cost of work performed / actual cost of work performed (below 1 overrun, above 1 underrun)

[Chart: cost performance index at CMM maturity levels 1–3.]

Air Force study of contractors: Flowe & Thordahl (1994)
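Both indices are simple ratios of earned-value quantities. A sketch with illustrative figures:

```python
# Earned-value indices behind the two charts above.
# SPI = BCWP / BCWS  (>1 ahead of schedule, <1 behind)
# CPI = BCWP / ACWP  (>1 under budget,      <1 over budget)
bcws = 500_000   # budgeted cost of work scheduled (planned value)
bcwp = 450_000   # budgeted cost of work performed (earned value)
acwp = 600_000   # actual cost of work performed (illustrative figures)

spi = bcwp / bcws
cpi = bcwp / acwp
print(f"SPI = {spi:.2f} ({'behind' if spi < 1 else 'ahead of'} schedule)")
print(f"CPI = {cpi:.2f} ({'over' if cpi < 1 else 'under'} budget)")
```

In the Air Force data, both indices moved toward 1.0 (or above) as contractor maturity level increased.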

Crosby's Cost of Quality Model

Project cost = performance + cost of quality (conformance + nonconformance)

• Performance: generating plans and documentation; development (requirements, design, code, integration)
• Conformance / Appraisal: reviews (system, requirements, design, test plan, test scripts), walkthroughs, first-time testing, first-time independent verification and validation, audits
• Conformance / Prevention: training, policies, procedures and methods, tools, planning, quality improvement projects, data gathering and analysis, root cause analysis, quality reporting
• Nonconformance: fixing defects, reworking any document, updating source code, re-reviews, re-tests, lab costs for re-tests, patches (internal and external), engineering changes, change control boards, external failures and fines, customer support, help desks

Dion (1993); Crosby (1979), Quality Is Free

Raytheon's Cost of Quality

• Performance – cost of building it right the first time
• Nonconformance – cost of rework
• Appraisal – cost of testing
• Prevention – cost of preventing nonconformance

Year   Level   Perform.   Nonconf.   Appraisal   Prevention
1988     1       34%        41%        15%          7%
1990     2       55%        18%        15%         12%
1992     3       66%        11%        23% (appraisal + prevention combined)
1994     4       76%         6%        18% (combined, of which 7.5% appraisal)

Dion (1993), Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988–1994.]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988–1994.]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: percent over-run/under-run of actual against budgeted cost, by quarter, 1988–1994.]

Haley (1995)

OMRON's Reuse Gains

[Chart: steps (0–3,000) and percentage of reuse (0–100%), 1987–1994.]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." – Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality
2. Compliance focus undermines CMMI's primary assumption
3. Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI
4. The assumptions underlying control charts are violated by software data

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first huge benefits, then large, good, okay, and finally small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded…testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." – Jackson, D. (2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic…non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability…The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." – Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality measurements: Transferability, Changeability, Robustness, Performance, Security

Detected violations (examples):
• Performance: expensive operation in loop, static vs. pooled connections, complex query on big table, large indices on big table
• Robustness: empty CATCH block, uncontrolled data access, poor memory management, opened resource not closed
• Security: SQL injection, cross-site scripting, buffer overflow, uncontrolled format string
• Transferability: unstructured code, misuse of inheritance, lack of comments, violated naming convention
• Changeability: highly coupled component, duplicated code, index modified in loop, high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages
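A check like "empty CATCH block" can be illustrated with a toy scan. Real analyzers parse the code rather than pattern-match, so this is only a sketch of the idea:

```python
import re

# Toy structural-quality rule: flag empty catch blocks in Java-like source.
EMPTY_CATCH = re.compile(r"catch\s*\([^)]*\)\s*\{\s*\}")

source = """
try { save(order); }
catch (IOException e) { }
"""

violations = EMPTY_CATCH.findall(source)
print(f"empty-catch violations found: {len(violations)}")
```

A production rule engine would work on a syntax tree and cross-reference the exception types, callers, and layers involved; the point here is only that each rule reduces to a detectable code pattern.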

Modern Apps are a Technology Stack

[Diagram: data flow and transaction risk traced through a stack of technologies – JSP, ASP.NET, APIs, Java, Web Services, frameworks (Hibernate, Spring, Struts, .NET), EJB, PL/SQL, T-SQL, COBOL, messaging, and databases (Oracle, SQL Server, DB2, Sybase, IMS) – checked for architectural compliance.]

Level 1 – Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards

Level 2 – Technology (development team level): single language/technology layer, intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities

Level 3 – System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function point and effort estimation, data access control, SDK versioning, calibration across technologies

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Take 20x as many fixes to correct
• Architecturally complex defects are 8% of total application defects but 52% of total repair effort; code unit-level violations are the remaining 92% of defects and 48% of repair effort
• Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system – it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated in the system – it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
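A propagated-risk ranking of this kind can be sketched as severity times the breadth of transitive impact. The component graph and numbers below are illustrative, not CAST's actual algorithm:

```python
from collections import deque

# Dependency graph: component -> components that depend on it (illustrative).
impacts = {
    "A": ["U1"],                      # violation 342 lives here
    "B": ["U1", "U2", "U3"],          # violation 284 lives here
    "U1": [], "U2": ["U4", "U5"], "U3": [], "U4": [], "U5": [],
}

def breadth(component):
    """Count all components transitively impacted by `component` (BFS)."""
    seen, queue = set(), deque([component])
    while queue:
        for nxt in impacts[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return len(seen)

violations = [("342", "A", 7), ("284", "B", 7)]  # (id, component, severity)
pri = {vid: sev * breadth(comp) for vid, comp, sev in violations}
ranked = sorted(pri, key=pri.get, reverse=True)
print(ranked)
```

Even though both violations have the same severity here, violation 284 ranks first because its component's impact reaches far more of the system.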

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

• Customers care most about the dependability of their transactions
• The risk of a transaction is determined by the number and severity of violations along its path – from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer
• Transactions can be ranked by risk to prioritize their remediation
• The Transaction Risk Index can be further tuned using the operational frequency of each transaction

[Diagram: five transaction paths through the layers; transaction 1 poses the greatest risk.]
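A transaction-risk ranking tuned by operational frequency, as the slide suggests, can be sketched as follows; the transaction names, severities, and frequencies are made up:

```python
# name: (violation severities along the path, executions per day)
transactions = {
    "T1": ([9, 9, 8, 7], 5_000),
    "T2": ([7, 5], 20_000),
    "T3": ([4, 3], 1_000),
}

def tri(severities, frequency):
    """Toy Transaction Risk Index: summed severity weighted by frequency."""
    return sum(severities) * frequency

ranked = sorted(transactions,
                key=lambda t: tri(*transactions[t]), reverse=True)
print(ranked)
```

Note how frequency tuning changes priorities: T2 carries fewer and milder violations than T1, but its far higher operational volume pushes it to the top of this toy ranking.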

Risk-Based Prioritization

Violations ranked by health factor, severity, and Propagated Risk Index; callouts flag whether a violation lies in a high-risk transaction and the frequency of path execution.

Violation   Health factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects
• Static analysis before testing is 85% efficient
• With testing and static analysis combined, test schedules are reduced by 50%

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter chart: production incidents against Technical Quality Index (100–400) for business-critical applications at a large international investment bank; annotated annual losses fall from over $2,000,000 at low quality, to $50,000, to $5,000 at high quality.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors:

Factor        Apps    Robust.   Security   Perform.   Change.   Transfer.
Technology    1606      20%        3%        16%        26%        7%
  (Java-EE)
Industry       677       5%        5%         7%         4%        6%
App type       510       1%        1%         1%         1%        1%
Source         580       1%        1%         4%         –         –
Shore          540       –         –          –          –         –
Maturity        72      28%       25%        15%        24%       12%
Method         208      10%        6%        14%         6%        –
Team size      186       7%        5%         8%         8%        –
# of users     262       4%        3%         5%         4%        –

2017 CRASH Report – request from info@castsoftware.com

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release – a component of the cost of ownership. Structural quality problems in production code constitute the debt.

• Principal: the cost of fixing problems remaining in the code after release that must be remediated
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business risk – liability from the debt:
• Liability: business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
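A principal estimate in the spirit of Curtis et al. (2012) multiplies violation counts by the fraction expected to be fixed, the effort per fix, and a labor rate. Every parameter value below is an illustrative assumption, not a figure from the talk:

```python
# Technical-debt principal: sum over severities of
# (violations * fraction to fix * hours per fix * hourly rate).
violations = {"high": 120, "medium": 540, "low": 1800}     # counts (assumed)
fix_fraction = {"high": 0.50, "medium": 0.25, "low": 0.10} # policy (assumed)
hours_per_fix = 1.0    # average remediation effort (assumed)
rate = 75.0            # $ per hour (assumed)

principal = sum(violations[s] * fix_fraction[s] * hours_per_fix * rate
                for s in violations)
print(f"Estimated technical-debt principal: ${principal:,.0f}")
```

The fix fractions encode the usual policy that high-severity violations must mostly be remediated while most low-severity ones can be tolerated; changing them is how an organization tunes the estimate to its own risk appetite.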

Tech Debt by Quality Factor

[Chart: technical debt by health factor – Transferability 40%, Changeability 30%, Robustness 18%, Security 7%.]

• 70% of technical debt is in IT cost (transferability, changeability)
• 30% of technical debt is in business risk (robustness, performance, security)
• Health factor proportions are mostly consistent across technologies

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

Section: References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 2

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models – Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2

PHASE

Escapes Previous Phase

KSLOC

Defect Injection

KSLOC

Subtotal

KSLOC

Removal

Effective-ness

Defect Removal

KSLOC

Escapes at phase exit KSLOC

Rqmts

0

12

12

0

0

12

Design

12

180

192

76

146

46

Code amp UT

46

154

200

71

142

58

Inte-gration

58

0

58

67

39

19

Sys Test

19

0

19

58

11

08

Opera-tion

08

Page 34: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Sheet3

Click to edit Master title style Phase Defect Removal Model

PHASE

Escapes Previous Phase KSLOC

Defect Injection KSLOC

Subtotal KSLOC

Removal Effective-ness

Defect Removal KSLOC

Escapes at phase exit KSLOC

Rqmts 0 12 12 0 0 12

Design 12 180 192 76 146 46

Code amp UT 46 154 200 71 142 58

Inte-gration 58 0 58 67 39 19

Sys Test 19 0 19 58 11 08

Opera-tion 08

Kan (1999) Metrics and Model in Software Quality Engineering

Click to edit Master title style SCRUM Board Scrumboard displays

the path to completion for each story by showing task status

Story points are an estimate of days for each story

Agile estimating is by feature (story) rather than by task

lsquoDonersquo does not always mean all tasks are finished

Kanban methods restrict the work in progress to a limited number of paths

Click to edit Master title style Burndown Chart and Velocity

Burndown chart displays story completion by day

Burndown chart indicates actual versus estimated progress

Velocity is the rate at which stories are completed

Velocity indicates sustainable progress

Velocity results provide one source of historical data for improving estimates

Story points without the context of productivity factors are dangerous for estimating

Click to edit Master title style Team Velocity

0

1

2

3

4

5

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20Days

Story Points per Hour

0

1

2

3

4

5

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

HrsStryPt

Analyzing Velocity

Item Cycle Times

02468

101214161820

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Stories Hours HrsStryPt

0

10

20

30

40

50

60

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Burndown Chart

Days

Click to edit Master title style Measuring and Managing Flow

Released

In progress

Backlogged

Cumulative Flow Diagram Days to delivery

outliers

Anderson (2010) Kanban

Click to edit Master title style

0

10

20

30

40

50

60

70

80

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Sto

ry p

oin

ts d

eliv

ered

Trends Over Sprints

0

10

20

30

40

50

60

70

80

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Sto

ry p

oin

ts d

eliv

ered

0

05

1

15

2

25

3

35

4

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Tech

nic

al d

ebt

Track and correlate multiple measures

across sprints

Predictive models

Click to edit Master title style Sources and Measures

Stories submitted

Stories pulled back

Growth in size of stories

Stories added mid-sprint

Stories under review

Failed tests

Non-development tasks

Assists to other projects

Data sources

Measures Project tracking

Code control

Bug tracking

Build system

Test environment

Deployment tools

Operational monitoring

Click to edit Master title style Recommended Books

Click to edit Master title style Personal Software Process (PSP)

PSP 0 Current process Time recording

Defect recording Defect type standards

PSP 01 Coding standard

Size measurement Process improvement

proposal

Baseline Personal Process

Humphrey (1997) Introduction to the Personal Software Process

PSP 1 Size estimating

Test report

Personal Planning

PSP 2 Code reviews

Design reviews

Personal Quality Management

PSP 21 Cyclic development

Cyclic Personal Process

PSP 11 Task planning

Schedule planning

PSP 21 Design templates

Click to edit Master title style Team Software Process (TSP) bull Built from personal processes of team members

bull Well defined team roles

bull Project launch workshops for planning

bull Team measurement and tracking

bull Post-mortems for improvement

bull Team application of lean principles

Humphrey (2000) Introduction to the Team Software Process

Click to edit Master title style Aggregated Personal Data Best

R2s range from 70 to 87

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates rather than developing a single team estimate averaged over individuals

R2 = 53

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defects found, by activity:
• Personal reviews: 33
• Compile: 33
• Team reviews: 19
• Unit test: 15
• Test: 14

Fagan (2005)

TSP Benefits

Dramatic reductions in:
• Delivered defects
• Days per KLOC
• Schedule deviation
• Test defects/KLOC
• Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

• Level 5, Optimizing: innovation management
• Level 4, Quantitatively Managed: capability management
• Level 3, Defined: process management
• Level 2, Managed: project management
• Level 1, Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

• Level 5, Innovating: innovation management
• Level 4, Optimized: capability management
• Level 3, Standardized: organizational management
• Level 2, Stabilized: work unit management
• Level 1, Initial: inconsistent management

Maturity Level Transformation

• Level 5: Proactive improvements (Agile culture); opportunistic improvements (Empowered culture)
• Level 4: Process and results managed statistically (Precision culture)
• Level 3: Organization develops standard processes (Common culture)
• Level 2: Project managers stabilize projects (Local culture)
• Level 1: Ad hoc, inconsistent development practices (Relationship culture)

[Figure labels: Individual, Projects, Organization; Trust]

Measurement by Level

• Level 5: Exploratory data, e.g. ∆t = N x ROI
• Level 4: Predictive data, e.g. X̄ ± (1.96σ / √n)
• Level 3: Process data, e.g. 8.3 defects per review
• Level 2: Management data, e.g. Effort = α(LOC)^β
• Level 1: Haphazard data, "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
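For instance, the Level 4 entry X̄ ± (1.96σ / √n) is the familiar 95% interval for a process mean. A minimal sketch, assuming a normal approximation and a known process σ (the sample data are hypothetical):

```python
import math

def prediction_interval(samples, sigma):
    """95% interval for the process mean: X-bar +/- 1.96 * sigma / sqrt(n)."""
    n = len(samples)
    mean = sum(samples) / n
    half_width = 1.96 * sigma / math.sqrt(n)
    return mean - half_width, mean + half_width

# Defects found in 16 reviews, with an assumed process sigma of 2.0
lo, hi = prediction_interval([8, 9, 7, 10, 8, 9, 8, 7, 9, 8, 10, 7, 8, 9, 8, 9], 2.0)
```

At Level 4, intervals like this bound what the process is expected to deliver; observations falling outside them signal an assignable cause worth investigating.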

Average Improvement Results

Improvement benefit          Orgs   Annual median   Annual range
Productivity growth            4        35%            9-67%
Pre-test defect detection      3        22%            6-25%
Time to market                 2       ↓19%           15-23%
Field error reports            5       ↓39%           10-94%
Return on investment           5        5.0:1      4.1:1 - 8.8:1

Schedule Performance

[Chart: schedule performance index (0 to 1.6) by CMM maturity level 1-3]

Schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled (below 1 = behind, above 1 = ahead)

Air Force study of contractors: Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (0 to 2.5) by CMM maturity level 1-3]

Cost performance index = budgeted cost of work performed / actual cost of work performed (below 1 = overrun, above 1 = underrun)

Air Force study of contractors: Flowe & Thordahl (1994)
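Both indices are standard earned-value ratios; a minimal sketch with illustrative figures:

```python
def spi(bcwp, bcws):
    """Schedule performance index: earned value / planned value (>1 = ahead)."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost performance index: earned value / actual cost (>1 = under budget)."""
    return bcwp / acwp

# Hypothetical project: planned $100k of work to date, earned $90k, spent $120k
print(spi(90_000, 100_000))  # 0.9  -> behind schedule
print(cpi(90_000, 120_000))  # 0.75 -> over budget
```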

Crosby's Cost of Quality Model

Project cost = performance + cost of quality.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Cost of quality, conformance:
• Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits
• Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting

Cost of quality, nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   Level   Perform   Nonconf   Appraise   Prevent
1988     1       34%       41%       15%        7%
1990     2       55%       18%       15%       12%
1992     3       66%       11%       23% (combined)
1994     4       76%        6%       18% (combined; appraisal 7.5%)

Dion (1993), Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline (scale 0 to 4x), 1988-1994]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: unit cost reduction relative to the 1989 baseline (scale 0 to 1.2), 1988-1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual vs. budgeted cost deviation by quarter, 1988-1994, from +50% over-run narrowing toward -10% under-run]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0 to 3,000) and % of reuse (0 to 100%), 1987-1994]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

• Process focus: process improvement (Six Sigma)
• Product focus: product improvement (Design for 6σ)

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits, and successive projects deliver huge, then large, good, okay, and finally small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program... so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." - Jackson, D. (2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." - Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality measurements: Transferability, Changeability, Robustness, Performance, Security.

Detected violations (examples by health factor):
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, plus a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: application layers (JSP, ASP.NET, APIs; Java, .NET, COBOL, Web Services; frameworks such as Hibernate, Spring, Struts; databases such as Oracle, SQL Server, DB2, Sybase, IMS, with PL/SQL, T-SQL, EJB; messaging), with architectural compliance, data flow, and transaction risk cutting across layers]

• Level 1, Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards; a single language/technology layer.
• Level 2, Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.
• Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function points, effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Take 20x as many fixes to correct

Architecturally complex defects are only 8% of total app defects (code unit-level violations are the other 92%), yet they consume 52% of total repair effort (versus 48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated in the system: it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
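The index combines a violation's severity with how broadly its impact spreads. As a rough illustration of the idea only (a hypothetical reverse-dependency graph and weighting, not CAST's actual algorithm):

```python
from collections import deque

def propagated_risk(severity, component, callers):
    """Severity times the number of components transitively impacted
    (reachable via reverse dependencies): a rough propagated-risk score."""
    seen, queue = {component}, deque([component])
    while queue:
        for caller in callers.get(queue.popleft(), []):
            if caller not in seen:
                seen.add(caller)
                queue.append(caller)
    return severity * (len(seen) - 1)  # exclude the component itself

# Hypothetical reverse-dependency map: which components call which
callers = {"B": ["ui", "api", "batch"], "api": ["ui"], "A": ["batch"]}
print(propagated_risk(7, "B", callers))  # widely propagated
print(propagated_risk(9, "A", callers))  # narrowly propagated
```

Note how a lower-severity violation with broad reach can outrank a higher-severity one with narrow reach, which is the point of the slide.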

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

• Customers care most about the dependability of their transactions.
• The risk of a transaction is determined by the number and severity of violations along its path: entry/exit in the user interface layer, processing in the business logic layer, and fulfillment in the data storage layer.
• Transactions can be ranked by risk to prioritize their remediation. [Diagram: transactions ranked 1-5, with transaction 1 posing the greatest risk.]
• The Transaction Risk Index can be further tuned using the operational frequency of each transaction.
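The ranking rule above (risk grows with the number and severity of violations on the path, tunable by operational frequency) can be sketched as follows; all transaction names, severities, and frequencies here are made up:

```python
def transaction_risk(path_violations, frequency=1.0):
    """Sum of violation severities along a transaction's path,
    weighted by how often the transaction is executed."""
    return frequency * sum(path_violations)

transactions = {
    "checkout": ([9, 7, 5], 0.6),  # (severities along path, relative frequency)
    "login":    ([4, 3],    1.0),
    "report":   ([8],       0.1),
}
ranked = sorted(transactions,
                key=lambda t: transaction_risk(*transactions[t]),
                reverse=True)
print(ranked)  # riskiest transaction first
```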

Risk-Based Prioritization

Violations ordered by Propagated Risk Index, flagged when they lie in a high-risk transaction, and weighted by frequency of path execution:

Violation   Health factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers in a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0-180) vs. Technical Quality Index (100-400); low-TQI applications show over $2,000,000 in annual losses, mid-range roughly $50,000, and high-TQI roughly $5,000]

Large international investment bank, business-critical applications.

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors:

Factor        # apps   Robust   Security   Perform   Change   Transfer
Technology     1606      20        3         16        26        7
  (Java-EE)
Industry        677       5        5          7         4        6
App type        510       1        1          1         1        1
Source          580       1        1          4
Shore           540
Maturity         72      28       25         15        24       12
Method          208      10        6         14         6
Team size       186       7        5          8         8
# of users      262       4        3          5         4

2017 CRASH Report: request from info@castsoftware.com

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Structural quality problems in production code create:
• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e. higher maintenance costs, greater resource usage, etc.

And the resulting business risk:
• Liability: business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software

Tech Debt by Quality Factor

[Chart: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%]

• 70% of technical debt is in IT cost (Transferability, Changeability).
• 30% of technical debt is in business risk (Robustness, Performance, Security).
• Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

Section: References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI: Benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section: References
  • References 1
  • References 2
  • References 3

Phase Defect Removal Model

PHASE         Escapes previous   Defect injection   Subtotal   Removal         Defect removal   Escapes at phase
              phase /KSLOC       /KSLOC             /KSLOC     effectiveness   /KSLOC           exit /KSLOC
Rqmts               0                 1.2              1.2          0%              0                1.2
Design              1.2              18.0             19.2         76%            14.6               4.6
Code & UT           4.6              15.4             20.0         71%            14.2               5.8
Integration         5.8               0                5.8         67%             3.9               1.9
Sys Test            1.9               0                1.9         58%             1.1               0.8
Operation           0.8

Kan (1995), Metrics and Models in Software Quality Engineering
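The table's per-phase arithmetic (escapes in + injected = subtotal; removed = subtotal x removal effectiveness; the remainder escapes to the next phase) can be sketched as:

```python
def defect_flow(phases):
    """Propagate defects/KSLOC through phases.
    Each phase is (name, injected_per_ksloc, removal_effectiveness)."""
    escapes, rows = 0.0, []
    for name, injected, eff in phases:
        subtotal = escapes + injected
        removed = subtotal * eff
        escapes = subtotal - removed
        rows.append((name, round(subtotal, 1), round(removed, 1), round(escapes, 1)))
    return rows

rows = defect_flow([
    ("Rqmts", 1.2, 0.0),
    ("Design", 18.0, 0.76),
    ("Code & UT", 15.4, 0.71),
    ("Integration", 0.0, 0.67),
    ("Sys Test", 0.0, 0.58),
])
# rows reproduces the table; the last escape (0.8/KSLOC) reaches operation
```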

SCRUM Board

• The scrum board displays the path to completion for each story by showing task status.
• Story points are an estimate of days for each story.
• Agile estimating is by feature (story) rather than by task.
• 'Done' does not always mean all tasks are finished.
• Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

• The burndown chart displays story completion by day, indicating actual versus estimated progress.
• Velocity is the rate at which stories are completed; it indicates sustainable progress.
• Velocity results provide one source of historical data for improving estimates.
• Story points without the context of productivity factors are dangerous for estimating.
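A minimal sketch of using velocity history for forecasting (the sprint numbers are hypothetical, and the caveat above applies: raw points ignore productivity factors):

```python
def velocity(completed_points, sprints):
    """Average story points completed per sprint."""
    return sum(completed_points) / sprints

history = [42, 38, 45, 40]       # points completed in the last four sprints
v = velocity(history, len(history))
remaining = 165                  # points left in the backlog
sprints_to_go = remaining / v    # naive forecast of sprints remaining
```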

Team Velocity / Analyzing Velocity

[Charts over a 20-day sprint: story points per hour by day; hours per story point by day; item cycle times; and a burndown chart tracking stories, hours, and hours per story point]

Measuring and Managing Flow

[Cumulative flow diagram: backlogged, in-progress, and released work over time; the horizontal gap between bands shows days to delivery, with outliers visible]

Anderson (2010), Kanban
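Flow measures like those on a cumulative flow diagram relate through Little's Law: average work in progress = throughput x average time to delivery. A tiny sketch with made-up figures:

```python
def avg_days_to_delivery(wip, throughput_per_day):
    """Little's Law rearranged: average days to delivery = WIP / throughput."""
    return wip / throughput_per_day

# Hypothetical board: 24 items in progress, 3 items released per day
print(avg_days_to_delivery(24, 3))  # 8.0 days on average
```

This is why Kanban's WIP limits shorten delivery times: holding throughput steady, less work in progress means a smaller gap between the bands.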

Trends Over Sprints

[Charts: story points delivered per sprint (sprints 1-12, scale 0-80) and technical debt per sprint (scale 0-4)]

• Track and correlate multiple measures across sprints.
• Build predictive models.


Click to edit Master title style Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on the its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort – Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3

PHASE         Escapes previous   Defects injected   Subtotal   Removal         Defects removed   Escapes at phase
              phase (/KSLOC)     (/KSLOC)           (/KSLOC)   effectiveness   (/KSLOC)          exit (/KSLOC)

Rqmts               0                 1.2               1.2          0%              0                 1.2
Design              1.2              18.0              19.2         76%            14.6                4.6
Code & UT           4.6              15.4              20.0         71%            14.2                5.8
Integration         5.8               0                 5.8         67%             3.9                1.9
Sys Test            1.9               0                 1.9         58%             1.1                0.8
Operation           0.8
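The phase defect removal model above can be recomputed mechanically: each phase's subtotal is the escapes from the previous phase plus newly injected defects, and the removal effectiveness determines what escapes to the next phase. A minimal sketch (the rates come from the table; the function name is illustrative):

```python
# Recompute the phase defect removal model: escapes at phase exit =
# (escapes_in + injected) * (1 - removal_effectiveness). All rates /KSLOC.

phases = [
    # (phase, defects injected /KSLOC, removal effectiveness)
    ("Rqmts",       1.2, 0.00),
    ("Design",     18.0, 0.76),
    ("Code & UT",  15.4, 0.71),
    ("Integration", 0.0, 0.67),
    ("Sys Test",    0.0, 0.58),
]

def run_model(phases):
    rows, escapes = [], 0.0
    for name, injected, eff in phases:
        subtotal = escapes + injected          # escapes in + newly injected
        removed = subtotal * eff               # found and fixed in this phase
        escapes = subtotal - removed           # passed on to the next phase
        rows.append((name, round(subtotal, 1), round(removed, 1), round(escapes, 1)))
    return rows

for row in run_model(phases):
    print(row)
```

Running the model reproduces the table's columns, with roughly 0.8 defects/KSLOC escaping into operation.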


SCRUM Board

The Scrum board displays the path to completion for each story by showing task status.

Story points are an estimate of days for each story.

Agile estimating is by feature (story) rather than by task.

'Done' does not always mean all tasks are finished.

Kanban methods restrict the work in progress to a limited number of paths.

Burndown Chart and Velocity

The burndown chart displays story completion by day, indicating actual versus estimated progress.

Velocity is the rate at which stories are completed; it indicates sustainable progress, and velocity results provide one source of historical data for improving estimates.

Story points without the context of productivity factors are dangerous for estimating.
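The velocity and burndown arithmetic described above can be sketched directly; the completion data and the 30-point sprint total below are assumptions for illustration:

```python
# Velocity = story points completed per unit time; the burndown tracks
# points remaining day by day. Data here is invented for the sketch.

completed = [(1, 5), (2, 3), (3, 0), (4, 8), (5, 5)]  # (day, points finished)

def velocity(completed):
    days = max(d for d, _ in completed)
    points = sum(p for _, p in completed)
    return points / days                     # average points per day

def burndown(total_points, completed):
    remaining, chart = total_points, []
    for _, p in sorted(completed):           # walk the days in order
        remaining -= p
        chart.append(remaining)
    return chart

print(velocity(completed))
print(burndown(30, completed))
```

Historical velocity values like this, kept alongside the productivity factors in play each sprint, are what make the numbers safe to reuse in estimates.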

Analyzing Velocity

(Charts: team velocity as story points per hour and hours per story point over days 1–20; item cycle times for stories, hours, and hours per story point; burndown chart over days 1–20.)

Measuring and Managing Flow

Cumulative Flow Diagram: bands show backlogged, in-progress, and released work; the horizontal gap between bands is days to delivery, and outliers are visible at a glance.

Anderson (2010), Kanban
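One way to read a cumulative flow diagram numerically is to combine the width of the in-progress band with throughput. Applying Little's Law here is an assumption layered on the slide's content, and the daily counts are invented:

```python
# Approximate average days-to-delivery from cumulative flow counts.
# released is the cumulative count of delivered stories per day;
# in_progress is the WIP band height per day. Hypothetical data.

released    = [0, 2, 4, 7, 9, 12]
in_progress = [5, 6, 5, 4, 6, 5]

def avg_days_to_delivery(released, in_progress):
    days = len(released) - 1
    throughput = (released[-1] - released[0]) / days   # stories per day
    avg_wip = sum(in_progress) / len(in_progress)      # average WIP
    return avg_wip / throughput                        # Little's Law: W = L / X

print(round(avg_days_to_delivery(released, in_progress), 1))
```

Limiting WIP, as Kanban prescribes, shrinks the numerator and therefore the delivery time this estimate reports.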

Trends Over Sprints

(Charts: story points delivered per sprint and technical debt per sprint, sprints 1–12.)

Track and correlate multiple measures across sprints; they feed predictive models.

Sources and Measures

Measures: stories submitted; stories pulled back; growth in size of stories; stories added mid-sprint; stories under review; failed tests; non-development tasks; assists to other projects.

Data sources: project tracking; code control; bug tracking; build system; test environment; deployment tools; operational monitoring.

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: current process; time recording; defect recording; defect type standards
• PSP 0.1: coding standard; size measurement; process improvement proposal

Personal Planning
• PSP 1: size estimating; test report
• PSP 1.1: task planning; schedule planning

Personal Quality Management
• PSP 2: code reviews; design reviews
• PSP 2.1: design templates

Cyclic Personal Process
• PSP 3: cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals: individual R²s range from .70 to .87, versus R² = .53 for the single team estimate.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development; consequently, test changes from defect detection into correctness verification.

Defects found by activity: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%.

Fagan (2005)

TSP Benefits

Dramatic reductions in:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5: Optimizing – innovation management
Level 4: Quantitatively Managed – capability management
Level 3: Defined – process management
Level 2: Managed – project management
Level 1: Initial – inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5: Innovating – innovation management
Level 4: Optimized – capability management
Level 3: Standardized – organizational management
Level 2: Stabilized – work unit management
Level 1: Initial – inconsistent management

Maturity Level Transformation

Level 1: Ad hoc, inconsistent development practices – relationship culture
Level 2: Project managers stabilize projects – local culture
Level 3: Organization develops standard processes – common culture
Level 4: Process and results managed statistically – precision culture
Level 5: Proactive improvements – agile culture; opportunistic improvements – empowered culture

(Diagram spans organization, project, and individual levels, built on trust.)

Measurement by Level

Level 5 – exploratory data: Δt = N × ROI
Level 4 – predictive data: X̄ ± 1.96σ/√n
Level 3 – process data: 8.3 defects per review
Level 2 – management data: Effort = α·LOC^β
Level 1 – haphazard data: "about a million lines"

A measure–maturity mismatch slows improvement and creates dysfunction.
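The Level 2 and Level 4 example measures can be made concrete. In this sketch the effort-model coefficients are illustrative values in the COCOMO II range, and the sample data are invented:

```python
# Two of the per-level measures above, computed numerically.
import math

# Level 2 - management data: Effort = alpha * LOC**beta (COCOMO-style).
# alpha/beta are assumed nominal values, not calibrated to any organization.
def effort(kloc, alpha=2.94, beta=1.0997):
    return alpha * kloc ** beta              # person-months

# Level 4 - predictive data: 95% confidence interval, X-bar +/- 1.96*sigma/sqrt(n)
def ci95(sample):
    n = len(sample)
    mean = sum(sample) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
    half = 1.96 * sd / math.sqrt(n)
    return mean - half, mean + half

print(round(effort(10), 1))                  # effort for a 10 KSLOC job
print(ci95([8.3, 7.9, 9.1, 8.6, 8.1]))       # e.g. defects per review
```

The point of the slide survives in the code: a Level 2 organization can fit the power law to its own history, while the interval estimate only becomes meaningful once processes are stable enough for Level 4 prediction.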

Average Improvement Results

Improvement benefit         # Orgs   Annual median   Annual range
Productivity growth            4          35%           9–67%
Pre-test defect detection      3          22%           6–25%
Time to market                 2         ↓19%          15–23%
Field error reports            5         ↓39%          10–94%
Return on investment           5         501%          41–881%

Schedule Performance

(Chart: schedule performance index = budgeted cost of work performed ÷ budgeted cost of work scheduled, behind vs ahead, by CMM maturity levels 1–3.)

Air Force study of contractors: Flowe & Thordahl (1994)

Cost Performance

(Chart: cost performance index = budgeted cost of work performed ÷ actual cost of work performed, overrun vs underrun, by CMM maturity levels 1–3.)

Air Force study of contractors: Flowe & Thordahl (1994)

Crosby's Cost of Quality Model

Project cost = performance + cost of quality; cost of quality = conformance (prevention + appraisal) + nonconformance.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free
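A Crosby-style rollup is just bucketed accounting; this sketch uses the slide's four categories, with all cost figures and line items hypothetical:

```python
# Roll up project costs into Crosby's categories and report cost of quality
# as a percentage of total project cost. All numbers are invented.

costs = {
    "performance":    {"development": 520, "plans_docs": 80},
    "prevention":     {"training": 30, "tools": 25, "root_cause": 10},
    "appraisal":      {"reviews": 60, "first_time_testing": 90},
    "nonconformance": {"fixing_defects": 120, "re_tests": 45, "patches": 20},
}

def cost_of_quality(costs):
    totals = {cat: sum(items.values()) for cat, items in costs.items()}
    project = sum(totals.values())
    coq = project - totals["performance"]    # everything but building it right
    return totals, round(100 * coq / project, 1)

totals, coq_pct = cost_of_quality(costs)
print(totals, coq_pct)
```

Tracked period over period, the shift of spend out of nonconformance and into performance is exactly the trend Raytheon's data (next slide) shows.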

Raytheon's Cost of Quality

• Performance – cost of building it right the first time
• Nonconformance – cost of rework
• Appraisal – cost of testing
• Prevention – cost of preventing nonconformance

Year   Level   Perform   Nonconf   Appraise   Prevent
1988     1       34%       41%       15%        7%
1990     2       55%       18%       15%       12%
1992     3       66%       11%       23% combined
1994     4       76%        6%       18% combined (prevention 7.5%)

Dion (1993), Haley (1995)

Raytheon's Productivity

(Chart: productivity growth relative to the 1989 baseline, 1988–1994.)

Lydon (1995)

Raytheon's Cost Reduction

(Chart: cost reduction relative to the 1989 baseline, 1988–1994.)

Lydon (1995)

Raytheon's Cost Predictability

(Chart: actual cost vs budgeted cost – over-run and under-run – by quarter, 1988–1994.)

Haley (1995)

OMRON's Reuse Gains

(Chart: program size in steps and percentage of reuse, 1987–1994.)

Sakamoto et al (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." – Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.

2. A compliance focus undermines CMMI's primary assumption.

3. The product quality focus at Level 4 in the CMM was lost in quantitative process control in CMMI.

4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma. Product focus – product improvement – Design for Six Sigma. Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, then large benefits, good benefits, okay benefits, small benefits... now what?

Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded ... testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." – Jackson, D. (2009)

"... a failure to satisfy a non-functional requirement can be critical, even catastrophic ... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability ... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." – Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules; application meta-data.

Quality Measurements: Transferability, Changeability, Robustness, Performance, Security.

Detected violations (examples, by health factor):
• Performance: expensive operation in loop; static vs pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects; Universal Analyzer for other languages.

Modern Apps are a Technology Stack

(Diagram: transaction data flow through a multi-technology stack – JSP, ASP.NET, and APIs at the entry layer; Java, web services, .NET, COBOL, EJB, Hibernate, Spring, Struts, and messaging in the middle; Oracle PL/SQL, SQL Server and Sybase T-SQL, DB2, and IMS at the data layer – with architectural compliance and transaction risk assessed across it.)

Unit Level 1 (developer level): code style & layout; expression complexity; code documentation; class or program design; basic coding standards. Single language/technology layer.

Technology Level 2 (development team level): intra-technology architecture; intra-layer dependencies; inter-program invocation; security vulnerabilities.

System Level 3 (IT organization level): integration quality; architectural compliance; risk propagation; application security; resiliency checks; transaction integrity; function points; effort estimation; data access control; SDK versioning; calibration across technologies.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct. They are only 8% of total application defects (the other 92% are code unit-level violations), yet they account for 52% of total repair effort (versus 48% for everything else).

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation #284 in Component B is very widely propagated throughout the system – it presents very high risk. The impact of violation #342 in Component A is only narrowly propagated – it presents low risk. Remediating violation #284 in Component B will therefore have the greater impact on reducing risk and cost.
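A propagated-risk ranking can be sketched as severity times the number of components reached through the dependency graph. The graph, component names, and scoring function below are assumptions for illustration, not CAST's actual algorithm:

```python
# Rank violations by severity x breadth of impact, where breadth is
# reachability in a (hypothetical) component dependency graph.

calls = {            # component -> components that depend on it
    "A": ["U1"],
    "B": ["U1", "U2", "U3"],
    "U2": ["U4", "U5"],
}

def impacted(component, calls):
    seen, stack = set(), [component]
    while stack:                              # depth-first reachability
        for dep in calls.get(stack.pop(), []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

def propagated_risk_index(violations, calls):
    scored = [(vid, sev * len(impacted(comp, calls)), comp)
              for vid, comp, sev in violations]
    return sorted(scored, key=lambda t: -t[1])   # highest risk first

violations = [(342, "A", 7), (284, "B", 7)]  # (violation id, component, severity)
print(propagated_risk_index(violations, calls))
```

With equal severities, violation #284 outranks #342 purely because Component B's impact fans out to five components while Component A's reaches one, matching the slide's example.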

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path – from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation, and the Transaction Risk Index can be further tuned using the operational frequency of each transaction. (In the slide's example of five transactions, transaction 1 poses the greatest risk.)
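The ranking just described can be sketched by summing the severities of violations along each transaction's path, optionally weighted by operational frequency; the paths, severities, and frequencies here are invented:

```python
# Transaction Risk Index sketch: risk = sum of violation severities on the
# path, optionally tuned by how often the transaction runs. Data is invented.

violation_severity = {148: 9, 371: 9, 62: 8, 241: 5, 284: 7}

transactions = {      # transaction -> (violation ids on its path, daily runs)
    "T1": ([148, 371, 284], 500),
    "T2": ([241], 2000),
    "T3": ([62, 241], 100),
}

def transaction_risk_index(transactions, use_frequency=False):
    scores = {}
    for name, (path, freq) in transactions.items():
        risk = sum(violation_severity[v] for v in path)
        scores[name] = risk * freq if use_frequency else risk
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(transaction_risk_index(transactions))
print(transaction_risk_index(transactions, use_frequency=True))
```

Note how the frequency weighting can reorder the middle of the list: a single low-severity violation on a very hot path may matter more than several on a rarely-run one.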

Risk-Based Prioritization

Violations are ranked by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health factor     Severity
  148       Security             9
  371       Robustness           9
  062       Performance          8
  284       Robustness           7
  016       Changeability        7
  241       Security             5
  173       Transferability      4
  359       Transferability      3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

(Chart: production incidents vs Technical Quality Index for business-critical applications at a large international investment bank; annual losses drop from over $2,000,000 to $50,000 to $5,000 as the index rises from 100 toward 400.)

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com). Blank cells were not significant:

Factor        # apps   Robust   Security   Perform   Change   Transfer
Technology     1606      20%       3%        16%       26%       7%
  (Java-EE)
Industry        677       5%       5%         7%        4%       6%
App type        510       1%       1%         1%        1%       1%
Source          580       1%       1%         4%
Shore           540
Maturity         72      28%      25%        15%       24%      12%
Method          208      10%       6%        14%        6%
Team size       186       7%       5%         8%        8%
# of users      262       4%       3%         5%        4%

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release – a component of the cost of ownership. Structural quality problems in production code are the borrowed principal.

Principal – the cost of fixing problems remaining in the code after release that must be remediated.

Interest – continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business risk (liability from the debt):
• Liability – business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost – benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al (2012), IEEE Software
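In the spirit of Curtis et al. (2012), the principal can be estimated as (violations to fix) × (hours per fix) × (cost per hour). The severity mix, fix fractions, and rates below are assumptions for the sketch, not the paper's calibration:

```python
# Estimate technical-debt principal from static-analysis violation counts.
# All parameters here are illustrative assumptions.

violations = {"high": 120, "medium": 480, "low": 1400}       # counts by severity
fix_fraction = {"high": 0.50, "medium": 0.25, "low": 0.10}   # share actually fixed
hours_per_fix = 1.0                                          # effort per violation
rate_per_hour = 75.0                                         # fully loaded cost, $

def debt_principal(violations, fix_fraction, hours_per_fix, rate_per_hour):
    to_fix = sum(n * fix_fraction[sev] for sev, n in violations.items())
    return to_fix * hours_per_fix * rate_per_hour

print(debt_principal(violations, fix_fraction, hours_per_fix, rate_per_hour))
```

The fix fractions encode the judgment that not every detected violation is worth remediating, which is what keeps the principal an estimate of real future spend rather than a raw violation count.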

Tech Debt by Quality Factor

Transferability 40%, Changeability 30%, Robustness 18%, Security 7%.

70% of technical debt is in IT cost (Transferability, Changeability); 30% of technical debt is in business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies.

Curtis et al (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 37: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Burndown Chart and Velocity

• A burndown chart displays story completion by day, indicating actual versus estimated progress
• Velocity is the rate at which stories are completed; it indicates sustainable progress
• Velocity results provide one source of historical data for improving estimates
• Story points without the context of productivity factors are dangerous for estimating
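As a concrete sketch of how historical velocity feeds estimation (the sprint figures below are hypothetical illustration data, not numbers from the talk):

```python
# Sketch: using historical velocity to forecast how many sprints a backlog needs.
# All numbers are hypothetical.

def velocity(completed_points, days):
    """Rate at which story points are completed (points per day)."""
    return completed_points / days

def forecast_sprints(backlog_points, historical_velocities, sprint_days=10):
    """Estimate sprints needed from the mean of past velocities."""
    mean_v = sum(historical_velocities) / len(historical_velocities)
    points_per_sprint = mean_v * sprint_days
    full, rem = divmod(backlog_points, points_per_sprint)
    return int(full) + (1 if rem else 0)

past = [velocity(30, 10), velocity(36, 10), velocity(27, 10)]  # 3.0, 3.6, 2.7 pts/day
print(forecast_sprints(124, past))  # mean 3.1 pts/day -> 31 pts/sprint -> 4 sprints
```

Note that this is exactly where the slide's caution applies: the forecast is only as good as the stability of the productivity factors behind the historical velocities.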

Team Velocity

[Charts: story points per hour by day, and hours per story point (Hrs/StryPt) by day, over a 20-day iteration]

Analyzing Velocity

[Charts: item cycle times (stories, hours, hrs/story point) and a burndown chart over the same 20 days]

Measuring and Managing Flow

[Cumulative flow diagram: bands for backlogged, in-progress, and released items over time; the horizontal distance between bands shows days to delivery, with outliers visible]

Anderson (2010), Kanban
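The counts behind a cumulative flow diagram can be derived directly from item state-transition dates. A minimal sketch (the work items and day numbers are hypothetical):

```python
# Sketch: building cumulative-flow counts from item state-transition dates.
# Dates are day numbers within a release; None means the transition
# has not happened yet. All items are hypothetical.

items = [
    {"id": 1, "started": 1, "released": 5},
    {"id": 2, "started": 2, "released": 9},
    {"id": 3, "started": 4, "released": None},     # still in progress
    {"id": 4, "started": None, "released": None},  # still backlogged
]

def cfd_counts(items, day):
    """Counts of (backlogged, in-progress, released) items on a given day."""
    backlogged = sum(1 for i in items if i["started"] is None or i["started"] > day)
    released = sum(1 for i in items if i["released"] is not None and i["released"] <= day)
    in_progress = len(items) - backlogged - released
    return backlogged, in_progress, released

def days_to_delivery(item):
    """The horizontal distance between the 'started' and 'released' bands."""
    if item["started"] is None or item["released"] is None:
        return None
    return item["released"] - item["started"]

print(cfd_counts(items, day=6))        # (1, 2, 1)
print(days_to_delivery(items[0]))      # 4
```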

Trends Over Sprints

[Charts: story points delivered per sprint and technical debt per sprint, over 12 sprints]

• Track and correlate multiple measures across sprints
• Predictive models
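A first step toward the predictive models mentioned above is simply correlating two per-sprint series. A minimal sketch with hypothetical data:

```python
# Sketch: correlating two per-sprint measures as a precursor to a
# predictive model. The sprint data below is hypothetical.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

points_delivered = [20, 30, 35, 45, 50, 60]         # per sprint
technical_debt   = [1.0, 1.4, 1.6, 2.1, 2.4, 2.9]   # debt index per sprint

r = pearson_r(points_delivered, technical_debt)
print(round(r, 3))  # strongly positive: in this toy data, faster delivery tracks rising debt
```

A high correlation like this is what would prompt a team to manage the two measures jointly rather than celebrating velocity alone.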

Sources and Measures

Measures:
• Stories submitted
• Stories pulled back
• Growth in size of stories
• Stories added mid-sprint
• Stories under review
• Failed tests
• Non-development tasks
• Assists to other projects

Data sources:
• Project tracking
• Code control
• Bug tracking
• Build system
• Test environment
• Deployment tools
• Operational monitoring

Recommended Books

Personal Software Process (PSP)

• PSP 0 (Baseline Personal Process): current process, time recording, defect recording, defect type standards
• PSP 0.1: coding standard, size measurement, process improvement proposal
• PSP 1 (Personal Planning): size estimating, test report
• PSP 1.1: task planning, schedule planning
• PSP 2 (Personal Quality Management): code reviews, design reviews
• PSP 2.1: design templates
• PSP 3 (Cyclic Personal Process): cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

• Individual estimates, then combined: R² values range from .70 to .87
• Single averaged team estimate: R² = .53
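A toy simulation of why aggregation wins (this is illustrative only, not Humphrey's data; the task sizes and error magnitudes are assumptions):

```python
# Sketch: independent per-task estimation errors partially cancel when summed,
# while a single whole-project estimate carries one large error on the total.
# Task sizes and error ranges are assumed for illustration.
import random

random.seed(42)

def individual_then_combine(task_hours, task_error=0.20):
    """Each engineer estimates their own task with +/-20% noise; estimates are summed."""
    return sum(h * random.uniform(1 - task_error, 1 + task_error) for h in task_hours)

def single_team_estimate(task_hours, project_error=0.50):
    """One global estimate with +/-50% noise applied to the whole total."""
    return sum(task_hours) * random.uniform(1 - project_error, 1 + project_error)

tasks = [8, 13, 5, 21, 8, 13, 34, 5]   # hypothetical task sizes (hours)
actual = sum(tasks)                    # 107

trials = 2000
err_combined = sum(abs(individual_then_combine(tasks) - actual) for _ in range(trials)) / trials
err_single   = sum(abs(single_team_estimate(tasks) - actual) for _ in range(trials)) / trials
print(err_combined < err_single)  # True: aggregation yields the smaller average error
```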

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

[Chart values: personal reviews 33, compile 33, team reviews 19, unit test 15, test 14]

Fagan (2005)

TSP Benefits

Dramatic reductions in:
• Delivered defects
• Days per KLOC
• Schedule deviation
• Test defects/KLOC
• Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing - innovation management
Level 4 Quantitatively Managed - capability management
Level 3 Defined - process management
Level 2 Managed - project management
Level 1 Initial - inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5 Innovating - innovation management
Level 4 Optimized - capability management
Level 3 Standardized - organizational management
Level 2 Stabilized - work unit management
Level 1 Initial - inconsistent management

Maturity Level Transformation

(Practices mature from the individual, to projects, to the organization; the foundation is trust)

Level 5: Proactive, opportunistic improvements - Empowered, Agile culture
Level 4: Process and results managed statistically - Precision culture
Level 3: Organization develops standard processes - Common culture
Level 2: Project managers stabilize projects - Local culture
Level 1: Ad hoc, inconsistent development practices - Relationship culture

Measurement by Level

Level 5 - Exploratory data: Δt = N × ROI
Level 4 - Predictive data: X̄ ± 1.96σ/√n
Level 3 - Process data: 8.3 defects per review
Level 2 - Management data: Effort = α × LOC^β
Level 1 - Haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
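The Level 2 and Level 4 formulas can be made concrete. A sketch with made-up coefficients (the values of α, β, and the review statistics below are illustrative, not calibrated figures):

```python
# Sketch: the two formulas from the slide with illustrative parameters.
import math

def effort_person_months(kloc, a=3.0, b=1.12):
    """Level 2 management data: a power-law effort model, Effort = a * LOC**b.
    Coefficients a and b are illustrative, not calibrated values."""
    return a * kloc ** b

def ci95(mean, sigma, n):
    """Level 4 predictive data: a 95% confidence interval, mean +/- 1.96*sigma/sqrt(n)."""
    half = 1.96 * sigma / math.sqrt(n)
    return (mean - half, mean + half)

print(round(effort_person_months(50), 1))  # effort for a hypothetical 50 KLOC system
lo, hi = ci95(mean=8.3, sigma=2.0, n=25)   # e.g., defects found per review
print(round(lo, 2), round(hi, 2))          # (7.52, 9.08)
```

The superlinear exponent (b > 1) encodes the diseconomy of scale that most effort models assume: doubling size more than doubles effort.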

Average Improvement Results (annual benefit per organization)

Improvement                 Orgs   Median    Range
Productivity growth           4      35%     9-67%
Pre-test defect detection     3      22%     6-25%
Time to market                2     ↓19%    15-23%
Field error reports           5     ↓39%    10-94%
Return on investment          5     5.0:1   4:1-8.8:1

Schedule Performance

[Chart: schedule performance index = budgeted cost of work performed ÷ budgeted cost of work scheduled (< 1.0 behind schedule, > 1.0 ahead), plotted against CMM maturity levels 1-3]

Air Force study of contractors: Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index = budgeted cost of work performed ÷ actual cost of work performed (< 1.0 cost overrun, > 1.0 underrun), plotted against CMM maturity levels 1-3]

Air Force study of contractors: Flowe & Thordahl (1994)
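The two earned-value indices behind these charts are straightforward ratios. A sketch with hypothetical dollar figures:

```python
# Sketch: the earned-value indices from the two charts above.
# BCWP = budgeted cost of work performed (earned value),
# BCWS = budgeted cost of work scheduled,
# ACWP = actual cost of work performed.
# The dollar figures are hypothetical.

def spi(bcwp, bcws):
    """Schedule performance index: < 1 behind schedule, > 1 ahead."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost performance index: < 1 cost overrun, > 1 underrun."""
    return bcwp / acwp

bcwp, bcws, acwp = 900_000, 1_000_000, 1_200_000
print(spi(bcwp, bcws))  # 0.9  -> behind schedule
print(cpi(bcwp, acwp))  # 0.75 -> cost overrun on the work performed
```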

Crosby's Cost of Quality Model

Project Cost = Performance + Cost of Quality

• Performance: generating plans and documentation; development (requirements, design, code, integration)
• Cost of Quality:
  - Conformance
    - Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits
    - Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting
  - Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks

Dion (1993); Crosby (1979), Quality Is Free
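Rolling up a project's hours into Crosby's categories is a simple aggregation. A sketch (the hour figures are hypothetical):

```python
# Sketch: rolling up cost-of-quality in Crosby's terms.
# Hours by category are hypothetical.

costs = {
    "performance":    {"plans_docs": 400, "development": 2600},
    "appraisal":      {"reviews": 450, "first_time_test": 300},
    "prevention":     {"training": 120, "tools": 80, "root_cause": 100},
    "nonconformance": {"fixing_defects": 700, "re_tests": 250, "patches": 150},
}

def totals(costs):
    """Totals per category, plus cost of quality and full project cost."""
    t = {k: sum(v.values()) for k, v in costs.items()}
    t["cost_of_quality"] = t["appraisal"] + t["prevention"] + t["nonconformance"]
    t["project_cost"] = t["performance"] + t["cost_of_quality"]
    return t

t = totals(costs)
print(t["cost_of_quality"], t["project_cost"])              # 2150 5150
print(round(100 * t["nonconformance"] / t["project_cost"], 1))  # share spent on rework
```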

Raytheon's Cost of Quality

• Performance - cost of building it right the first time
• Nonconformance - cost of rework
• Appraisal - cost of testing
• Prevention - cost of preventing nonconformance

Year  CMM Level  Perform  Nonconf  Appraise  Prevent
1988      1        34%      41%      15%       7%
1990      2        55%      18%      15%      12%
1992      3        66%      11%      23% (appraisal + prevention combined)
1994      4        76%       6%      18% (appraisal + prevention; appraisal 7.5%)

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988-1994]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988-1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual versus budgeted cost (percent over-run/under-run) by quarter, 1988-1994]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0-3,000) and percent reuse (0-100%), 1987-1994]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality
2. A compliance focus undermines CMMI's primary assumption
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI
4. The assumptions underlying control charts are violated by software data

Six Sigma's Challenge

• Process focus - process improvement - Six Sigma
• Product focus - product improvement - Design for Six Sigma (6σ)

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits. [Pyramid: huge benefits, large benefits, good benefits, okay benefits, small benefits... now what?]

Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." - Jackson (2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." - Spinellis (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data, rolled up into quality measurements for five health factors: Transferability, Changeability, Robustness, Performance, Security.

Detected violations (examples):
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.NET, ASP.NET, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages

Modern Apps are a Technology Stack

[Diagram: a transaction's data flow crosses user-interface technologies (Java, JSP, ASP.NET, web services, APIs), frameworks and middle tiers (EJB, Spring, Struts, Hibernate, .NET, COBOL, messaging), and databases (Oracle PL/SQL, SQL Server T-SQL, Sybase, DB2, IMS), with architectural compliance and transaction risk assessed across the stack]

Level 1 - Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards
Level 2 - Technology (development team level): single language/technology layer; intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities
Level 3 - System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function points, effort estimation, data access control, SDK versioning, calibration across technologies

Architecturally Complex Defects

An architecturally complex defect is a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Take 20x as many fixes to correct
• Architecturally complex defects are only 8% of total application defects (code unit-level violations are the other 92%), yet they consume 52% of total repair effort (versus 48%)
• Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated: it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
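One way to sketch a propagated-risk ranking is severity weighted by transitive reach over the component dependency graph (the graph, component names, and scoring rule below are hypothetical illustrations, not CAST's actual algorithm):

```python
# Sketch: ranking violations by severity times the number of components
# transitively impacted. Graph, components, and the scoring rule are
# hypothetical; edges point from a component to the components affected
# when it misbehaves.
from collections import deque

impacts = {
    "B": ["X", "Y"], "X": ["Z", "W"], "Y": ["Z"],
    "A": ["Q"], "Q": [], "Z": [], "W": [],
}

def reach(graph, start):
    """Breadth-first count of components transitively impacted by `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return len(seen) - 1  # exclude the component itself

def propagated_risk_index(violation):
    return violation["severity"] * reach(impacts, violation["component"])

v284 = {"id": 284, "component": "B", "severity": 7}
v342 = {"id": 342, "component": "A", "severity": 7}
print(propagated_risk_index(v284), propagated_risk_index(v342))  # 28 7
```

Equal-severity violations thus rank very differently once breadth of impact is taken into account, which is the point of the slide.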

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

• Customers care most about the dependability of their transactions
• A transaction's risk is determined by the number and severity of violations along its path: from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer
• Transactions can be ranked by risk to prioritize their remediation (in the example, transaction 1 poses the greatest risk)
• The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Risk-Based Prioritization

Violations are prioritized by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation  Health Factor    Severity
  148      Security            9
  371      Robustness          9
  062      Performance         8
  284      Robustness          7
  016      Changeability       7
  241      Security            5
  173      Transferability     4
  359      Transferability     3
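A Transaction Risk Index of the kind described above can be sketched as a severity-weighted count of violations along the path, scaled by operational frequency (the components, severities, and frequencies below are hypothetical):

```python
# Sketch: Transaction Risk Index as the sum of violation severities along a
# transaction's path, weighted by how often the transaction runs.
# All components, severities, and frequencies are hypothetical.

violations = {          # component -> severities of its violations
    "ui":    [5],
    "logic": [9, 7],
    "data":  [8],
    "cache": [],
}

def transaction_risk_index(path, freq=1.0):
    return freq * sum(sum(violations[c]) for c in path)

t1 = transaction_risk_index(["ui", "logic", "data"], freq=0.6)  # 0.6 * (5+16+8)
t2 = transaction_risk_index(["ui", "cache"], freq=1.0)          # 1.0 * 5
ranked = sorted([("t1", t1), ("t2", t2)], key=lambda x: -x[1])
print(ranked[0][0])  # t1 poses the greater risk despite running less often
```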

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects
• Static analysis before testing is 85% efficient
• With testing and static analysis combined, test schedules are reduced by 50%

John Keane (2013)

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter plot: production incidents versus Technical Quality Index for business-critical applications at a large international investment bank; applications with the lowest quality scores showed over $2,000,000 in annual losses, mid-range applications about $50,000, and the highest-quality applications about $5,000]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by each factor (2017 CRASH Report; request from info@castsoftware.com). Blank cells were not reported.

Factor       # apps  Robust  Security  Perform  Change  Transfer
Technology    1606     20       3        16       26       7
Industry       677      5       5         7        4       6
App type       510      1       1         1        1       1
Source         580      1       1         4
Shore          540
Maturity        72     28      25        15       24      12
Method         208     10       6        14        6
Team size      186      7       5         8        8
# of users     262      4       3         5        4

(Technology example: Java-EE)

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code create both technical debt and business risk:

• Principal: the cost of fixing problems remaining in the code after release that must be remediated
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Liability: business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
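A principal calculation in the spirit of Curtis et al. (2012) sums, over violation severities, the violations worth fixing times the effort and labor rate. The parameter values below are illustrative assumptions, not the paper's calibrated figures:

```python
# Sketch: technical-debt principal as
#   sum over severities of: violations * fraction-to-fix * hours-per-fix * rate.
# All parameter values are illustrative, not the calibrated figures from
# Curtis, Sappidi & Szynkarski (2012).

def principal(violation_counts, pct_to_fix, hours_per_fix, rate_per_hour=75):
    return sum(
        violation_counts[s] * pct_to_fix[s] * hours_per_fix[s] * rate_per_hour
        for s in violation_counts
    )

counts = {"high": 120, "medium": 600, "low": 2400}    # violations by severity
to_fix = {"high": 0.50, "medium": 0.25, "low": 0.10}  # fraction worth fixing
effort = {"high": 2.0, "medium": 1.0, "low": 0.5}     # hours per fix

debt = principal(counts, to_fix, effort)
print(f"${debt:,.0f}")  # $29,250
```

Because the fractions-to-fix and effort figures are policy choices, two organizations scanning the same code can legitimately report different principal amounts.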

Tech Debt by Quality Factor

[Pie chart: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%]

• 70% of technical debt is in IT cost (Transferability, Changeability)
• 30% of technical debt is in business risk (Robustness, Performance, Security)
• Health factor proportions are mostly consistent across technologies

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20(9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29(6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10(4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52(4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16(4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 3
Page 38: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Team Velocity

0

1

2

3

4

5

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20Days

Story Points per Hour

0

1

2

3

4

5

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

HrsStryPt

Analyzing Velocity

Item Cycle Times

02468

101214161820

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Stories Hours HrsStryPt

0

10

20

30

40

50

60

1 2 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Burndown Chart

Days

Click to edit Master title style Measuring and Managing Flow

Released

In progress

Backlogged

Cumulative Flow Diagram Days to delivery

outliers

Anderson (2010) Kanban

Click to edit Master title style

0

10

20

30

40

50

60

70

80

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Sto

ry p

oin

ts d

eliv

ered

Trends Over Sprints

0

10

20

30

40

50

60

70

80

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Sto

ry p

oin

ts d

eliv

ered

0

05

1

15

2

25

3

35

4

1 2 3 4 5 6 7 8 9 10 11 12

Sprints

Tech

nic

al d

ebt

Track and correlate multiple measures

across sprints

Predictive models

Click to edit Master title style Sources and Measures

Stories submitted

Stories pulled back

Growth in size of stories

Stories added mid-sprint

Stories under review

Failed tests

Non-development tasks

Assists to other projects

Data sources

Measures Project tracking

Code control

Bug tracking

Build system

Test environment

Deployment tools

Operational monitoring

Click to edit Master title style Recommended Books

Click to edit Master title style Personal Software Process (PSP)

PSP 0 Current process Time recording

Defect recording Defect type standards

PSP 01 Coding standard

Size measurement Process improvement

proposal

Baseline Personal Process

Humphrey (1997) Introduction to the Personal Software Process

PSP 1 Size estimating

Test report

Personal Planning

PSP 2 Code reviews

Design reviews

Personal Quality Management

PSP 21 Cyclic development

Cyclic Personal Process

PSP 11 Task planning

Schedule planning

PSP 21 Design templates

Click to edit Master title style Team Software Process (TSP) bull Built from personal processes of team members

bull Well defined team roles

bull Project launch workshops for planning

bull Team measurement and tracking

bull Post-mortems for improvement

bull Team application of lean principles

Humphrey (2000) Introduction to the Team Software Process

Aggregated Personal Data Best

R²s range from .70 to .87 for estimates aggregated from individual data, versus R² = .53 for a single team-level estimate.

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Personal reviews: 33
Compile: 33
Team reviews: 19
Unit test: 15
Test: 14

Fagan (2005)

TSP Benefits

Dramatic reductions:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing: Innovation management
Level 4 Quantitatively Managed: Capability management
Level 3 Defined: Process management
Level 2 Managed: Project management
Level 1 Initial: Inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5 Innovating: Innovation management
Level 4 Optimized: Capability management
Level 3 Standardized: Organizational management
Level 2 Stabilized: Work unit management
Level 1 Initial: Inconsistent management

Maturity Level Transformation

Level 5: Opportunistic improvements — Empowered culture
Level 5: Proactive improvements — Agile culture
Level 4: Process and results managed statistically — Precision culture
Level 3: Organization develops standard processes — Common culture
Level 2: Project managers stabilize projects — Local culture
Level 1: Ad hoc, inconsistent development practices — Relationship culture

(The scope of change grows from individuals, to projects, to the organization; each transition builds on trust.)

Measurement by Level

Level 5 — Exploratory data: ∆t = N × ROI
Level 4 — Predictive data: X̄ ± (1.96σ / √n)
Level 3 — Process data: 8.3 defects per review
Level 2 — Management data: Effort = α·LOC^β
Level 1 — Haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
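The level-appropriate formulas can be made concrete. A minimal sketch, where α, β, and the review sample are illustrative assumptions rather than calibrated values:

```python
import math

def effort_estimate(loc, alpha=3.2, beta=1.05):
    """Level 2 management data: a COCOMO-style power law, Effort = alpha * KLOC^beta.
    alpha and beta here are placeholders; real values come from local calibration."""
    return alpha * (loc / 1000) ** beta  # effort in person-months

def predictive_interval(samples):
    """Level 4 predictive data: 95% interval for the mean, X-bar +/- 1.96*sigma/sqrt(n)."""
    n = len(samples)
    mean = sum(samples) / n
    sigma = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))
    half = 1.96 * sigma / math.sqrt(n)
    return mean - half, mean + half

# Level 3 process data is a simple observed ratio, e.g. defects found per review:
defects_per_review = [9, 7, 10, 8, 8, 6]
lo, hi = predictive_interval(defects_per_review)
```

The point of the slide holds in code form too: the power law is usable with only project-level management data, while the interval is only meaningful once a stable process produces comparable samples.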

Average Improvement Results

Improvement benefit          Orgs   Annual median   Annual range
Productivity growth            4        35%          9% - 67%
Pre-test defect detection      3        22%          6% - 25%
Time to market                 2       ↓19%         15% - 23%
Field error reports            5       ↓39%         10% - 94%
Return on investment           5       5.0:1        4:1 - 8.8:1

Schedule Performance

Schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled (below 1.0 is behind schedule, above 1.0 is ahead).

[Chart: schedule performance index by CMM maturity level, levels 1-3]

Air Force study of contractors. Flowe & Thordahl (1994)

Cost Performance

Cost performance index = budgeted cost of work performed / actual cost of work performed (below 1.0 is an overrun, above 1.0 is an underrun).

[Chart: cost performance index by CMM maturity level, levels 1-3]

Air Force study of contractors. Flowe & Thordahl (1994)
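Both charts plot standard earned-value ratios, which are trivial to compute; the dollar figures below are hypothetical:

```python
def schedule_performance_index(bcwp, bcws):
    """SPI = budgeted cost of work performed / budgeted cost of work scheduled.
    Above 1.0 the project is ahead of schedule; below 1.0 it is behind."""
    return bcwp / bcws

def cost_performance_index(bcwp, acwp):
    """CPI = budgeted cost of work performed / actual cost of work performed.
    Above 1.0 the project is under budget; below 1.0 it has an overrun."""
    return bcwp / acwp

# Hypothetical project: $400K of planned work completed out of $500K
# scheduled, at an actual cost of $440K.
spi = schedule_performance_index(400_000, 500_000)  # behind schedule
cpi = cost_performance_index(400_000, 440_000)      # cost overrun
```

In the Air Force data, both indices move toward (and past) 1.0 as CMM maturity level rises.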

Crosby's Cost of Quality Model

Project cost splits into Performance and Cost of Quality.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Cost of Quality splits into Conformance (appraisal and prevention) and Nonconformance.

Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits.

Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Crosby (1979), Quality Is Free; Dion (1993)
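Crosby's categories roll up directly; a minimal sketch with an invented person-hour ledger:

```python
# Hypothetical project ledger in person-hours, bucketed by Crosby's categories.
costs = {
    "performance":    3400,  # building it right the first time
    "prevention":      700,  # training, methods, tools, root cause analysis
    "appraisal":      1500,  # reviews, walkthroughs, first-time testing, audits
    "nonconformance": 4100,  # rework: fixes, re-reviews, re-tests, patches
}

total = sum(costs.values())
# Cost of quality is everything except the direct performance work.
cost_of_quality = total - costs["performance"]
rework_share = costs["nonconformance"] / total  # the figure Raytheon drove down
```

Tracking the nonconformance share over time is what makes the Raytheon results on the next slide measurable.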

Raytheon's Cost of Quality

• Performance — cost of building it right the first time
• Nonconformance — cost of rework
• Appraisal — cost of testing
• Prevention — cost of preventing nonconformance

Year   Level   Perform   Nonconf   Appraise   Prevent
1988     1       34%       41%       15%         7%
1990     2       55%       18%       15%        12%
1992     3       66%       11%        (23% combined)
1994     4       76%        6%        (18% combined)

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988-1994]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988-1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: budgeted vs. actual cost (% over-run/under-run) by quarter, 1988-1994]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0-3,000) and % of reuse (0-100%), 1987-1994]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." – Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 of CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits. Projects are picked off in descending order of benefit (huge, large, good, okay, small), and then: now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded...testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

Jackson, D. (2009); Spinellis, D. (2006)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic...non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability...The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding & architectural rules against the application and its meta-data.

Quality measurements: Transferability, Changeability, Robustness, Performance, Security.

Detected violations (examples by health factor):
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, Javascript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

Level 1 – Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards. Scope: a single language/technology layer (Java, JSP, ASP.NET, APIs, etc.).

Level 2 – Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3 – System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function points, effort estimation, data access control, SDK versioning, calibration across technologies.

Analysis must trace data flow and transaction risk through the whole stack – Struts, Spring, Hibernate, EJB, .NET, COBOL, messaging, and the database tier (Oracle PL/SQL, SQL Server T-SQL, Sybase, DB2, IMS) – to check architectural compliance.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

They are the primary cause of operational problems and take 20x as many fixes to correct. Architecturally complex defects are only 8% of total application defects (code unit-level violations are the other 92%), yet they account for 52% of total repair effort (unit-level violations, 48%).

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system – it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated in the system – it presents low risk. Remediating violation 284 in Component B will have the greater impact on reducing risk and cost.
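One way to sketch the ranking idea in code. The violation data and the severity-times-breadth weighting below are assumptions for illustration; the actual index weighting is product-specific:

```python
# Hypothetical violations: severity on a 1-9 scale and the number of
# components the flaw transitively impacts (breadth of propagation).
violations = [
    {"id": 284, "component": "B", "severity": 7, "impacted": 120},
    {"id": 342, "component": "A", "severity": 8, "impacted": 4},
]

def propagated_risk_index(v):
    # Severity weighted by breadth of impact: a widely propagated,
    # moderately severe flaw can outrank a severe but isolated one.
    return v["severity"] * v["impacted"]

ranked = sorted(violations, key=propagated_risk_index, reverse=True)
```

With these numbers, violation 284 ranks first despite its lower severity, matching the slide's conclusion.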

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path: entry/exit in the user interface layer, processing in the business logic layer, and fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation, and the Transaction Risk Index can be further tuned using the operational frequency of each transaction.

[Diagram: five transaction paths through the layers; transaction 1 poses the greatest risk]
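A sketch of ranking transactions this way; the transaction names, severities, and frequencies are hypothetical, and the formula is an illustrative stand-in, not CAST's actual index:

```python
# Hypothetical transactions: severities of violations found along each
# transaction's path (UI -> business logic -> data layer), plus the
# operational frequency used to tune the ranking.
transactions = {
    "order_entry":   {"severities": [9, 7, 7, 5], "frequency": 0.40},
    "refund":        {"severities": [8, 4],       "frequency": 0.10},
    "status_lookup": {"severities": [3, 3, 2],    "frequency": 0.50},
}

def transaction_risk_index(t):
    # Number and severity of violations on the path, weighted by how
    # often the path executes in production.
    return sum(t["severities"]) * t["frequency"]

remediation_order = sorted(
    transactions,
    key=lambda name: transaction_risk_index(transactions[name]),
    reverse=True,
)
```

The frequency weighting matters: a rarely run path full of severe violations can rank below a frequently run path with milder ones.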

Risk-Based Prioritization

Violations are prioritized by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter plot: production incidents vs. Technical Quality Index (100-400) for business-critical applications at a large international investment bank; annual losses fall from over $2,000,000 at low quality scores to $50,000 and then $5,000 at higher scores]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com). Values are listed in the order Robustness, Security, Performance, Changeability, Transferability, where recoverable:

Technology (1606 apps, e.g. Java-EE): 20, 3, 16, 26, 7
Industry (677): 5, 5, 7, 4, 6
App type (510): 1, 1, 1, 1, 1
Source (580): 1, 1, 4
Shore (540): –
Maturity (72): 28, 25, 15, 24, 12
Method (208): 10, 6, 14, 6
Team size (186): 7, 5, 8, 8
# of users (262): 4, 3, 5, 4

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code are the debt.

Principal — the cost of fixing problems remaining in the code after release that must be remediated.

Interest — continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business risk — liability from the debt plus opportunity cost:
• Liability — business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost — benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
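In the spirit of Curtis et al. (2012), the principal can be sketched as violations × share-to-fix × hours-per-fix × rate. Every count, percentage, and rate below is an illustrative assumption, not a published parameter:

```python
# Sketch of a technical-debt principal estimate: the cost of fixing
# the violations remaining in the code at release.
violation_counts = {"high": 120, "medium": 480, "low": 1400}
share_to_fix     = {"high": 0.50, "medium": 0.25, "low": 0.10}  # fraction deemed worth remediating
hours_per_fix    = {"high": 2.5,  "medium": 1.0,  "low": 0.5}
hourly_rate      = 75  # fully burdened cost per developer-hour

principal = sum(
    violation_counts[sev] * share_to_fix[sev] * hours_per_fix[sev] * hourly_rate
    for sev in violation_counts
)
```

Interest, liability, and opportunity cost would each need their own models; this prices only the principal, matching the talk's focus.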

Tech Debt by Quality Factor

Transferability: 40%
Changeability: 30%
Robustness: 18%
Security: 7%

70% of technical debt is IT cost (Transferability, Changeability); 30% is business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI: Benefits from Continuous Software Improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 40: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Trends Over Sprints

[Charts: story points delivered per sprint and technical debt per sprint, plotted over 12 sprints]

Track and correlate multiple measures across sprints

Predictive models
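The idea of correlating measures across sprints can be sketched in a few lines. This is a minimal illustration; the sprint figures below are invented, not data from the talk.

```python
# Sketch: track story points delivered and technical debt per sprint,
# then check whether velocity gains are being financed by rising debt.
# All numbers are illustrative.
from math import sqrt

story_points = [20, 28, 35, 41, 46, 52, 55, 60, 63, 66, 70, 73]
tech_debt    = [1.0, 1.2, 1.5, 1.7, 2.0, 2.3, 2.5, 2.8, 3.0, 3.2, 3.5, 3.7]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(story_points, tech_debt)
# A strongly positive r is a warning sign worth investigating.
print(f"velocity-debt correlation r = {r:.2f}")
```

A persistent positive correlation suggests the team is delivering faster by cutting structural corners, which is exactly the kind of cross-measure signal the slide advocates watching.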

Sources and Measures

Measures: stories submitted, stories pulled back, growth in size of stories, stories added mid-sprint, stories under review, failed tests, non-development tasks, assists to other projects

Data sources: project tracking, code control, bug tracking, build system, test environment, deployment tools, operational monitoring

Recommended Books

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: Current process, time recording, defect recording, defect type standards
• PSP 0.1: Coding standard, size measurement, process improvement proposal

Personal Planning
• PSP 1: Size estimating, test report
• PSP 1.1: Task planning, schedule planning

Personal Quality Management
• PSP 2: Code reviews, design reviews
• PSP 2.1: Design templates

Cyclic Personal Process
• PSP 3: Cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)

• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.

Aggregated individual estimates: R²s range from .70 to .87
Single team-level estimate: R² = .53
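The statistical intuition behind this slide is that independent estimation errors partially cancel when individual estimates are summed. A small simulation makes the point; the task counts, error ranges, and hours are invented for illustration, not TSP data.

```python
# Illustrative simulation: summing per-developer task estimates beats
# applying one global fudge factor to the total, because independent
# errors partially cancel. All parameters are assumptions.
import random

random.seed(42)
n_tasks, trials = 10, 2000
true_hours = [random.uniform(5, 20) for _ in range(n_tasks)]
actual_total = sum(true_hours)

def rel_error(estimate, actual):
    return abs(estimate - actual) / actual

sum_err = team_err = 0.0
for _ in range(trials):
    # Individual estimates: each task independently off by up to +/-30%.
    individual = [h * random.uniform(0.7, 1.3) for h in true_hours]
    # Team-level estimate: a single +/-30% error on the whole total.
    team = actual_total * random.uniform(0.7, 1.3)
    sum_err += rel_error(sum(individual), actual_total)
    team_err += rel_error(team, actual_total)

print(f"mean error, summed individual estimates: {sum_err / trials:.3f}")
print(f"mean error, single team-level estimate:  {team_err / trials:.3f}")
```

Under these assumptions the aggregated estimate's relative error shrinks roughly with the square root of the number of independently estimated tasks, which is consistent with the higher R² values reported on the slide.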

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defects detected by phase: personal reviews 33%, compile 33%, team reviews 19%, unit test 15%, test 14%

Fagan & Davis (2005)

TSP Benefits

Dramatic reductions in:
• Delivered defects
• Days per KLOC
• Schedule deviation
• Test defects/KLOC
• Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5, Optimizing: innovation management
Level 4, Quantitatively Managed: capability management
Level 3, Defined: process management
Level 2, Managed: project management
Level 1, Initial: inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5, Innovating: innovation management
Level 4, Optimized: capability management
Level 3, Standardized: organizational management
Level 2, Stabilized: work unit management
Level 1, Initial: inconsistent management

Maturity Level Transformation

Level 5: Opportunistic improvements, empowered culture; proactive improvements, agile culture
Level 4: Process and results managed statistically, precision culture
Level 3: Organization develops standard processes, common culture
Level 2: Project managers stabilize projects, local culture
Level 1: Ad hoc, inconsistent development practices, relationship culture

[Diagram labels: Organization, Projects, Individual; Trust]

Measurement by Level

Level 5, exploratory data: Δt = N × ROI
Level 4, predictive data: X̄ ± 1.96σ/√n
Level 3, process data: 8.3 defects per review
Level 2, management data: Effort = α × LOC^β
Level 1, haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
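The Level 2 and Level 4 example measures can be computed directly. A minimal sketch: the α/β coefficients and the review data below are assumed placeholders, not calibrated values from the talk.

```python
# Sketch of two of the per-level example measures on this slide.
# alpha, beta, and the defect counts are illustrative assumptions.
from math import sqrt

# Level 2, management data: effort model  Effort = alpha * KLOC**beta
alpha, beta = 3.2, 1.05                    # assumed coefficients
kloc = 50
effort = alpha * kloc ** beta              # nominal staff-months

# Level 4, predictive data: 95% interval  xbar +/- 1.96 * sigma / sqrt(n)
defects_per_review = [6.1, 8.3, 7.9, 9.4, 8.8, 7.2]   # assumed sample
n = len(defects_per_review)
xbar = sum(defects_per_review) / n
sigma = sqrt(sum((d - xbar) ** 2 for d in defects_per_review) / (n - 1))
half_width = 1.96 * sigma / sqrt(n)

print(f"effort ~ {effort:.0f} staff-months")
print(f"reviews: {xbar:.1f} +/- {half_width:.1f} defects (95% interval)")
```

The point of the slide survives in the code: Level 2 data supports a fitted estimating model, while Level 4 data supports statistical prediction intervals around process performance.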

Average Improvement Results

Improvement benefit          Orgs   Annual median   Annual range
Productivity growth            4        35%            9-67%
Pre-test defect detection      3        22%            6-25%
Time to market                 2       ↓19%           15-23%
Field error reports            5       ↓39%           10-94%
Return on investment           5       5.0:1          4.1-8.8:1

Schedule Performance

Schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled (below 1: behind schedule; above 1: ahead)

[Chart: schedule performance index at CMM maturity levels 1-3]

Air Force study of contractors. Flowe & Thordahl (1994)

Cost Performance

Cost performance index = budgeted cost of work performed / actual cost of work performed (below 1: cost overrun; above 1: underrun)

[Chart: cost performance index at CMM maturity levels 1-3]

Air Force study of contractors. Flowe & Thordahl (1994)
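Both indices on these two slides are standard earned-value ratios. A minimal sketch, with invented dollar figures:

```python
# Earned-value sketch of the two indices defined on these slides.
# The dollar amounts are hypothetical.
def schedule_performance_index(bcwp, bcws):
    """BCWP / BCWS: < 1 means behind schedule, > 1 means ahead."""
    return bcwp / bcws

def cost_performance_index(bcwp, acwp):
    """BCWP / ACWP: < 1 means cost overrun, > 1 means underrun."""
    return bcwp / acwp

bcwp = 800_000    # budgeted cost of work performed (earned value)
bcws = 1_000_000  # budgeted cost of work scheduled
acwp = 950_000    # actual cost of work performed

spi = schedule_performance_index(bcwp, bcws)
cpi = cost_performance_index(bcwp, acwp)
print(f"SPI = {spi:.2f} (behind), CPI = {cpi:.2f} (over budget)")
```

The Air Force finding is that both ratios move toward (and past) 1.0 as contractor maturity level rises.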

Crosby's Cost of Quality Model

Project cost = performance + cost of quality; cost of quality = conformance (prevention + appraisal) + nonconformance.

Performance: generating plans and documentation; development (requirements, design, code, integration)

Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits

Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks

Crosby (1979), Quality Is Free; Dion (1993)
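Crosby's model is, operationally, a roll-up of categorized project costs. A minimal sketch follows; the task names and dollar figures are invented for illustration.

```python
# Sketch: roll project costs up into Crosby's cost-of-quality categories.
# Task names and amounts (in $K) are illustrative assumptions.
costs = {
    "develop code":       (400, "performance"),
    "design reviews":     ( 60, "appraisal"),
    "first-time testing": ( 90, "appraisal"),
    "training and tools": ( 50, "prevention"),
    "fixing defects":     (120, "nonconformance"),
    "re-testing":         ( 40, "nonconformance"),
}

by_category = {}
for amount, category in costs.values():
    by_category[category] = by_category.get(category, 0) + amount

conformance = by_category["appraisal"] + by_category["prevention"]
cost_of_quality = conformance + by_category["nonconformance"]
total = sum(amount for amount, _ in costs.values())
print(f"cost of quality = ${cost_of_quality}K "
      f"({cost_of_quality / total:.0%} of project cost)")
```

Tracking the nonconformance share over time is what makes the Raytheon results on the next slide measurable.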

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   Level   Performance   Nonconformance   Appraisal + Prevention
1988   1       34%           41%              22% (15% + 7%)
1990   2       55%           18%              27% (15% + 12%)
1992   3       66%           11%              23%
1994   4       76%           6%               18% (appraisal 7.5%)

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988-1994, on a 0-4x scale]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988-1994]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual vs. budgeted cost (percent over-run/under-run, +50% to -10%) by quarter, 1988-1994]

Haley (1995)

OMRON's Reuse Gains

[Chart: steps produced (0-3,000) and percentage of reuse (0-100%), 1987-1994]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus and process improvement: Six Sigma. Product focus and product improvement: Design for Six Sigma. Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, then large, good, okay, and small benefits... now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." (Jackson, 2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." (Spinellis, 2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules; application meta-data

Quality measurements: Transferability, Changeability, Robustness, Performance, Security

Detected violations (examples):
• Expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/NetWeaver, Tibco, Business Objects, Universal Analyzer for other languages

Modern Apps are a Technology Stack

[Diagram: a multi-technology application stack: JSP, ASP.NET, APIs, Java, web services, EJB, Hibernate, Spring, Struts, .NET, COBOL, messaging, and PL/SQL / T-SQL on Oracle, SQL Server, DB2, Sybase, and IMS, with data flow, transaction risk, and architectural compliance cutting across layers]

Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function points, effort estimation, data access control, SDK versioning, calibration across technologies

Level 2, Technology (development team level): single language/technology layer; intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities

Level 1, Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Take 20x as many fixes to correct

Architecturally complex defects account for 8% of total application defects but 52% of total repair effort; code unit-level violations account for the remaining 92% of defects and 48% of repair effort.

Most architecturally complex defects involve an architectural hotspot: a badly built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated: it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
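One way to operationalize an index like this is severity weighted by the number of components a violation's host can transitively impact. This is a hedged sketch, not CAST's algorithm: the component graph and severities below are hypothetical.

```python
# Sketch of a propagated-risk index: severity x breadth of impact,
# where breadth is counted by graph reachability. The dependency
# graph and severity values are invented for illustration.
from collections import deque

# component -> components that depend on it (impact flows outward)
impacts = {
    "B": ["ui", "api", "batch"],
    "ui": ["web"],
    "api": ["web", "mobile"],
    "A": ["report"],
    "batch": [], "web": [], "mobile": [], "report": [],
}

def breadth(component):
    """Count distinct components transitively impacted (BFS)."""
    seen, queue = set(), deque([component])
    while queue:
        for nxt in impacts.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return len(seen)

def propagated_risk_index(severity, component):
    return severity * breadth(component)

# Same severity, very different propagated risk:
pri_b = propagated_risk_index(7, "B")   # widely propagated
pri_a = propagated_risk_index(7, "A")   # narrowly propagated
print(pri_b, pri_a)
```

Even with identical severities, the violation in the widely-depended-upon component dominates the ranking, which is the slide's remediation argument.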

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation, and the index can be further tuned using the operational frequency of each transaction. In the slide's example, transaction 1 poses the greatest risk.
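The ranking described above reduces to summing violation severities along each path, optionally weighted by execution frequency. A minimal sketch, with invented transactions and severities:

```python
# Sketch of a transaction risk index: sum the severities of violations
# on each transaction's path; optionally weight by how often the
# transaction runs. All data are illustrative.
transactions = {
    "T1": {"violation_severities": [9, 9, 8, 7], "runs_per_day": 1200},
    "T2": {"violation_severities": [7, 5],       "runs_per_day": 300},
    "T3": {"violation_severities": [4, 3, 3],    "runs_per_day": 5000},
}

def risk_index(txn, weight_by_frequency=False):
    base = sum(txn["violation_severities"])
    return base * txn["runs_per_day"] if weight_by_frequency else base

ranked = sorted(transactions,
                key=lambda name: risk_index(transactions[name]),
                reverse=True)
print("remediation order:", ranked)
```

Switching `weight_by_frequency` on is the "further tuned using the operational frequency" step: a mildly risky but very hot transaction can then outrank a riskier cold one.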

Risk-Based Prioritization

Violations ranked by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
148         Security          9
371         Robustness        9
062         Performance       8
284         Robustness        7
016         Changeability     7
241         Security          5
173         Transferability   4
359         Transferability   3

Static Analysis in US DoD

John Keane (2013):
• Testing finds less than one-third of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter chart: production incidents (0-180) against Technical Quality Index (100-400); annotated applications range from over $2,000,000 in annual losses at low quality scores down to $50,000 and $5,000 at higher scores]

Large international investment bank; business-critical applications.

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor                 # apps   Robust   Security   Perform   Change   Transfer
Technology (Java-EE)    1606     20%       3%        16%       26%       7%
Industry                 677      5%       5%         7%        4%       6%
App type                 510      1%       1%         1%        1%       1%
Source                   580      1%       1%         4%
Shore                    540
Maturity                  72     28%      25%        15%       24%      12%
Method                   208     10%       6%        14%        6%
Team size                186      7%       5%         8%        8%
# of users               262      4%       3%         5%        4%

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code constitute the debt.

• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Liability (business risk): business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
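The principal can be estimated from violation counts. Below is a simplified sketch in the spirit of Curtis et al. (2012): violations by severity, times the fraction expected to be fixed, times effort and labor rate. The fractions, hours, and rate here are placeholder assumptions, not the paper's calibrated figures.

```python
# Simplified technical-debt principal sketch: for each severity band,
# count x fraction-to-fix x hours-per-fix x cost-per-hour.
# All parameter values are illustrative assumptions.
def technical_debt_principal(violations_by_severity, fix_fraction=None,
                             hours_per_fix=1.0, rate_per_hour=75.0):
    """Estimated cost to remediate the violations expected to be fixed."""
    fix_fraction = fix_fraction or {"high": 0.5, "medium": 0.25, "low": 0.1}
    return sum(count * fix_fraction[severity] * hours_per_fix * rate_per_hour
               for severity, count in violations_by_severity.items())

principal = technical_debt_principal({"high": 400, "medium": 2000, "low": 6000})
print(f"estimated principal: ${principal:,.0f}")
```

Because every parameter is explicit, an organization can recalibrate the fix fractions, effort, and rate to its own remediation history rather than accept the defaults.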

Tech Debt by Quality Factor

Transferability 40%, Changeability 30%, Robustness 18%, Security 7%, Performance 5%.

70% of technical debt is in IT cost (Transferability, Changeability); 30% is in business risk (Robustness, Performance, Security). Health Factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort — Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3

Sources and Measures

Measures:
• Stories submitted
• Stories pulled back
• Growth in size of stories
• Stories added mid-sprint
• Stories under review
• Failed tests
• Non-development tasks
• Assists to other projects

Data sources:
• Project tracking
• Code control
• Bug tracking
• Build system
• Test environment
• Deployment tools
• Operational monitoring
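A minimal sketch of how measures drawn from these separate data sources might be consolidated into one sprint record; all function, source, and field names here are hypothetical illustrations, not tooling from the slides.

```python
# Illustrative sketch: consolidating sprint measures pulled from several
# data sources (project tracking, bug tracking, ...). Names are invented.

def consolidate(sprint_id, sources):
    """Merge per-source measure dicts into one flat sprint record."""
    record = {"sprint": sprint_id}
    for source_name, measures in sources.items():
        for measure, value in measures.items():
            # Prefix each measure with its source so provenance is kept.
            record[f"{source_name}.{measure}"] = value
    return record

sprint = consolidate(12, {
    "project_tracking": {"stories_submitted": 24, "stories_pulled_back": 3},
    "bug_tracking": {"failed_tests": 7},
})
```

Keeping the source name in each key preserves provenance when the same measure could come from more than one system.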

Recommended Books

Personal Software Process (PSP)

PSP0 — Baseline Personal Process: current process, time recording, defect recording, defect type standards
PSP0.1: coding standard, size measurement, process improvement proposal
PSP1 — Personal Planning: size estimating, test report
PSP1.1: task planning, schedule planning
PSP2 — Personal Quality Management: code reviews, design reviews
PSP2.1: design templates
PSP3 — Cyclic Personal Process: cyclic development

Humphrey (1997), Introduction to the Personal Software Process
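A minimal sketch of the kind of personal data PSP0/PSP1 rely on: time and defect logs, from which an engineer can compute estimation accuracy. The record layouts and numbers are invented for illustration, not Humphrey's actual forms.

```python
# PSP-style personal data (illustrative): time log and defect log.

time_log = [  # (task, planned_minutes, actual_minutes)
    ("parse input", 60, 75),
    ("write report", 90, 80),
]

defect_log = [  # (defect_type, phase_injected, phase_removed, fix_minutes)
    ("logic", "design", "unit test", 25),
    ("syntax", "code", "compile", 2),
]

def estimation_error(log):
    """Relative error of planned vs. actual time across logged tasks."""
    planned = sum(p for _, p, _ in log)
    actual = sum(a for _, _, a in log)
    return (actual - planned) / planned

err = estimation_error(time_log)  # positive = over plan
```

Feeding this history into later size and time estimates is the mechanism by which PSP1 improves personal planning.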

Team Software Process (TSP)

• Built from the personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

R² values for aggregated individual estimates range from .70 to .87, versus R² = .53 for a single team-level estimate.

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals.
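The aggregation idea above can be sketched in a few lines: each member estimates their own tasks from their own history, and the team plan is the sum of those personal estimates. The names and hour figures are invented for illustration.

```python
# Combine estimates at the individual level first (illustrative data).

personal_estimates = {
    "ana":   [4.0, 2.5, 6.0],   # hours per task, from Ana's own history
    "bo":    [3.0, 8.0],
    "chloe": [5.5, 1.5, 2.0],
}

def team_plan_from_individuals(estimates):
    """Team plan = sum of each member's personally estimated tasks."""
    return sum(sum(tasks) for tasks in estimates.values())

total = team_plan_from_individuals(personal_estimates)  # 32.5 hours
```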

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development. Consequently, test changes from defect detection into correctness verification.

Defects found by phase: personal reviews 33, compile 33, team reviews 19, unit test 15, test 14.

Fagan (2005)

TSP Benefits

Dramatic reductions in:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing — Innovation management
Level 4 Quantitatively Managed — Capability management
Level 3 Defined — Process management
Level 2 Managed — Project management
Level 1 Initial — Inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5 Innovating — Innovation management
Level 4 Optimized — Capability management
Level 3 Standardized — Organizational management
Level 2 Stabilized — Work unit management
Level 1 Initial — Inconsistent management

Maturity Level Transformation

Level 5 — Proactive, opportunistic improvements: Agile, empowered culture
Level 4 — Process and results managed statistically: Precision culture
Level 3 — Organization develops standard processes: Common culture
Level 2 — Project managers stabilize projects: Local culture
Level 1 — Ad hoc, inconsistent development practices: Relationship culture

[Diagram labels: Individual, Projects, Organization; Trust]

Measurement by Level

Level 5 — Exploratory data: Δt = N × ROI
Level 4 — Predictive data: x̄ ± 1.96σ/√n
Level 3 — Process data: 8.3 defects per review
Level 2 — Management data: Effort = α·LOC^β
Level 1 — Haphazard data: "about a million lines"

A measure–maturity mismatch slows improvement and creates dysfunction.
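The Level 4 "predictive data" formula above is the standard 95% confidence interval for a mean. A small sketch, using invented review-yield samples:

```python
import math

# 95% confidence interval for a process mean: x-bar +/- 1.96 * s / sqrt(n),
# where s is the sample standard deviation. Sample values are illustrative.

def confidence_interval_95(samples):
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    half_width = 1.96 * math.sqrt(var) / math.sqrt(n)
    return mean - half_width, mean + half_width

# e.g., defects found in the last five peer reviews:
low, high = confidence_interval_95([7.0, 9.0, 8.0, 10.0, 8.5])
```

An organization at Level 4 would compare a new observation against such an interval before reacting to it.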

Average Improvement Results

Improvement benefit          # Orgs   Annual median   Annual range
Productivity growth             4          35%           9–67%
Pre-test defect detection       3          22%           6–25%
Time to market                  2         ↓19%          15–23%
Field error reports             5         ↓39%          10–94%
Return on investment            5         5.0:1       4:1–8.8:1

Schedule Performance

[Chart: schedule performance index (0–1.6) by CMM maturity level (1–3)]

Schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled (below 1 = behind schedule, above 1 = ahead).

Air Force study of contractors (Flowe & Thordahl, 1994)

Cost Performance

[Chart: cost performance index (0–2.5) by CMM maturity level (1–3)]

Cost performance index = budgeted cost of work performed / actual cost of work performed (below 1 = overrun, above 1 = underrun).

Air Force study of contractors (Flowe & Thordahl, 1994)

Crosby's Cost of Quality Model

Project cost divides into performance cost and cost of quality; cost of quality divides into conformance (prevention + appraisal) and nonconformance.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits.

Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free

Raytheon's Cost of Quality

• Performance — cost of building it right the first time
• Nonconformance — cost of rework
• Appraisal — cost of testing
• Prevention — cost of preventing nonconformance

Dion (1993); Haley (1995)

[Chart values: 23%, 18%]

Year   Level   Performance   Nonconformance   Appraisal   Prevention
1988     1         34%            41%            15%          7%
1990     2         55%            18%            15%         12%
1992     3         66%            11%             –            –
1994     4         76%             6%            7.5%          –

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988–1994; scale 0–4]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988–1994; scale 0–1.2]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual vs. budgeted cost by quarter, 1988–1994; percent over-run/under-run, scale −10% to 50%]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0–3,000) and percentage of reuse (0–100%), 1987–1994]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by quality of the process used to develop it." — Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. Compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits, and successive projects deliver diminishing returns: huge benefits, then large, good, okay, and small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded…testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." — Jackson, D. (2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic…non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability…The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." — Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality measurements (health factors): Robustness, Performance, Security, Changeability, Transferability.

Example detected violations:
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, Javascript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.
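To make one of the rules above concrete, here is a toy checker for the "empty CATCH block" violation, applied to Python source with the standard `ast` module. This is a simplified illustration of how a rule engine flags a pattern, not CAST's implementation.

```python
import ast

# Toy rule: flag exception handlers whose body is only `pass`
# (the Python analogue of an empty CATCH block).

def empty_handlers(source):
    """Return line numbers of exception handlers that silently swallow errors."""
    tree = ast.parse(source)
    return [node.lineno
            for node in ast.walk(tree)
            if isinstance(node, ast.ExceptHandler)
            and all(isinstance(stmt, ast.Pass) for stmt in node.body)]

code = """try:
    risky()
except ValueError:
    pass
"""
violations = empty_handlers(code)  # [3]
```

A production analyzer evaluates hundreds of such rules per language and aggregates the findings into the health factors listed above.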

Modern Apps are a Technology Stack

[Diagram: data flow and transaction risk traced across a stack of technologies (JSP, ASP.NET, APIs, Web Services, Java, EJB, Struts, Spring, Hibernate, .NET, COBOL, messaging, Oracle PL/SQL, SQL Server T-SQL, Sybase, DB2, IMS), with architectural compliance checked across layers]

Level 1 — Unit level (developer): code style & layout; expression complexity; code documentation; class or program design; basic coding standards

Level 2 — Technology level (development team; single language/technology layer): intra-technology architecture; intra-layer dependencies; inter-program invocation; security vulnerabilities

Level 3 — System level (IT organization): integration quality; architectural compliance; risk propagation; application security; resiliency checks; transaction integrity; function point and effort estimation; data access control; SDK versioning; calibration across technologies

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and can require 20x as many fixes to correct. They are 8% of total application defects (code unit-level violations are the other 92%) but account for 52% of total repair effort (48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 342 in Component A is only narrowly propagated in the system – it presents low risk. The impact of violation 284 in Component B is very widely propagated throughout the system – it presents very high risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
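One way to picture "breadth of impact" is reachability in the dependency graph: the more components transitively depend on the flawed one, the wider the propagation. The graph, severities, and scoring below are simplified assumptions for illustration, not CAST's actual algorithm.

```python
# Illustrative propagated-risk score: severity * (1 + transitive dependents).

calls = {  # caller -> callees; reversed edges give the dependents
    "ui": ["logic"], "logic": ["A", "B"], "batch": ["B"],
    "A": [], "B": [],
}

def dependents(component, graph):
    """Components that directly or transitively call `component`."""
    found = set()
    frontier = [component]
    while frontier:
        node = frontier.pop()
        for caller, callees in graph.items():
            if node in callees and caller not in found:
                found.add(caller)
                frontier.append(caller)
    return found

def propagated_risk(component, severity, graph):
    return severity * (1 + len(dependents(component, graph)))

# B is reached by logic, batch, and ui; A only by logic and ui:
risk_b = propagated_risk("B", 7, calls)   # 7 * 4 = 28
risk_a = propagated_risk("A", 7, calls)   # 7 * 3 = 21
```

With equal severities, the violation in the more widely depended-upon component ranks higher, matching the Component A vs. Component B example above.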

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path, so transactions can be ranked by risk to prioritize their remediation. The Transaction Risk Index can be further tuned using the operational frequency of each transaction.

[Diagram: five transactions ranked by risk; Transaction 1 poses the greatest risk]

Risk-Based Prioritization

Violations are ranked for remediation by severity and Propagated Risk Index, with further weight given to violations in a high-risk transaction and to the frequency of path execution:

Violation   Health Factor      Severity
   148      Security              9
   371      Robustness            9
   062      Performance           8
   284      Robustness            7
   016      Changeability         7
   241      Security              5
   173      Transferability       4
   359      Transferability       3
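A tiny sketch of such a prioritization: sort by severity, boosting violations that sit on a high-risk transaction. The 2x boost and the flags are invented for the example; the slide does not specify the weighting scheme.

```python
# Illustrative risk-based ordering of the violations listed above.
# The high-risk-transaction flags and the 2x boost are assumptions.

violations = [
    # (id, health_factor, severity, in_high_risk_transaction)
    (148, "Security",       9, True),
    (371, "Robustness",     9, False),
    (62,  "Performance",    8, True),
    (284, "Robustness",     7, True),
    (16,  "Changeability",  7, False),
]

def priority(v):
    _, _, severity, hot = v
    return severity * (2 if hot else 1)  # hypothetical boost for hot paths

ranked = sorted(violations, key=priority, reverse=True)
top_ids = [v[0] for v in ranked[:3]]  # [148, 62, 284]
```

Note how the boost promotes violation 284 (severity 7, on a hot path) above violation 371 (severity 9, off the hot path).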

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0–180) vs. Technical Quality Index (100–400) for business-critical applications at a large international investment bank; annual losses fall from over $2,000,000 at low TQI to $50,000 and then $5,000 at higher TQI]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com). Dashes mark values not given; factor with the most apps is Technology (Java-EE).

Factor        # apps   Robust   Security   Perform   Change   Transfer
Technology     1606      20%       3%        16%       26%       7%
Industry        677       5%       5%         7%        4%       6%
App type        510       1%       1%         1%        1%       1%
Source          580       1%       1%         4%        –         –
Shore           540        –        –          –         –        –
Maturity         72      28%      25%        15%       24%      12%
Method          208      10%       6%        14%        6%       –
Team size       186       7%       5%         8%        8%       –
# of users      262       4%       3%         5%        4%       –

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code constitute the debt.

• Principal — the cost of fixing problems remaining in the code after release that must be remediated.
• Interest — continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Liability — business costs related to outages, breaches, corrupted data, etc. (business risk from the debt).
• Opportunity cost — benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
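In the spirit of Curtis et al. (2012), the principal can be sketched as violations to be fixed, times hours per fix, times a labor rate. The counts, fix percentages, hours, and rate below are invented for illustration, not the paper's calibrated parameters.

```python
# Back-of-envelope technical debt principal (illustrative parameters).

def debt_principal(violation_counts, pct_to_fix, hours_per_fix, rate=75.0):
    """Sum over severities: count * fraction fixed * hours * $/hour."""
    total_hours = sum(
        violation_counts[sev] * pct_to_fix[sev] * hours_per_fix[sev]
        for sev in violation_counts
    )
    return total_hours * rate

principal = debt_principal(
    violation_counts={"high": 120, "medium": 400, "low": 900},
    pct_to_fix={"high": 0.5, "medium": 0.25, "low": 0.1},
    hours_per_fix={"high": 2.0, "medium": 1.0, "low": 0.5},
)  # 265 hours * $75/hour
```

Severity-dependent fix fractions capture the observation that only a portion of detected violations, mostly the severe ones, are ever remediated.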

Tech Debt by Quality Factor

[Pie chart: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%]

70% of technical debt is in IT cost (Transferability, Changeability); 30% of technical debt is in business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J., & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B., et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M., & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J., & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D., & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J., & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J., et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B., & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K., & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models – Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.


Oracle

SQL Server

DB2

TSQL

Hibernate

Spring

Struts NET

COBOL

Sybase IMS

Messaging

Integration quality Architectural

compliance Risk propagation Application security Resiliency checks Transaction integrity

Function point Effort estimation Data access control SDK versioning Calibration across

technologies IT organization level

System Level 3

Code style amp layout Expression complexity Code documentation Class or program design Basic coding standards Developer level

Unit Level 1

Java Java Java

Java Java

Web Services Java Java Single languagetechnology layer

Intra-technology architecture Intra-layer dependencies Inter-program invocation Security vulnerabilities Development team level

Technology Level 2

JSP ASPNET APIs

Click to edit Master title style Architecturally Complex Defects

Primary cause of operational

problems

20x as many

fixes to correct

Architecturally Complex Defect

A structural flaw involving interactions among multiple components that reside in different application layers

48

52 92

8 Architecturally

Complex Defects

Code unit-level violations

of total app defects

of total repair effort

Most architecturally complex defects involve an architectural hotspot ndash a badly-built component that should be replaced

Click to edit Master title style Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on the its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 43: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Personal Software Process (PSP)

Baseline Personal Process
• PSP 0: Current process, time recording, defect recording, defect type standards
• PSP 0.1: Coding standard, size measurement, process improvement proposal

Personal Planning
• PSP 1: Size estimating, test report
• PSP 1.1: Task planning, schedule planning

Personal Quality Management
• PSP 2: Code reviews, design reviews
• PSP 2.1: Design templates

Cyclic Personal Process
• PSP 3: Cyclic development

Humphrey (1997), Introduction to the Personal Software Process

Team Software Process (TSP)
• Built from personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals. Combined individual estimates: R² values range from .70 to .87. Single team-level estimate: R² = .53.
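The estimating practice above can be sketched in a few lines; the task sizes and personal productivity rates below are made-up numbers, not data from the deck:

```python
# Sketch of the PSP/TSP practice: each engineer estimates their own tasks
# from personal historical data, and the team plan is the sum of those
# estimates rather than a single averaged team guess.

def personal_estimate(task_size_loc, personal_rate_loc_per_hr):
    """Estimate hours for one engineer's task from their own historical rate."""
    return task_size_loc / personal_rate_loc_per_hr

# Hypothetical team: (assigned LOC, personal productivity from past projects)
team = [(1200, 25.0), (800, 16.0), (1500, 30.0)]

individual_hours = [personal_estimate(loc, rate) for loc, rate in team]
team_plan_hours = sum(individual_hours)  # combine the individual estimates

print(individual_hours)   # [48.0, 50.0, 50.0]
print(team_plan_hours)    # 148.0
```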

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development; consequently, test changes from defect detection into correctness verification.

Defects detected by phase:
• Personal reviews: 33
• Compile: 33
• Team reviews: 19
• Unit test: 15
• Test: 14

Fagan (2005)

TSP Benefits

Dramatic reductions in:
• Delivered defects
• Days per KLOC
• Schedule deviation
• Test defects/KLOC
• Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5, Optimizing: Innovation management
Level 4, Quantitatively Managed: Capability management
Level 3, Defined: Process management
Level 2, Managed: Project management
Level 1, Initial: Inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5, Innovating: Innovation management
Level 4, Optimized: Capability management
Level 3, Standardized: Organizational management
Level 2, Stabilized: Work unit management
Level 1, Initial: Inconsistent management

Maturity Level Transformation

Level 1: Ad hoc, inconsistent development practices (Relationship culture, built on trust)
Level 2: Project managers stabilize projects (Local culture)
Level 3: Organization develops standard processes (Common culture)
Level 4: Process and results managed statistically (Precision culture)
Level 5: Proactive, opportunistic improvements (Empowered, agile culture)

Improvement moves from the individual level, through projects, to the organization.

Measurement by Level

Level 5, Exploratory data: Δt = N × ROI
Level 4, Predictive data: X̄ ± 1.96σ/√n
Level 3, Process data: 8.3 defects per review
Level 2, Management data: Effort = α·LOC^β
Level 1, Haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
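Two of the example measures above can be illustrated with a short sketch. The α and β values are illustrative COCOMO-style parameters, not calibrated from any real baseline:

```python
import math

# Sketch of two measures from the maturity ladder:
#   Level 2 management data:  Effort = alpha * Size^beta
#   Level 4 predictive data:  mean +/- 1.96 * sigma / sqrt(n)

def effort_person_months(kloc, alpha=2.94, beta=1.10):
    """Power-law effort model (illustrative alpha/beta, COCOMO-style)."""
    return alpha * kloc ** beta

def ci_95(mean, sigma, n):
    """95% confidence interval around a sample mean."""
    half = 1.96 * sigma / math.sqrt(n)
    return mean - half, mean + half

print(round(effort_person_months(100.0), 1))   # 466.0 person-months for 100 KLOC
lo, hi = ci_95(mean=8.3, sigma=2.0, n=25)      # e.g., defects per review
print(round(lo, 2), round(hi, 2))              # 7.52 9.08
```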

Average Improvement Results

Improvement                  Orgs   Annual median   Annual range
Productivity growth            4        35%            9-67%
Pre-test defect detection      3        22%            6-25%
Time to market                 2       ↓19%           15-23%
Field error reports            5       ↓39%           10-94%
Return on investment           5       5.0:1          4:1-8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed / budgeted cost of work scheduled, where <1 is behind and >1 is ahead) plotted against CMM maturity levels 1-3]

Air Force study of contractors. Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (budgeted cost of work performed / actual cost of work performed, where <1 is an overrun and >1 an underrun) plotted against CMM maturity levels 1-3]

Air Force study of contractors. Flowe & Thordahl (1994)
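The two indices in these charts are standard earned-value ratios; a minimal sketch with made-up project numbers, not Flowe & Thordahl's data:

```python
# Earned-value sketch of the indices in these two charts:
#   SPI = BCWP / BCWS  (>1 ahead of schedule, <1 behind)
#   CPI = BCWP / ACWP  (>1 under budget,      <1 over budget)

def spi(bcwp, bcws):
    """Schedule performance index from earned vs. scheduled value."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost performance index from earned value vs. actual cost."""
    return bcwp / acwp

# Hypothetical project: earned 90 units of planned work,
# had 100 scheduled, and actually spent 120.
print(spi(90, 100))  # 0.9  -> behind schedule
print(cpi(90, 120))  # 0.75 -> over budget
```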

Crosby's Cost of Quality Model

Project cost = Performance + Cost of Quality; Cost of Quality = Conformance (Prevention + Appraisal) + Nonconformance.

• Performance: generating plans and documentation; development (requirements, design, code, integration)
• Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits
• Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting
• Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks

Dion (1993); Crosby (1979), Quality Is Free
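Crosby's buckets lend themselves to simple percentage-of-cost reporting; a sketch with hypothetical effort hours, chosen to roughly resemble Raytheon's 1988 profile on the next slide:

```python
# Sketch: classify project effort into Crosby's cost-of-quality buckets and
# report each as a percentage of total project cost. Hours are hypothetical.

effort_hours = {
    "performance":    3400,  # building it right the first time
    "nonconformance": 4100,  # rework
    "appraisal":      1500,  # testing, reviews
    "prevention":      700,  # training, tools, root-cause analysis
}

total = sum(effort_hours.values())
coq_pct = {k: round(100 * v / total) for k, v in effort_hours.items()}
print(coq_pct)  # {'performance': 35, 'nonconformance': 42, 'appraisal': 15, 'prevention': 7}
```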

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   CMM Level   Perform   Nonconf   Appraise   Prevent
1988       1         34%       41%       15%        7%
1990       2         55%       18%       15%       12%
1992       3         66%       11%        -          -
1994       4         76%        6%       7.5%        -

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988-1994; y-axis 0-4x]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988-1994; y-axis 0-1.2]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: quarterly actual cost vs. budgeted cost, 1988-1994; percent over-run/under-run from +50% to -10%, converging toward budget over time]

Haley (1995)

OMRON's Reuse Gains

[Chart: 1987-1994; program size in steps (0-3,000) and percentage of reuse (0-100%)]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product? CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. Compliance focus undermines CMMI's primary assumption.
3. Product quality focus at Level 4 in the CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, then large benefits, good benefits, okay benefits, small benefits... now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." - Jackson, D. (2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." - Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality measurements (health factors): Transferability, Changeability, Robustness, Performance, Security.

Detected violations (examples):
• Expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, Javascript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.
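As one illustration of the kind of rule such platforms evaluate (a toy checker, not CAST's implementation), the "empty CATCH block" violation can be detected in Python source with a few lines of AST walking:

```python
# Toy structural-quality rule: flag "empty CATCH block"-style violations,
# here Python exception handlers whose body is only `pass` (errors are
# silently swallowed). Not CAST's implementation, just the idea of a rule.
import ast

def find_empty_handlers(source):
    """Return line numbers of exception handlers whose body is only `pass`."""
    violations = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ExceptHandler):
            if len(node.body) == 1 and isinstance(node.body[0], ast.Pass):
                violations.append(node.lineno)
    return violations

sample = """
try:
    risky()
except Exception:
    pass
"""

print(find_empty_handlers(sample))  # [4] -> violation on line 4
```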

Modern Apps are a Technology Stack

[Diagram: a multi-layer application built from many technologies (JSP, ASP.NET, APIs, Web Services, Java, EJB, Struts, Spring, Hibernate, .NET, COBOL, PL/SQL, T-SQL, Oracle, SQL Server, Sybase, DB2, IMS, messaging), analyzed for data flow, transaction risk, and architectural compliance]

Level 1, Unit (single language/technology layer): code style & layout, expression complexity, code documentation, class or program design, basic coding standards. Developer level.

Level 2, Technology: intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities. Development team level.

Level 3, System: integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function point & effort estimation, data access control, SDK versioning, calibration across technologies. IT organization level.

Architecturally Complex Defects

An architecturally complex defect is a structural flaw involving interactions among multiple components that reside in different application layers.

• Primary cause of operational problems
• Take 20x as many fixes to correct
• Architecturally complex defects: 8% of total application defects, but 52% of total repair effort
• Code unit-level violations: 92% of defects, 48% of repair effort

Most architecturally complex defects involve an architectural hotspot, a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation #342 in Component A is only narrowly propagated in the system: it presents low risk. The impact of violation #284 in Component B is very widely propagated throughout the system: it presents very high risk. Remediating violation #284 in Component B will therefore have the greater impact on reducing risk and cost.
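The Propagated Risk Index idea can be sketched as severity times breadth of downstream impact. The dependency graph and the severities below are hypothetical (only the violation IDs come from the slide):

```python
# Sketch of a Propagated Risk Index: rank each violation by
# severity * breadth, where breadth is how many components are
# transitively reachable from the violating component.

def reachable(graph, start):
    """All components transitively impacted by `start` (excluding itself)."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    seen.discard(start)
    return seen

# Hypothetical component dependency graph: edges point to impacted components.
graph = {"B": ["C", "D"], "C": ["E"], "D": ["E", "F"], "A": ["C"]}

violations = [("342", "A", 5), ("284", "B", 7)]  # (id, component, severity)

pri = {vid: sev * len(reachable(graph, comp)) for vid, comp, sev in violations}
ranked = sorted(pri, key=pri.get, reverse=True)
print(pri)     # {'342': 10, '284': 28}
print(ranked)  # ['284', '342'] -> remediate violation 284 first
```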

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path, so transactions can be ranked by risk to prioritize their remediation; in the example, transaction 1 poses the greatest risk. The Transaction Risk Index can be further tuned using the operational frequency of each transaction.
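A minimal sketch of the Transaction Risk Index, with hypothetical components, severities, and frequencies:

```python
# Sketch of a Transaction Risk Index: rank transactions by the number and
# severity of violations along their path, optionally weighted by how often
# the transaction runs. All data below is hypothetical.

def tri(path, violations_by_component, frequency=1.0):
    """Sum of violation severities along a transaction path, times frequency."""
    return frequency * sum(
        sev
        for comp in path
        for sev in violations_by_component.get(comp, [])
    )

# Severities of violations found in each component along the layers.
violations = {"ui": [3], "logic": [7, 5], "db": [9]}

t1 = tri(["ui", "logic", "db"], violations, frequency=2.0)  # hot path
t2 = tri(["ui", "db"], violations, frequency=1.0)
print(t1, t2)  # 48.0 12.0 -> transaction 1 gets remediated first
```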

Risk-Based Prioritization

Violations are prioritized by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
#148        Security             9
#371        Robustness           9
#062        Performance          8
#284        Robustness           7
#016        Changeability        7
#241        Security             5
#173        Transferability      4
#359        Transferability      3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents vs. Technical Quality Index for business-critical applications at a large international investment bank; annotated annual losses range from $2,000,000+ at low quality scores down to $50,000 and $5,000]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor       # apps   Robust   Security   Perform   Change   Transfer
Technology    1606      20        3          16        26        7
Industry       677       5        5           7         4        6
App type       510       1        1           1         1        1
Source         580       1        1           4         -        -
Shore          540       -        -           -         -        -
Maturity        72      28       25          15        24       12
Method         208      10        6          14         6        -
Team size      186       7        5           8         8        -
# of users     262       4        3           5         4        -

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code are the principal borrowed; they also accrue interest and create business risk.

• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e. higher maintenance costs, greater resource usage, etc.
• Liability: business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
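The principal can be estimated mechanically; a sketch in the spirit of Curtis, Sappidi & Szynkarski (2012), where the violation counts, remediation fractions, fix times, and labor rate are illustrative parameters, not the paper's calibrated values:

```python
# Sketch of a technical-debt principal estimate:
#   principal = sum over severities of
#       (violations * fraction to fix * hours per fix * cost per hour)
# All parameter values below are illustrative.

def principal(violation_counts, pct_to_fix, hours_per_fix, rate_per_hour):
    """violation_counts: {severity: count}; pct_to_fix: share actually remediated."""
    return sum(
        count * pct_to_fix[sev] * hours_per_fix[sev] * rate_per_hour
        for sev, count in violation_counts.items()
    )

counts = {"high": 100, "medium": 400, "low": 1500}
to_fix = {"high": 1.0, "medium": 0.5, "low": 0.1}   # remediate all high-severity
hours  = {"high": 2.5, "medium": 1.0, "low": 0.5}

print(principal(counts, to_fix, hours, rate_per_hour=75))  # 39375.0 dollars
```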

Tech Debt by Quality Factor

[Pie chart of technical debt by health factor: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%]

70% of technical debt is in IT cost (Transferability, Changeability); 30% of technical debt is in business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI: Benefits from Continuous Software Improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort – Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3

Team Software Process (TSP)

• Built from the personal processes of team members
• Well-defined team roles
• Project launch workshops for planning
• Team measurement and tracking
• Post-mortems for improvement
• Team application of lean principles

Humphrey (2000), Introduction to the Team Software Process

Aggregated Personal Data Best

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates, rather than developing a single team estimate averaged over individuals. R²s for aggregated individual estimates range from .70 to .87, versus R² = .53 for a single team-level estimate.

Defect Detection at Intuit

PSP/TSP shifts defect detection from the test phase back up into development; consequently, test changes from defect detection into correctness verification.

Defects detected by phase: personal reviews 33, compile 33, team reviews 19, unit test 15, test 14.

Fagan (2005)

TSP Benefits

Dramatic reductions in:
– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing – Innovation management
Level 4 Quantitatively Managed – Capability management
Level 3 Defined – Process management
Level 2 Managed – Project management
Level 1 Initial – Inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5 Innovating – Innovation management
Level 4 Optimized – Capability management
Level 3 Standardized – Organizational management
Level 2 Stabilized – Work unit management
Level 1 Initial – Inconsistent management

Maturity Level Transformation

[Diagram: maturity levels transform practices from the individual, through projects, to the organization, built on a foundation of trust.]

Level 5 – Opportunistic, proactive improvements: empowered, agile culture
Level 4 – Process and results managed statistically: precision culture
Level 3 – Organization develops standard processes: common culture
Level 2 – Project managers stabilize projects: local culture
Level 1 – Ad hoc, inconsistent development practices: relationship culture

Measurement by Level

Level 5 – Exploratory data: Δt = N × ROI
Level 4 – Predictive data: X ± 1.96σ/√n
Level 3 – Process data: 8.3 defects per review
Level 2 – Management data: Effort = α·LOC^β
Level 1 – Haphazard data: "about a million lines"

A measure–maturity mismatch slows improvement and creates dysfunction.
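The Level-2 and Level-4 measures are ordinary regression and confidence-interval calculations; a minimal Python sketch, with all coefficients and figures invented for illustration (they are not calibrated values from the deck):

```python
import math

# Level 2 - management data: a power-law effort model, Effort = a * LOC^b.
# The coefficients a and b below are illustrative, not calibrated values.
def effort_person_months(kloc, a=2.8, b=1.1):
    return a * kloc ** b

# Level 4 - predictive data: a 95% confidence interval, mean +/- 1.96*sigma/sqrt(n).
def confidence_interval_95(mean, sigma, n):
    half_width = 1.96 * sigma / math.sqrt(n)
    return (mean - half_width, mean + half_width)

print(round(effort_person_months(10.0), 1))   # effort for a 10-KLOC project
print(confidence_interval_95(8.3, 2.0, 25))   # interval around 8.3 defects/review
```

The point of the slide survives the sketch: an organization at Level 1 has no stable data to fit either expression, which is why pushing Level-4 measures onto a Level-1 organization creates dysfunction.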

Average Improvement Results

Improvement benefit         Orgs   Annual median   Annual range
Productivity growth           4        35%          9% – 67%
Pre-test defect detection     3        22%          6% – 25%
Time to market                2       ↓19%         15% – 23%
Field error reports           5       ↓39%         10% – 94%
Return on investment          5       5.0:1        4:1 – 8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed ÷ budgeted cost of work scheduled; below 1.0 = behind schedule, above 1.0 = ahead) by CMM maturity level 1–3, from an Air Force study of contractors.]

Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (budgeted cost of work performed ÷ actual cost of work performed; below 1.0 = cost overrun, above 1.0 = underrun) by CMM maturity level 1–3, from an Air Force study of contractors.]

Flowe & Thordahl (1994)
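Both indices are standard earned-value ratios; a minimal sketch, with the dollar figures invented for illustration:

```python
# Earned-value indices used in the two charts above.
def spi(bcwp, bcws):
    """Schedule performance index = BCWP / BCWS: <1 behind schedule, >1 ahead."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost performance index = BCWP / ACWP: <1 cost overrun, >1 underrun."""
    return bcwp / acwp

# Illustrative project status: $90k of work performed against a $100k
# schedule, at an actual cost of $120k.
print(spi(90_000, 100_000))  # 0.9  -> behind schedule
print(cpi(90_000, 120_000))  # 0.75 -> over budget
```

The Air Force finding is then simply that both ratios drift toward (and past) 1.0 as contractor maturity level rises.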

Crosby's Cost of Quality Model

Project cost = performance cost + cost of quality.

• Performance: generating plans and documentation; development (requirements, design, code, integration).
• Cost of quality:
  – Conformance:
    · Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.
    · Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits.
  – Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free
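Under Crosby's model the cost of quality is just the sum of its buckets; a minimal rollup sketch, with all hour figures invented for illustration:

```python
# Crosby-style cost-of-quality rollup. All effort figures are invented.
costs = {
    "performance":    3400,  # building it right the first time
    "prevention":      700,  # training, tools, root cause analysis, ...
    "appraisal":      1500,  # reviews, walkthroughs, first-time testing
    "nonconformance": 4100,  # rework, re-tests, patches, help desk
}

total = sum(costs.values())
cost_of_quality = total - costs["performance"]   # everything except performance
rework_share = costs["nonconformance"] / total   # the bucket improvement attacks

print(f"cost of quality: {cost_of_quality / total:.0%} of project cost")
print(f"rework alone:    {rework_share:.0%}")
```

Tracking the split this way is what makes the Raytheon figures on the next slide meaningful: process improvement shifts spend out of the nonconformance bucket and into performance.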

Raytheon's Cost of Quality

• Performance – cost of building it right the first time
• Nonconformance – cost of rework
• Appraisal – cost of testing
• Prevention – cost of preventing nonconformance

Share of project cost by year and CMM level:

Year   Level   Performance   Nonconformance   Appraisal   Prevention
1988     1         34%            41%            15%          7%
1990     2         55%            18%            15%         12%
1992     3         66%            11%         (combined: 23%)
1994     4         76%             6%         (combined: 18%)

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth from 1988 to 1994, relative to the 1989 baseline.]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction from 1988 to 1994, relative to the 1989 baseline.]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: percent cost over-run/under-run of actual versus budgeted cost, by quarter, 1988–1994; early over-runs of 40% or more converge toward budget as maturity increases.]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0–3,000) and percentage of reuse (0–100%), 1987–1994.]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it." – Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits. [Diagram: successive projects deliver huge, then large, good, okay, and finally small benefits – now what?] Ultimately we run out of projects with benefits significant enough to continue the program… so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded… testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." – Jackson, D. (2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic… non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability… The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." – Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data, producing quality measurements across five health factors and a list of detected violations.

Sample violations by health factor:
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/NetWeaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

A modern application is a stack of technologies: user-interface frameworks (JSP, ASP.NET, APIs), business logic (Java/J2EE, .NET, Spring, Struts, Hibernate, EJB, web services, messaging, COBOL), and data management (Oracle PL/SQL, SQL Server and Sybase T-SQL, DB2, IMS). Structural quality, architectural compliance, data flow, and transaction risk must therefore be assessed at three levels:

• Level 1 – Unit (developer level): code style and layout; expression complexity; code documentation; class or program design; basic coding standards.
• Level 2 – Technology (single language/technology layer; development team level): intra-technology architecture; intra-layer dependencies; inter-program invocation; security vulnerabilities.
• Level 3 – System (IT organization level): integration quality; architectural compliance; risk propagation; application security; resiliency checks; transaction integrity; function points; effort estimation; data access control; SDK versioning; calibration across technologies.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

• The primary cause of operational problems
• Require 20x as many fixes to correct

Architecturally complex defects account for only 8% of total application defects (code unit-level violations account for 92%), yet they consume 52% of total repair effort (versus 48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

Example: the impact of violation 284 in Component B is very widely propagated throughout the system, so it presents very high risk; the impact of violation 342 in Component A is only narrowly propagated, so it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.

Transaction Risk

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path: from entry/exit in the user-interface layer, through processing in the business-logic layer, to fulfillment in the data-storage layer.

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Transactions can be ranked by risk to prioritize their remediation, and the index can be further tuned using the operational frequency of each transaction. [Diagram: five transaction paths through the layers; transaction 1 poses the greatest risk.]

Risk-Based Prioritization

Violations are prioritized by health factor and severity, then refined by the Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health factor     Severity
  148       Security             9
  371       Robustness           9
  062       Performance          8
  284       Robustness           7
  016       Changeability        7
  241       Security             5
  173       Transferability      4
  359       Transferability      3
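A prioritization like this is essentially a multi-key scoring and sort; a minimal sketch in which the fields and weighting are assumptions for illustration, not CAST's actual scoring:

```python
from typing import NamedTuple

class Violation(NamedTuple):
    vid: str
    health_factor: str
    severity: int            # 1 (low) .. 9 (high)
    propagated_risk: float   # assumed: share of components reached by the impact
    in_risky_transaction: bool

def priority(v: Violation) -> float:
    # Assumed weighting: severity scaled by propagation breadth, boosted
    # when the violation sits on a high-risk transaction path.
    boost = 1.5 if v.in_risky_transaction else 1.0
    return v.severity * v.propagated_risk * boost

violations = [
    Violation("148", "Security", 9, 0.40, True),
    Violation("359", "Transferability", 3, 0.05, False),
    Violation("284", "Robustness", 7, 0.80, True),
]

for v in sorted(violations, key=priority, reverse=True):
    print(v.vid, round(priority(v), 2))
```

Note how the weighting reorders the raw severity ranking: a severity-7 violation that is widely propagated and on a risky path outranks a severity-9 violation with narrow impact.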

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

In a study of structural quality measures and maintenance effort across 20 customers of a large global system integrator, a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents versus Technical Quality Index for business-critical applications at a large international investment bank; annual losses fall from over $2,000,000 at low quality scores to $50,000 and then $5,000 as the index rises.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor        Apps   Robustness  Security  Performance  Changeability  Transferability
Technology    1606      20%         3%        16%           26%             7%
  (Java-EE)
Industry       677       5%         5%         7%            4%             6%
App type       510       1%         1%         1%            1%             1%
Source         580       1%         1%         4%            –              –
Shore          540       –          –          –             –              –
Maturity        72      28%        25%        15%           24%            12%
Method         208      10%         6%        14%            6%             –
Team size      186       7%         5%         8%            8%             –
# of users     262       4%         3%         5%            4%             –

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Structural quality problems in production code create the debt:
• Principal – the cost of fixing problems remaining in the code after release that must be remediated.
• Interest – continuing IT costs attributable to the violations causing technical debt, e.g., higher maintenance costs, greater resource usage, etc.

The business risk is the liability from the debt:
• Liability – business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost – benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
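The principal can be approximated from violation counts; a minimal sketch of such an estimate, in which the fix fractions, hours, and hourly rate are assumptions for illustration rather than the calibrated parameters of Curtis et al. (2012):

```python
# Rough technical-debt principal: for each severity band, assume only a
# fraction of violations will actually be fixed, at an average effort and rate.
def technical_debt_principal(counts, fix_fraction, hours_per_fix, rate_per_hour=75.0):
    """counts, fix_fraction, and hours_per_fix are dicts keyed by severity band."""
    return sum(
        counts[band] * fix_fraction[band] * hours_per_fix[band] * rate_per_hour
        for band in counts
    )

violations = {"high": 120, "medium": 540, "low": 2300}   # from static analysis
fraction   = {"high": 0.50, "medium": 0.25, "low": 0.10} # assumed fix rates
hours      = {"high": 2.5,  "medium": 1.0,  "low": 0.5}  # assumed effort each

print(f"${technical_debt_principal(violations, fraction, hours):,.0f}")
```

Because the result scales linearly with every parameter, the interesting output in practice is not the dollar figure itself but how it is distributed across health factors, which is what the next slide shows.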

Tech Debt by Quality Factor

Distribution of technical debt: transferability 40%, changeability 30%, robustness 18%, security 7% (performance: the remaining 5%).

• 70% of technical debt is in IT cost (transferability, changeability).
• 30% of technical debt is in business risk (robustness, performance, security).

Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Upper Saddle River, NJ: Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality: The Open Source Perspective. Boston: Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 45: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Aggregated Personal Data Best

R2s range from 70 to 87

The ability to predict the amount of time required to produce a piece of software is dramatically improved by estimating at the individual level first and then combining the estimates rather than developing a single team estimate averaged over individuals

R2 = 53

Click to edit Master title style Defect Detection at Intuit

Consequently test changes from defect

detection into correctness verification

PSPTSP shifts defect detection from the test phase back up into development

Fagan (2005)

Personal reviews

33

Compile 33

Team reviews 19

Unit test 15

Test 14

Click to edit Master title style

Dramatic reductions

ndash Delivered defects

ndash Days per KLOC

ndash Schedule deviation

ndash Test defectskloc

ndash Variation in results

Humphrey (2006) TSP Leading a Development Team

TSP Benefits

Click to edit Master title style Section 3 Section 3 Measuring Software

Improvement Programs

1) Improvement program methods

2) Maturity-based measurement

3) Improvement program results

61

Click to edit Master title style CMMI Maturity Transitions

Level 5 Optimizing

Innovation management

Level 4 Quantitatively

Managed

Capability management

Level 3 Defined

Process management

Level 2 Managed

Project management

Level 1 Initial

Inconsistent management

Chrisis Konrad amp Shrum (2005) CMMI

Click to edit Master title style Enhanced Maturity Framework

Level 5 Innovating

Innovation management

Level 4 Optimized

Capability management

Level 3 Standardized

Organizational management

Level 2 Stabilized

Work unit management

Level 1 Initial

Inconsistent management

Click to edit Master title style Maturity Level Transformation

Level 5 Opportunistic improvements

Empowered culture

Level 5 Proactive

improvements Agile culture

Organization

Projects

Individual

Level 4 Process and results

managed statistically Precision culture

Level 3 Organization develops

standard processes Common culture

Level 2 Project managers stabilize projects

Local culture

Level 1 Ad Hoc inconsistent

development practices Relationship culture

Trust

Click to edit Master title style Measurement by Level

Level 5 mdash Exploratory data ∆ t = N x ROI

Level 4 mdash Predictive data X plusmn (196σ radicn )

Level 3 mdash Process data 83 defects per review

Level 2 mdash Management data Effort = α LOCβ

Level 1 mdash Haphazard data ldquoabout a million linesrdquo

Measure-maturity mismatch slows improvement and creates

dysfunction

Click to edit Master title style Average Improvement Results

Productivity growth 4 35 9 - 67

Pre-test defect detection 3 22 6 - 25

Time to market 2 darr 19 15 - 23

Field error reports 5 darr 39 10 - 94

Return on investment 5 501 41 - 881

Annual Annual Improvement benefit Orgs median range

Click to edit Master title style Schedule Performance

0

02

04

06

08

1

12

14

16

1 2 3

Schedule performance

index

CMM maturity level

Budgeted cost of work performed

Budgeted cost of work scheduled

behind ahead

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Cost Performance

Cost performance

index

0

05

1

15

2

25

1 2 3 CMM Maturity Level

Budgeted cost of work performed Actual cost of

work performed

overrun underrun

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Crosbyrsquos Cost of Quality Model Project Cost

Performance Cost of Quality Generating plans documentation Development - requirements - design - code - integration Prevention Appraisal

Reviews - system - requirements - design - test plan - test scripts Walkthroughs Testing (first time) Independent Verification and Validation (first time) Audits

Training Policies Procedures and methods Tools Planning Quality Improvement Projects Data Gathering and Analysis Root Cause Analysis Quality Reporting

Nonconformance Conformance Fixing Defects Reworking any Document Updating Source Code Re-reviews Re-tests Lab costs for re-tests Patches(Internal and External) Engineering Changes Change Control Boards External Failures and fines Customer Support Help Desks

Dion (1993) Crosby (1979) Quality Is Free

Click to edit Master title style Raytheons Cost of Quality

bull Performance mdash cost of building it right first time bull Nonconformance mdash cost of rework bull Appraisal mdash cost of testing bull Prevention mdash cost of preventing nonconformance

Dion (1993) Haley (1995)

23

18

Year Level Perform Nonconf Appraise Prevent

1988 1 34 41 15 7

1990 2 55 18 15 12

1992 3 66 11

1994 4 76 6 75

Click to edit Master title style Raytheonrsquos Productivity

0

1

2

3

4

1988 1989 1990 1991 1992 1993 1994

Growth relative to 1989 baseline

Lydon (1995)

Click to edit Master title style Raytheonrsquos Cost Reduction

0

02

04

06

08

1

12

1988 1989 1990 1991 1992 1993 1994

Reduction relative to 1989 baseline

Lydon (1995)

Click to edit Master title style

Haley (1995)

1993 1992 1991 1990 1989 1988 1994

50

40

30

20

10

0

-10

Actual cost

Budgeted cost

over-run under-run

4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1

Raytheonrsquos Cost Predictability

Click to edit Master title style OMRONrsquos Reuse gains

Sakamoto et al (1996)

0

5 0 0

1 0 0 0

1 5 0 0

2 0 0 0

2 5 0 0

3 0 0 0

1 9 8 7 1 9 8 8 1 9 8 9 1 9 9 0 1 9 9 1 1 9 9 2 1 9 9 3 1 9 9 4

Steps

100

75

50

25

0

of reuse

Click to edit Master title style Section 4 Section 4 Measuring Quality and

Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

89

Click to edit Master title style What about the Product CMMIrsquos primary assumption

ldquoThe quality of a software system is governed by quality of the process used to develop itrdquo

- Watts Humphrey

CMMIrsquos product quality problem

1 Assessments do not verify product quality

2 Compliance focus undermines CMMIrsquos primary assumption

3 Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI

4 The assumptions underlying control charts are violated by software data

Six Sigma's Challenge

Process focus - process improvement - Six Sigma
Product focus - product improvement - Design for 6σ

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, then large benefits, good benefits, okay benefits, small benefits... now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded...testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." - Jackson, D. (2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic...non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability...The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." - Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality Measurements: Transferability, Changeability, Robustness, Performance, Security.

Detected Violations (examples):
- Expensive operation in loop, static vs. pooled connections, complex query on big table, large indices on big table
- Empty CATCH block, uncontrolled data access, poor memory management, opened resource not closed
- SQL injection, cross-site scripting, buffer overflow, uncontrolled format string
- Unstructured code, misuse of inheritance, lack of comments, violated naming convention
- Highly coupled component, duplicated code, index modified in loop, high cyclomatic complexity

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, Netweaver, Tibco, Business Objects, Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: a multi-layer application - JSP/ASP.NET user interfaces and APIs; Java/.NET/COBOL business logic with frameworks such as EJB, Hibernate, Spring, and Struts; messaging; and Oracle, SQL Server, DB2, Sybase, and IMS data layers (PL/SQL, T-SQL) - with architectural compliance, data flow, and transaction risk traced across the stack]

Level 1 - Unit: single language/technology layer. Code style & layout, expression complexity, code documentation, class or program design, basic coding standards. Developer level.

Level 2 - Technology: intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities. Development team level.

Level 3 - System: integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function points, effort estimation, data access control, SDK versioning, calibration across technologies. IT organization level.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct. They account for only 8% of total app defects (code unit-level violations account for 92%), yet 52% of total repair effort (vs. 48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot - a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system - it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated in the system - it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
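The idea above can be sketched in code. This is a minimal illustration, not CAST's actual algorithm: the graph shape, scoring function, and names are assumptions chosen to mirror the slide's example of violations 284 and 342.

```python
def propagated_risk_index(dependencies, violations):
    """Rank violations by severity times the breadth of their impact.

    dependencies: dict mapping a component to the components that call it
    (impact propagates from a flawed callee up to its callers).
    violations: list of (violation_id, component, severity) tuples.
    """
    def impacted(start):
        # Every component that transitively depends on `start`.
        seen, stack = set(), [start]
        while stack:
            node = stack.pop()
            for caller in dependencies.get(node, ()):
                if caller not in seen:
                    seen.add(caller)
                    stack.append(caller)
        return seen

    scores = {v_id: severity * (1 + len(impacted(comp)))
              for v_id, comp, severity in violations}
    # Highest score first: severe, widely propagated violations lead.
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Violation 284 sits in a component many others depend on; 342 does not.
deps = {"B": ["X", "Y", "Z"], "A": ["X"]}
viols = [("284", "B", 7), ("342", "A", 7)]
print(propagated_risk_index(deps, viols))  # 284 ranks above 342
```

With equal severities, the violation whose component has the wider caller set ranks first, matching the slide's conclusion about Component B.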

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path, so transactions can be ranked by risk to prioritize their remediation. The Transaction Risk Index can be further tuned using the operational frequency of each transaction.

[Diagram: five transactions traced from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer; Transaction 1 poses the greatest risk]
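The ranking described above can be sketched as follows. The function name, data shapes, and weighting are illustrative assumptions; the slide only specifies that risk combines the number and severity of violations on the path, optionally tuned by operational frequency.

```python
def transaction_risk_index(transactions, frequencies=None):
    """Rank transactions by the violations along their paths.

    transactions: dict mapping transaction name to the list of violation
    severities found on the components in its path.
    frequencies: optional dict of operational execution frequencies used
    to tune the ranking, as the slide suggests.
    """
    frequencies = frequencies or {}
    scores = {
        name: sum(sevs) * frequencies.get(name, 1)
        for name, sevs in transactions.items()
    }
    # Riskiest transaction first.
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Three hypothetical transaction paths and their violation severities.
paths = {"T1": [9, 7, 7], "T2": [5, 3], "T3": [4]}
print(transaction_risk_index(paths))  # T1 poses the greatest risk
```

Summing severities means both many minor violations and a few severe ones raise a transaction's rank; multiplying by frequency pushes heavily used paths to the top of the remediation queue.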

Risk-Based Prioritization

Violations are prioritized by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation  Health Factor    Severity
  148      Security            9
  371      Robustness          9
  062      Performance         8
  284      Robustness          7
  016      Changeability       7
  241      Security            5
  173      Transferability     4
  359      Transferability     3
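The prioritization criteria above compose naturally as a lexicographic sort. This sketch uses hypothetical field values and tie-breaking order; the slide lists the criteria but not how they are weighted.

```python
from typing import NamedTuple

class Violation(NamedTuple):
    vid: str
    health_factor: str
    severity: int
    pri: float             # Propagated Risk Index
    in_high_risk_tx: bool  # appears in a high-risk transaction
    path_frequency: int    # how often its execution path runs

def prioritize(violations):
    # Severity first, then propagated risk, then transaction risk,
    # then path frequency - the criteria named on the slide.
    return sorted(violations,
                  key=lambda v: (v.severity, v.pri,
                                 v.in_high_risk_tx, v.path_frequency),
                  reverse=True)

work_queue = prioritize([
    Violation("284", "Robustness", 7, 0.9, True, 1200),
    Violation("148", "Security", 9, 0.4, False, 300),
    Violation("173", "Transferability", 4, 0.1, False, 50),
])
print([v.vid for v in work_queue])  # ['148', '284', '173']
```

A lexicographic key keeps the scheme explainable to developers: a violation only moves up on a later criterion when the earlier ones tie.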

Static Analysis in US DoD

- Testing finds less than 1/3 of structural defects.
- Static analysis before testing is 85% efficient.
- With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers in a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter plot: production incidents vs. Technical Quality Index (100-400) for business-critical applications at a large international investment bank; annual losses fall from over $2,000,000 for low-TQI applications to $50,000 and $5,000 at higher TQI]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor       # apps  Robust  Security  Perform  Change  Transfer
Technology    1606     20       3        16       26       7
  (Java-EE)
Industry       677      5       5         7        4       6
App type       510      1       1         1        1       1
Source         580      1       1         4
Shore          540
Maturity        72     28      25        15       24      12
Method         208     10       6        14        6
Team size      186      7       5         8        8
# of users     262      4       3         5        4

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release - a component of the cost of ownership. Structural quality problems in production code constitute the debt.

Principal: the cost of fixing problems remaining in the code after release that must be remediated.
Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business risk - the liability from the debt:
Liability: business costs related to outages, breaches, corrupted data, etc.
Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
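Curtis et al. (2012) estimate the principal from counts of remaining violations, the fraction expected to be remediated, and burdened labor rates. A hedged sketch of that style of calculation follows; the percentages, hours, and rate below are illustrative assumptions, not the paper's calibrated values.

```python
def technical_debt_principal(violations_by_severity,
                             pct_to_fix,      # fraction expected to be remediated
                             hours_per_fix,   # average effort per violation
                             cost_per_hour):  # burdened hourly labor rate
    """Principal = sum over severity levels of
       (# violations x % to fix x hours per fix x $ per hour)."""
    return sum(count * pct_to_fix[sev] * hours_per_fix[sev] * cost_per_hour
               for sev, count in violations_by_severity.items())

# Illustrative inputs only: real estimates calibrate these per organization.
counts = {"high": 120, "medium": 340, "low": 910}
pct    = {"high": 0.50, "medium": 0.25, "low": 0.10}
hours  = {"high": 2.0, "medium": 1.0, "low": 0.5}
print(technical_debt_principal(counts, pct, hours, cost_per_hour=75))
```

Because each parameter is explicit, the same function supports sensitivity analysis: doubling the remediation fraction for high-severity violations shows immediately how much of the principal is driven by the riskiest debt.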

Tech Debt by Quality Factor

Transferability 40%, Changeability 30%, Robustness 18%, Security 7%, Performance 5%

70% of Technical Debt is in IT Cost (Transferability, Changeability); 30% of Technical Debt is in Business Risk (Robustness, Performance, Security).

Health Factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.
Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.
Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.
Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.
Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.
Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.
Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.
Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.
Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.
Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.
Dyne, K. (1999). ESSI: Benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.
Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.
Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.
Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.
George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.
Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.
Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.
Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).
Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.
Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.
Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.
Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.
Peterman, W. (2000). IEEE Software, July/August issue.
Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.
Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.
Sinha, M. (2006). Performance models - enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.
Spinellis, D. (2006). Code Quality. Addison-Wesley.
Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.
Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort - Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3
Page 46: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Defect Detection at Intuit

Consequently test changes from defect

detection into correctness verification

PSPTSP shifts defect detection from the test phase back up into development

Fagan (2005)

Personal reviews

33

Compile 33

Team reviews 19

Unit test 15

Test 14

Click to edit Master title style

Dramatic reductions

ndash Delivered defects

ndash Days per KLOC

ndash Schedule deviation

ndash Test defectskloc

ndash Variation in results

Humphrey (2006) TSP Leading a Development Team

TSP Benefits

Click to edit Master title style Section 3 Section 3 Measuring Software

Improvement Programs

1) Improvement program methods

2) Maturity-based measurement

3) Improvement program results

61

Click to edit Master title style CMMI Maturity Transitions

Level 5 Optimizing

Innovation management

Level 4 Quantitatively

Managed

Capability management

Level 3 Defined

Process management

Level 2 Managed

Project management

Level 1 Initial

Inconsistent management

Chrisis Konrad amp Shrum (2005) CMMI

Click to edit Master title style Enhanced Maturity Framework

Level 5 Innovating

Innovation management

Level 4 Optimized

Capability management

Level 3 Standardized

Organizational management

Level 2 Stabilized

Work unit management

Level 1 Initial

Inconsistent management

Click to edit Master title style Maturity Level Transformation

Level 5 Opportunistic improvements

Empowered culture

Level 5 Proactive

improvements Agile culture

Organization

Projects

Individual

Level 4 Process and results

managed statistically Precision culture

Level 3 Organization develops

standard processes Common culture

Level 2 Project managers stabilize projects

Local culture

Level 1 Ad Hoc inconsistent

development practices Relationship culture

Trust

Click to edit Master title style Measurement by Level

Level 5 mdash Exploratory data ∆ t = N x ROI

Level 4 mdash Predictive data X plusmn (196σ radicn )

Level 3 mdash Process data 83 defects per review

Level 2 mdash Management data Effort = α LOCβ

Level 1 mdash Haphazard data ldquoabout a million linesrdquo

Measure-maturity mismatch slows improvement and creates

dysfunction

Click to edit Master title style Average Improvement Results

Productivity growth 4 35 9 - 67

Pre-test defect detection 3 22 6 - 25

Time to market 2 darr 19 15 - 23

Field error reports 5 darr 39 10 - 94

Return on investment 5 501 41 - 881

Annual Annual Improvement benefit Orgs median range

Click to edit Master title style Schedule Performance

0

02

04

06

08

1

12

14

16

1 2 3

Schedule performance

index

CMM maturity level

Budgeted cost of work performed

Budgeted cost of work scheduled

behind ahead

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Cost Performance

Cost performance

index

0

05

1

15

2

25

1 2 3 CMM Maturity Level

Budgeted cost of work performed Actual cost of

work performed

overrun underrun

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Crosbyrsquos Cost of Quality Model Project Cost

Performance Cost of Quality Generating plans documentation Development - requirements - design - code - integration Prevention Appraisal

Reviews - system - requirements - design - test plan - test scripts Walkthroughs Testing (first time) Independent Verification and Validation (first time) Audits

Training Policies Procedures and methods Tools Planning Quality Improvement Projects Data Gathering and Analysis Root Cause Analysis Quality Reporting

Nonconformance Conformance Fixing Defects Reworking any Document Updating Source Code Re-reviews Re-tests Lab costs for re-tests Patches(Internal and External) Engineering Changes Change Control Boards External Failures and fines Customer Support Help Desks

Dion (1993) Crosby (1979) Quality Is Free

Click to edit Master title style Raytheons Cost of Quality

bull Performance mdash cost of building it right first time bull Nonconformance mdash cost of rework bull Appraisal mdash cost of testing bull Prevention mdash cost of preventing nonconformance

Dion (1993) Haley (1995)

23

18

Year Level Perform Nonconf Appraise Prevent

1988 1 34 41 15 7

1990 2 55 18 15 12

1992 3 66 11

1994 4 76 6 75

Click to edit Master title style Raytheonrsquos Productivity

0

1

2

3

4

1988 1989 1990 1991 1992 1993 1994

Growth relative to 1989 baseline

Lydon (1995)

Click to edit Master title style Raytheonrsquos Cost Reduction

0

02

04

06

08

1

12

1988 1989 1990 1991 1992 1993 1994

Reduction relative to 1989 baseline

Lydon (1995)

Click to edit Master title style

Haley (1995)

1993 1992 1991 1990 1989 1988 1994

50

40

30

20

10

0

-10

Actual cost

Budgeted cost

over-run under-run

4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1

Raytheonrsquos Cost Predictability

Click to edit Master title style OMRONrsquos Reuse gains

Sakamoto et al (1996)

0

5 0 0

1 0 0 0

1 5 0 0

2 0 0 0

2 5 0 0

3 0 0 0

1 9 8 7 1 9 8 8 1 9 8 9 1 9 9 0 1 9 9 1 1 9 9 2 1 9 9 3 1 9 9 4

Steps

100

75

50

25

0

of reuse

Click to edit Master title style Section 4 Section 4 Measuring Quality and

Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

89

Click to edit Master title style What about the Product CMMIrsquos primary assumption

ldquoThe quality of a software system is governed by quality of the process used to develop itrdquo

- Watts Humphrey

CMMIrsquos product quality problem

1 Assessments do not verify product quality

2 Compliance focus undermines CMMIrsquos primary assumption

3 Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI

4 The assumptions underlying control charts are violated by software data

Click to edit Master title style Six Sigmarsquos Challenge

Process focus ndash process improvement ndash Six Sigma Product focus ndash product improvement ndash Design for 6σ

Product focus supplements product focus to unlock even more value from software

Six Sigma projects must have significant benefits Huge benefits

Large benefits

Good benefits

Okay benefits

Small benefits

Now what

Ultimately we run out of projects with significant enough benefits to continue the programhellip

so what is the solution that allows continual improvement

Click to edit Master title style Testing is Not Enough ldquoAs higher levels of assurance are demandedhelliptesting cannot deliver the level of confidence required at a reasonable costrdquo

ldquoThe correctness of the code is rarely the weakest linkrdquo

Jackson D (2009) Spinellis D (2006)

ldquohellipa failure to satisfy a non-functional requirement can be critical even catastrophichellipnon-functional requirements are sometimes difficult to verify We cannot write a test case to verify a systemrsquos reliabilityhellipThe ability to associate code to non-functional properties can be a powerful weapon in a software engineerrsquos arsenalrdquo

Click to edit Master title style CASTrsquos Application Intelligence Platform

Application Analysis

Evaluation of 1200+ coding amp

architectural rules

Application meta-data

Transferability

Changeability

Robustness

Performance

Security

Quality Measurements

Detected Violations

Expensive operation in loop Static vs pooled connections Complex query on big table Large indices on big table

Empty CATCH block Uncontrolled data access Poor memory management Opened resource not closed

SQL injection Cross-site scripting Buffer overflow Uncontrolled format string

Unstructured code Misuse of inheritance Lack of comments Violated naming convention

Highly coupled component Duplicated code Index modified in loop High cyclomatic complexity

Language Parsers

Oracle PLSQL Sybase T-SQL SQL Server T-SQL IBM SQLPSM C C++ C Pro C Cobol CICS Visual Basic VBNet ASPNet Java J2EE JSP XML HTML Javascript VBScript PHP PowerBuilder Oracle Forms PeopleSoft SAP ABAP Netweaver Tibco Business Objects Universal Analyzer for other languages

Click to edit Master title style Modern Apps are a Technology Stack Ar

chite

ctur

al C

ompl

ianc

e

Data Flow Transaction Risk

EJB PLSQL

Oracle

SQL Server

DB2

TSQL

Hibernate

Spring

Struts NET

COBOL

Sybase IMS

Messaging

Integration quality Architectural

compliance Risk propagation Application security Resiliency checks Transaction integrity

Function point Effort estimation Data access control SDK versioning Calibration across

technologies IT organization level

System Level 3

Code style amp layout Expression complexity Code documentation Class or program design Basic coding standards Developer level

Unit Level 1

Java Java Java

Java Java

Web Services Java Java Single languagetechnology layer

Intra-technology architecture Intra-layer dependencies Inter-program invocation Security vulnerabilities Development team level

Technology Level 2

JSP ASPNET APIs

Click to edit Master title style Architecturally Complex Defects

Primary cause of operational

problems

20x as many

fixes to correct

Architecturally Complex Defect

A structural flaw involving interactions among multiple components that reside in different application layers

48

52 92

8 Architecturally

Complex Defects

Code unit-level violations

of total app defects

of total repair effort

Most architecturally complex defects involve an architectural hotspot ndash a badly-built component that should be replaced

Click to edit Master title style Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on the its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University


  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort — Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 2

TSP Benefits

Dramatic reductions in:

– Delivered defects
– Days per KLOC
– Schedule deviation
– Test defects/KLOC
– Variation in results

Humphrey (2006), TSP: Leading a Development Team

Section 3: Measuring Software Improvement Programs

1) Improvement program methods
2) Maturity-based measurement
3) Improvement program results

CMMI Maturity Transitions

Level 5 Optimizing: Innovation management
Level 4 Quantitatively Managed: Capability management
Level 3 Defined: Process management
Level 2 Managed: Project management
Level 1 Initial: Inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI

Enhanced Maturity Framework

Level 5 Innovating: Innovation management
Level 4 Optimized: Capability management
Level 3 Standardized: Organizational management
Level 2 Stabilized: Work unit management
Level 1 Initial: Inconsistent management

Maturity Level Transformation

Level 5: Opportunistic, proactive improvements; Empowered, Agile culture
Level 4: Process and results managed statistically; Precision culture
Level 3: Organization develops standard processes; Common culture
Level 2: Project managers stabilize projects; Local culture
Level 1: Ad hoc, inconsistent development practices; Relationship culture

[Diagram scope labels: Individual, Projects, Organization; Trust.]

Measurement by Level

Level 5 — Exploratory data: Δt = N × ROI
Level 4 — Predictive data: X̄ ± 1.96σ/√n
Level 3 — Process data: 8.3 defects per review
Level 2 — Management data: Effort = α·LOC^β
Level 1 — Haphazard data: "about a million lines"

A measure–maturity mismatch slows improvement and creates dysfunction.
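The Level 2 and Level 4 example measures above can be sketched in a few lines of code. This is an illustrative sketch only: the calibration constants (α = 3.0, β = 1.12) and the sample figures are assumptions, not values from the talk; real constants come from fitting an organization's historical project data.

```python
import math

def effort_estimate(kloc, alpha=3.0, beta=1.12):
    """Level 2 management data: Effort = alpha * KLOC**beta.
    alpha and beta are illustrative calibration constants."""
    return alpha * kloc ** beta

def confidence_interval_95(mean, sigma, n):
    """Level 4 predictive data: X-bar +/- 1.96 * sigma / sqrt(n)."""
    half_width = 1.96 * sigma / math.sqrt(n)
    return (mean - half_width, mean + half_width)

# A 100 KLOC project under the assumed calibration (units are nominal):
print(round(effort_estimate(100), 1))

# Defect density averaged over 25 reviews (mean 8.3, sigma 2.0):
lo, hi = confidence_interval_95(8.3, 2.0, 25)
print(round(lo, 3), round(hi, 3))  # 7.516 9.084
```

At Level 2 the power-law form captures diseconomies of scale (β > 1); at Level 4 the interval quantifies how much a process baseline can be trusted for prediction.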

Average Improvement Results

Improvement benefit       | Orgs | Annual median | Annual range
Productivity growth       |  4   | 35%           | 9%–67%
Pre-test defect detection |  3   | 22%           | 6%–25%
Time to market            |  2   | ↓19%          | 15%–23%
Field error reports       |  5   | ↓39%          | 10%–94%
Return on investment      |  5   | 5.0:1         | 4:1–8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed ÷ budgeted cost of work scheduled; below 1 = behind schedule, above 1 = ahead) at CMM maturity levels 1–3, from an Air Force study of contractors. Flowe & Thordahl (1994).]

Cost Performance

[Chart: cost performance index (budgeted cost of work performed ÷ actual cost of work performed; below 1 = overrun, above 1 = underrun) at CMM maturity levels 1–3, from an Air Force study of contractors. Flowe & Thordahl (1994).]
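Both charts use standard earned-value indices. A minimal sketch of the two ratios; the dollar figures in the example are hypothetical:

```python
def schedule_performance_index(bcwp, bcws):
    """SPI = budgeted cost of work performed / budgeted cost of work
    scheduled; below 1.0 means behind schedule, above 1.0 ahead."""
    return bcwp / bcws

def cost_performance_index(bcwp, acwp):
    """CPI = budgeted cost of work performed / actual cost of work
    performed; below 1.0 means cost overrun, above 1.0 underrun."""
    return bcwp / acwp

# Hypothetical project status: $900K of planned work earned,
# $1.2M of work scheduled to date, $1.0M actually spent.
spi = schedule_performance_index(900, 1200)  # 0.75 -> behind schedule
cpi = cost_performance_index(900, 1000)      # 0.9  -> over budget
print(spi, cpi)
```

The studies above found both indices converging toward 1.0 as maturity level rose.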

Crosby's Cost of Quality Model

Project cost divides into performance and the cost of quality; the cost of quality divides into conformance (prevention and appraisal) and nonconformance.

• Performance: generating plans and documentation; development (requirements, design, code, integration)
• Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits
• Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting
• Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks

Dion (1993); Crosby (1979), Quality Is Free
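One way to operationalize Crosby's model is to tag each labor activity with a cost-of-quality bucket and roll up hours. A minimal sketch; the activity-to-bucket mapping and the hours below are illustrative assumptions, not data from the talk:

```python
# Roll labor hours up into Crosby's cost-of-quality buckets.
# The activity-to-bucket mapping and the hours are illustrative.
COQ_BUCKET = {
    "design": "performance",
    "coding": "performance",
    "design review": "appraisal",
    "first-time testing": "appraisal",
    "training": "prevention",
    "root cause analysis": "prevention",
    "fixing defects": "nonconformance",
    "re-testing": "nonconformance",
}

def cost_of_quality(hours_by_activity):
    """Return each bucket's share of total project cost, in percent."""
    totals = {"performance": 0.0, "appraisal": 0.0,
              "prevention": 0.0, "nonconformance": 0.0}
    for activity, hours in hours_by_activity.items():
        totals[COQ_BUCKET[activity]] += hours
    grand_total = sum(totals.values())
    return {bucket: round(100 * h / grand_total, 1)
            for bucket, h in totals.items()}

shares = cost_of_quality({"design": 300, "coding": 380, "design review": 150,
                          "fixing defects": 410, "training": 70})
print(shares)  # nonconformance alone is ~31% of project cost here
```

Tracked over time, these shares produce exactly the kind of trend shown in the Raytheon table that follows.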

Raytheon's Cost of Quality

• Performance — cost of building it right the first time
• Nonconformance — cost of rework
• Appraisal — cost of testing
• Prevention — cost of preventing nonconformance

Year | CMM Level | Performance | Nonconformance | Appraisal + Prevention
1988 |     1     |     34%     |      41%       |      15% + 7%
1990 |     2     |     55%     |      18%       |      15% + 12%
1992 |     3     |     66%     |      11%       |      23% (combined)
1994 |     4     |     76%     |       6%       |      18% (combined)

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, 1988–1994 (y-axis 0–4). Lydon (1995).]

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, 1988–1994 (y-axis 0–1.2). Lydon (1995).]

Raytheon's Cost Predictability

[Chart: quarterly actual vs. budgeted cost, 1988–1994; over-runs of up to 40–50% in 1988 shrink toward zero by 1994. Haley (1995).]

OMRON's Reuse Gains

[Chart: program size in steps (0–3,000) and percent of reuse (0–100%), 1987–1994. Sakamoto et al. (1996).]

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." — Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in the CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: huge benefits, large benefits, good benefits, okay benefits, small benefits... now what? Ultimately we run out of projects with benefits significant enough to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded … testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

Jackson, D. (2009); Spinellis, D. (2006)

"… a failure to satisfy a non-functional requirement can be critical, even catastrophic … non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability … The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Quality measurements, with example detected violations:

• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Modern Apps are a Technology Stack

[Diagram: data flow, transaction risk, and architectural compliance traced across a stack of technologies: JSP, ASP.NET, APIs, Java/J2EE, Web Services, Hibernate, Spring, Struts, .NET, EJB, PL/SQL, T-SQL, COBOL, messaging, and databases such as Oracle, SQL Server, DB2, Sybase, and IMS.]

Level 1, Unit (developer level; single language/technology layer): code style & layout; expression complexity; code documentation; class or program design; basic coding standards.

Level 2, Technology (development team level): intra-technology architecture; intra-layer dependencies; inter-program invocation; security vulnerabilities.

Level 3, System (IT organization level): integration quality; architectural compliance; risk propagation; application security; resiliency checks; transaction integrity; function points; effort estimation; data access control; SDK versioning; calibration across technologies.

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct. They account for only 8% of total application defects (code unit-level violations account for 92%) but 52% of total repair effort (unit-level violations account for 48%).

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated: it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
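The idea behind a propagated-risk ranking can be sketched as a reverse dependency traversal: weight each violation's severity by how many components transitively depend on the component that holds it. This is an illustrative sketch, not CAST's actual algorithm; the component graph echoes the slide's two-violation example.

```python
from collections import deque

def propagated_risk_index(violations, depends_on):
    """Rank violation IDs by severity x breadth of impact, where breadth
    is the number of components that transitively depend on the
    component holding the violation. Illustrative sketch only."""
    # Invert the dependency edges: who is impacted by whom.
    impacted_by = {}
    for comp, deps in depends_on.items():
        for dep in deps:
            impacted_by.setdefault(dep, set()).add(comp)

    def breadth(component):
        seen, queue = set(), deque([component])
        while queue:  # breadth-first walk over dependents
            for nxt in impacted_by.get(queue.popleft(), ()):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return len(seen)

    scores = {vid: sev * breadth(comp)
              for vid, (comp, sev) in violations.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Component B sits deep in the graph, so its violation propagates widely.
deps = {"UI": ["Logic"], "Logic": ["A", "B"], "A": ["B"], "Batch": ["B"]}
violations = {"284": ("B", 7), "342": ("A", 7)}
print(propagated_risk_index(violations, deps))  # ['284', '342']
```

Even at equal severity, violation 284 outranks 342 because four components are exposed to it rather than two.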

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. A transaction runs from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer; its risk is determined by the number and severity of violations along that path. Transactions can be ranked by risk to prioritize their remediation (in the example, transaction 1 of five poses the greatest risk), and the index can be further tuned using the operational frequency of each transaction.
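A minimal sketch of the transaction-risk ranking described above: summed violation severities along each path, optionally weighted by operational frequency. The transaction names, paths, and severities are hypothetical.

```python
def transaction_risk_index(transactions, weight=None):
    """Rank transactions by summed violation severity along their path,
    optionally weighted by operational frequency. Illustrative sketch of
    the index described on the slide, not a vendor algorithm."""
    weight = weight or {}
    scores = {
        name: sum(sev for _component, sev in path) * weight.get(name, 1.0)
        for name, path in transactions.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# Each path runs UI -> business logic -> data storage; severities 1-9.
txns = {
    "checkout": [("JSP", 4), ("EJB", 9), ("PLSQL", 6)],
    "login":    [("JSP", 2), ("EJB", 3)],
    "report":   [("JSP", 1), ("COBOL", 5)],
}
print(transaction_risk_index(txns))  # ['checkout', 'report', 'login']

# Tuning by frequency can promote a hot but mid-risk transaction:
print(transaction_risk_index(txns, weight={"login": 10.0}))
```

The frequency weight is how the slide's "further tuned" step enters: a low-severity path executed constantly can outrank a riskier path that rarely runs.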

Risk-Based Prioritization

Violations are prioritized by health factor and severity, then refined by the Propagated Risk Index, membership in a high-risk transaction, and frequency of path execution:

Violation | Health factor   | Severity
   148    | Security        |    9
   371    | Robustness      |    9
   062    | Performance     |    8
   284    | Robustness      |    7
   016    | Changeability   |    7
   241    | Security        |    5
   173    | Transferability |    4
   359    | Transferability |    3

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents vs. Technical Quality Index (100–400) for business-critical applications at a large international investment bank; annual losses fall from over $2,000,000 at low TQI to $50,000 and then $5,000 as TQI rises.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com). A dash marks a cell not shown on the slide:

Factor                     | # apps | Robust | Security | Perform | Change | Transfer
Technology (e.g., Java-EE) |  1606  |   20   |    3     |   16    |   26   |    7
Industry                   |   677  |    5   |    5     |    7    |    4   |    6
App type                   |   510  |    1   |    1     |    1    |    1   |    1
Source                     |   580  |    1   |    1     |    –    |    –   |    4
Shore                      |   540  |    –   |    –     |    –    |    –   |    –
Maturity                   |    72  |   28   |   25     |   15    |   24   |   12
Method                     |   208  |   10   |    –     |    6    |   14   |    6
Team size                  |   186  |    7   |    –     |    5    |    8   |    8
# of users                 |   262  |    4   |    –     |    3    |    5   |    4

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code constitute the debt, which carries a business-risk liability and an opportunity cost.

• Principal — the cost of fixing problems remaining in the code after release that must be remediated.
• Interest — continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Liability — business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost — benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
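Curtis et al. (2012) estimate the principal by multiplying the violations an organization would realistically fix by the hours to fix one and a labor rate. A sketch of that calculation; the fix fractions, hours per fix, violation counts, and $75/hour rate below are illustrative placeholders meant to be calibrated per organization:

```python
def technical_debt_principal(violations, hours_to_fix, cost_per_hour=75.0,
                             fix_fraction=None):
    """Principal ~ sum over severity bands of:
    (violations expected to be fixed) x (hours per fix) x (labor rate).
    All parameter values here are illustrative, to be calibrated locally."""
    if fix_fraction is None:
        fix_fraction = {"high": 0.5, "medium": 0.25, "low": 0.1}
    return sum(count * fix_fraction[band] * hours_to_fix[band] * cost_per_hour
               for band, count in violations.items())

debt = technical_debt_principal(
    violations={"high": 120, "medium": 900, "low": 4000},  # from static analysis
    hours_to_fix={"high": 2.5, "medium": 1.0, "low": 0.5},
)
print(f"${debt:,.0f}")  # -> $43,125
```

Weighting by fix fraction reflects the metaphor's point: the principal is what you would actually pay down, not the cost of fixing every violation ever detected.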

Tech Debt by Quality Factor

[Chart: technical debt by health factor — Transferability 40%, Changeability 30%, Robustness 18%, Security 7%.]

70% of technical debt is in IT cost (Transferability, Changeability); 30% of technical debt is in business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations Through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI: Benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 2 (continued)

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 48: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Section 3 Section 3 Measuring Software

Improvement Programs

1) Improvement program methods

2) Maturity-based measurement

3) Improvement program results

61

Click to edit Master title style CMMI Maturity Transitions

Level 5 Optimizing

Innovation management

Level 4 Quantitatively

Managed

Capability management

Level 3 Defined

Process management

Level 2 Managed

Project management

Level 1 Initial

Inconsistent management

Chrisis Konrad amp Shrum (2005) CMMI

Click to edit Master title style Enhanced Maturity Framework

Level 5 Innovating

Innovation management

Level 4 Optimized

Capability management

Level 3 Standardized

Organizational management

Level 2 Stabilized

Work unit management

Level 1 Initial

Inconsistent management

Click to edit Master title style Maturity Level Transformation

Level 5 Opportunistic improvements

Empowered culture

Level 5 Proactive

improvements Agile culture

Organization

Projects

Individual

Level 4 Process and results

managed statistically Precision culture

Level 3 Organization develops

standard processes Common culture

Level 2 Project managers stabilize projects

Local culture

Level 1 Ad Hoc inconsistent

development practices Relationship culture

Trust

Click to edit Master title style Measurement by Level

Level 5 mdash Exploratory data ∆ t = N x ROI

Level 4 mdash Predictive data X plusmn (196σ radicn )

Level 3 mdash Process data 83 defects per review

Level 2 mdash Management data Effort = α LOCβ

Level 1 mdash Haphazard data ldquoabout a million linesrdquo

Measure-maturity mismatch slows improvement and creates

dysfunction

Click to edit Master title style Average Improvement Results

Productivity growth 4 35 9 - 67

Pre-test defect detection 3 22 6 - 25

Time to market 2 darr 19 15 - 23

Field error reports 5 darr 39 10 - 94

Return on investment 5 501 41 - 881

Annual Annual Improvement benefit Orgs median range

Click to edit Master title style Schedule Performance

0

02

04

06

08

1

12

14

16

1 2 3

Schedule performance

index

CMM maturity level

Budgeted cost of work performed

Budgeted cost of work scheduled

behind ahead

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Cost Performance

Cost performance

index

0

05

1

15

2

25

1 2 3 CMM Maturity Level

Budgeted cost of work performed Actual cost of

work performed

overrun underrun

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Crosbyrsquos Cost of Quality Model Project Cost

Performance Cost of Quality Generating plans documentation Development - requirements - design - code - integration Prevention Appraisal

Reviews - system - requirements - design - test plan - test scripts Walkthroughs Testing (first time) Independent Verification and Validation (first time) Audits

Training Policies Procedures and methods Tools Planning Quality Improvement Projects Data Gathering and Analysis Root Cause Analysis Quality Reporting

Nonconformance Conformance Fixing Defects Reworking any Document Updating Source Code Re-reviews Re-tests Lab costs for re-tests Patches(Internal and External) Engineering Changes Change Control Boards External Failures and fines Customer Support Help Desks

Dion (1993) Crosby (1979) Quality Is Free

Click to edit Master title style Raytheons Cost of Quality

bull Performance mdash cost of building it right first time bull Nonconformance mdash cost of rework bull Appraisal mdash cost of testing bull Prevention mdash cost of preventing nonconformance

Dion (1993) Haley (1995)

23

18

Year Level Perform Nonconf Appraise Prevent

1988 1 34 41 15 7

1990 2 55 18 15 12

1992 3 66 11

1994 4 76 6 75

Click to edit Master title style Raytheonrsquos Productivity

0

1

2

3

4

1988 1989 1990 1991 1992 1993 1994

Growth relative to 1989 baseline

Lydon (1995)

Click to edit Master title style Raytheonrsquos Cost Reduction

0

02

04

06

08

1

12

1988 1989 1990 1991 1992 1993 1994

Reduction relative to 1989 baseline

Lydon (1995)

Click to edit Master title style

Haley (1995)

1993 1992 1991 1990 1989 1988 1994

50

40

30

20

10

0

-10

Actual cost

Budgeted cost

over-run under-run

4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1

Raytheonrsquos Cost Predictability

Click to edit Master title style OMRONrsquos Reuse gains

Sakamoto et al (1996)

0

5 0 0

1 0 0 0

1 5 0 0

2 0 0 0

2 5 0 0

3 0 0 0

1 9 8 7 1 9 8 8 1 9 8 9 1 9 9 0 1 9 9 1 1 9 9 2 1 9 9 3 1 9 9 4

Steps

100

75

50

25

0

of reuse

Click to edit Master title style Section 4 Section 4 Measuring Quality and

Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

89

Click to edit Master title style What about the Product CMMIrsquos primary assumption

ldquoThe quality of a software system is governed by quality of the process used to develop itrdquo

- Watts Humphrey

CMMIrsquos product quality problem

1 Assessments do not verify product quality

2 Compliance focus undermines CMMIrsquos primary assumption

3 Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI

4 The assumptions underlying control charts are violated by software data

Click to edit Master title style Six Sigmarsquos Challenge

Process focus ndash process improvement ndash Six Sigma Product focus ndash product improvement ndash Design for 6σ

Product focus supplements product focus to unlock even more value from software

Six Sigma projects must have significant benefits Huge benefits

Large benefits

Good benefits

Okay benefits

Small benefits

Now what

Ultimately we run out of projects with significant enough benefits to continue the programhellip

so what is the solution that allows continual improvement

Click to edit Master title style Testing is Not Enough ldquoAs higher levels of assurance are demandedhelliptesting cannot deliver the level of confidence required at a reasonable costrdquo

ldquoThe correctness of the code is rarely the weakest linkrdquo

Jackson D (2009) Spinellis D (2006)

ldquohellipa failure to satisfy a non-functional requirement can be critical even catastrophichellipnon-functional requirements are sometimes difficult to verify We cannot write a test case to verify a systemrsquos reliabilityhellipThe ability to associate code to non-functional properties can be a powerful weapon in a software engineerrsquos arsenalrdquo

Click to edit Master title style CASTrsquos Application Intelligence Platform

Application Analysis

Evaluation of 1200+ coding amp

architectural rules

Application meta-data

Transferability

Changeability

Robustness

Performance

Security

Quality Measurements

Detected Violations

Expensive operation in loop Static vs pooled connections Complex query on big table Large indices on big table

Empty CATCH block Uncontrolled data access Poor memory management Opened resource not closed

SQL injection Cross-site scripting Buffer overflow Uncontrolled format string

Unstructured code Misuse of inheritance Lack of comments Violated naming convention

Highly coupled component Duplicated code Index modified in loop High cyclomatic complexity

Language Parsers

Oracle PLSQL Sybase T-SQL SQL Server T-SQL IBM SQLPSM C C++ C Pro C Cobol CICS Visual Basic VBNet ASPNet Java J2EE JSP XML HTML Javascript VBScript PHP PowerBuilder Oracle Forms PeopleSoft SAP ABAP Netweaver Tibco Business Objects Universal Analyzer for other languages

Click to edit Master title style Modern Apps are a Technology Stack Ar

chite

ctur

al C

ompl

ianc

e

Data Flow Transaction Risk

EJB PLSQL

Oracle

SQL Server

DB2

TSQL

Hibernate

Spring

Struts NET

COBOL

Sybase IMS

Messaging

Integration quality Architectural

compliance Risk propagation Application security Resiliency checks Transaction integrity

Function point Effort estimation Data access control SDK versioning Calibration across

technologies IT organization level

System Level 3

Code style amp layout Expression complexity Code documentation Class or program design Basic coding standards Developer level

Unit Level 1

Java Java Java

Java Java

Web Services Java Java Single languagetechnology layer

Intra-technology architecture Intra-layer dependencies Inter-program invocation Security vulnerabilities Development team level

Technology Level 2

JSP ASPNET APIs

Click to edit Master title style Architecturally Complex Defects

Primary cause of operational

problems

20x as many

fixes to correct

Architecturally Complex Defect

A structural flaw involving interactions among multiple components that reside in different application layers

48

52 92

8 Architecturally

Complex Defects

Code unit-level violations

of total app defects

of total repair effort

Most architecturally complex defects involve an architectural hotspot ndash a badly-built component that should be replaced

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system – it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated – it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
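The index can be sketched as a graph computation: a violation's severity scaled by how many components its host can reach through the dependency graph. The component names, edges, and weighting below are illustrative assumptions, not CAST's actual algorithm:

```python
# Hedged sketch of a Propagated Risk Index: severity x breadth of impact,
# where breadth = number of components reachable in the dependency graph.
from collections import deque

def reach(graph: dict, start: str) -> int:
    """Count components reachable from `start` (breadth of impact)."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return len(seen) - 1  # exclude the component itself

# Illustrative dependency graph: B feeds many components, A feeds few.
deps = {"A": ["X"], "B": ["X", "Y", "Z"], "X": ["Y"], "Y": [], "Z": ["Q"], "Q": []}

def pri(severity: int, component: str) -> int:
    return severity * reach(deps, component)

print(pri(7, "B"), pri(7, "A"))  # a violation in B outranks the same one in A
```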

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path: entry/exit in the user interface layer, processing in the business logic layer, and fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation (in the figure, transaction 1 poses the greatest risk). The Transaction Risk Index can be further tuned using the operational frequency of each transaction.
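A minimal sketch of this ranking, assuming illustrative paths, severities, and frequencies (none of these values come from the slides):

```python
# Hedged sketch of a Transaction Risk Index: sum of violation severities
# along each transaction's path, weighted by operational frequency.
def transaction_risk(path_violations, frequency=1.0):
    return frequency * sum(path_violations)

transactions = {
    1: ([9, 8, 7], 0.9),  # (severities along the path, relative frequency)
    2: ([5, 4], 0.9),
    3: ([7, 3], 0.3),
}

ranked = sorted(transactions,
                key=lambda t: transaction_risk(*transactions[t]),
                reverse=True)
print(ranked)  # highest-risk transaction first
```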

Risk-Based Prioritization

Violations are prioritized by health factor and severity, then weighted by their Propagated Risk Index, whether they lie in a high-risk transaction, and the frequency of path execution:

Violation | Health Factor | Severity
148 | Security | 9
371 | Robustness | 9
062 | Performance | 8
284 | Robustness | 7
016 | Changeability | 7
241 | Security | 5
173 | Transferability | 4
359 | Transferability | 3
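Ordering by these criteria is a multi-key sort. The severities below come from the table; the propagated-risk, transaction, and frequency values are invented for illustration:

```python
# Hedged sketch: order violations by severity, then propagated risk,
# then presence in a high-risk transaction, then path-execution frequency.
violations = [
    # (id, health_factor, severity, propagated_risk, in_hot_transaction, path_freq)
    ("148", "Security",    9, 40, True,  0.8),
    ("371", "Robustness",  9, 25, True,  0.5),
    ("062", "Performance", 8, 10, False, 0.9),
    ("284", "Robustness",  7, 60, True,  0.4),
]

ranked = sorted(violations, key=lambda v: (v[2], v[3], v[4], v[5]), reverse=True)
print([v[0] for v in ranked])  # remediation order, most urgent first
```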

Static Analysis in US DoD

John Keane (2013):
• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

In a study of structural quality measures and maintenance effort across 20 customers of a large global system integrator, a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents plotted against Technical Quality Index (100–400) for business-critical applications at a large international investment bank. Low-scoring applications suffered over $2,000,000 in annual losses; higher-scoring applications lost $50,000 and $5,000 annually.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com). Blank cells: no significant effect reported.

Factor | # apps | Robustness | Security | Performance | Changeability | Transferability
Technology (e.g., Java-EE) | 1,606 | 20 | 3 | 16 | 26 | 7
Industry | 677 | 5 | 5 | 7 | 4 | 6
App type | 510 | 1 | 1 | 1 | 1 | 1
Source | 580 | 1 | 1 | 4 | |
Shore | 540 | | | | |
Maturity | 72 | 28 | 25 | 15 | 24 | 12
Method | 208 | 10 | 6 | 14 | 6 |
Team size | 186 | 7 | 5 | 8 | 8 |
# of users | 262 | 4 | 3 | 5 | 4 |

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Structural quality problems in production code constitute the debt:
• Principal – the cost of fixing problems remaining in the code after release that must be remediated.
• Interest – continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

The debt also creates business risk:
• Liability – business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost – benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software.
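The principal can be estimated in the spirit of Curtis et al. (2012): count the violations judged worth remediating, multiply by hours to fix and a labor rate. The percentages, hours, and rate below are illustrative assumptions, not the paper's calibrated values:

```python
# Hedged sketch of a technical-debt *principal* estimate:
# violations kept after triage x hours per fix x labor rate.
def debt_principal(violations_by_severity, pct_to_fix, hours_per_fix, rate_per_hour=75):
    total = 0.0
    for sev, count in violations_by_severity.items():
        total += count * pct_to_fix[sev] * hours_per_fix[sev] * rate_per_hour
    return total

violations = {"high": 100, "medium": 400, "low": 1500}   # counts from analysis
pct_to_fix = {"high": 0.5, "medium": 0.25, "low": 0.1}   # share worth remediating
hours      = {"high": 2.0, "medium": 1.0, "low": 0.5}    # assumed effort per fix

print(debt_principal(violations, pct_to_fix, hours))  # dollars of principal
```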

Tech Debt by Quality Factor

[Pie chart: Transferability 40%, Changeability 30%, Robustness 18%, Security 7% (Performance: remainder).]

70% of technical debt is in IT cost (Transferability, Changeability); 30% of technical debt is in business risk (Robustness, Performance, Security). Health Factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software.

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models – enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

• Best Practices for Software Process and Product Measurement
• Instructor: Dr. Bill Curtis
• Agenda
• Section 1
• Why Does Measurement Languish?
• Measurement Process Guidance
• Measurement Categories
• Productivity Analysis Objectives
• Productivity Analysis Measures
• Software Productivity
• Software Size Measures
• How Many Lines of Code?
• Function Point Estimation
• Effort – Weakest Measure
• How Quality Affects Productivity
• Carry-forward Rework
• Example of Quality Impact
• Quality-Adjusted Productivity
• Best Practices for Analysis
• 1. Evaluate Demographics
• 2. Segment Baselines
• 3. Beware Sampling Effects
• 4. Understand Variation
• 5. Investigate Distributions
• 6. Account for Maturity Effects
• 7. Caution for External Data Sets
• Slide Number 27
• Measuring Project Progress
• Requirements Change Impact
• Phase Defect Removal Model
• SCRUM Board
• Burndown Chart and Velocity
• Analyzing Velocity
• Measuring and Managing Flow
• Trends Over Sprints
• Sources and Measures
• Recommended Books
• Personal Software Process (PSP)
• Team Software Process (TSP)
• Aggregated Personal Data Best
• Defect Detection at Intuit
• TSP Benefits
• Section 3
• CMMI Maturity Transitions
• Enhanced Maturity Framework
• Maturity Level Transformation
• Measurement by Level
• Average Improvement Results
• Schedule Performance
• Cost Performance
• Crosby's Cost of Quality Model
• Raytheon's Cost of Quality
• Raytheon's Productivity
• Raytheon's Cost Reduction
• Raytheon's Cost Predictability
• OMRON's Reuse Gains
• Section 4
• What about the Product?
• Six Sigma's Challenge
• Testing is Not Enough
• CAST's Application Intelligence Platform
• Modern Apps are a Technology Stack
• Architecturally Complex Defects
• Propagated Risk
• Transaction Risk
• Risk-Based Prioritization
• Static Analysis in US DoD
• Reducing Operational Incidents & Costs
• Reducing Operational Losses
• Slide Number 70
• The Technical Debt Metaphor
• Tech Debt by Quality Factor
• Join CISQ: www.it-cisq.org
• Section
• References 1
• References 2
• References 3
CMMI Maturity Transitions

Level 5: Optimizing – innovation management
Level 4: Quantitatively Managed – capability management
Level 3: Defined – process management
Level 2: Managed – project management
Level 1: Initial – inconsistent management

Chrissis, Konrad & Shrum (2005), CMMI.

Enhanced Maturity Framework

Level 5: Innovating – innovation management
Level 4: Optimized – capability management
Level 3: Standardized – organizational management
Level 2: Stabilized – work unit management
Level 1: Initial – inconsistent management

Maturity Level Transformation

Level 1: Ad hoc, inconsistent development practices – relationship culture (trust)
Level 2: Project managers stabilize projects – local culture
Level 3: Organization develops standard processes – common culture
Level 4: Process and results managed statistically – precision culture
Level 5: Proactive improvements – agile culture; opportunistic improvements – empowered culture

The transformation proceeds from the individual, through projects, to the organization.

Measurement by Level

Level 5 – Exploratory data: ∆t = N × ROI
Level 4 – Predictive data: x̄ ± 1.96σ/√n
Level 3 – Process data: 8.3 defects per review
Level 2 – Management data: Effort = α·LOC^β
Level 1 – Haphazard data: "about a million lines"

A measure–maturity mismatch slows improvement and creates dysfunction.
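The Level 2 and Level 4 example measures can be made concrete with a short computation. The calibration constants (α = 3.2, β = 1.05) and the sample statistics are illustrative assumptions, not values from the slides:

```python
# Worked versions of the slide's example measures.
import math

def effort(loc, a=3.2, b=1.05):
    """Level 2 management data: power-law effort model, Effort = a * KLOC**b.
    a and b are assumed calibration values (person-months per KLOC-scale)."""
    return a * (loc / 1000) ** b

def ci95(mean, sigma, n):
    """Level 4 predictive data: 95% confidence interval, mean +/- 1.96*sigma/sqrt(n)."""
    half = 1.96 * sigma / math.sqrt(n)
    return (mean - half, mean + half)

print(round(effort(50_000), 1))           # estimated effort for a 50 KLOC system
lo, hi = ci95(mean=8.3, sigma=2.0, n=25)  # e.g., defects per review across 25 reviews
print(round(lo, 2), round(hi, 2))
```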

Average Improvement Results

Improvement | Orgs | Annual median | Annual range
Productivity growth | 4 | 35% | 9–67%
Pre-test defect detection | 3 | 22% | 6–25%
Time to market | 2 | ↓19% | 15–23%
Field error reports | 5 | ↓39% | 10–94%
Return on investment | 5 | 5.0:1 | 4.1:1–8.8:1

Schedule Performance

[Chart: schedule performance index – budgeted cost of work performed ÷ budgeted cost of work scheduled (below 1.0 = behind schedule, above = ahead) – for CMM maturity levels 1–3, from an Air Force study of contractors.]

Flowe & Thordahl (1994).

Cost Performance

[Chart: cost performance index – budgeted cost of work performed ÷ actual cost of work performed (below 1.0 = overrun, above = underrun) – for CMM maturity levels 1–3, from an Air Force study of contractors.]

Flowe & Thordahl (1994).
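Both indexes are standard earned-value measures. A minimal sketch, with project figures invented for illustration (not the study's data):

```python
# Earned-value indexes used in the Flowe & Thordahl charts.
def spi(bcwp, bcws):
    """Schedule performance index: >1 ahead of schedule, <1 behind."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """Cost performance index: >1 under budget, <1 over budget."""
    return bcwp / acwp

# A project that earned $900k of planned work against a $1.0M schedule
# baseline while actually spending $1.2M:
print(round(spi(900_000, 1_000_000), 2), round(cpi(900_000, 1_200_000), 2))
```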

Crosby's Cost of Quality Model

Project cost divides into the cost of performance and the cost of quality; the cost of quality comprises conformance costs (appraisal, prevention) and nonconformance costs:

• Performance: generating plans and documentation; development – requirements, design, code, integration.
• Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits.
• Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.
• Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Crosby (1979), Quality Is Free; Dion (1993).
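Given effort records tagged with these categories, the cost-of-quality share falls out of a simple aggregation. The category labels follow the model above; the hour figures are invented for illustration:

```python
# Hedged sketch: aggregate effort records into Crosby's categories and
# report the fraction of project cost that is cost of quality.
from collections import defaultdict

COQ = {"appraisal", "prevention", "nonconformance"}  # performance = base work

def coq_share(records):
    totals = defaultdict(float)
    for category, hours in records:
        totals[category] += hours
    grand = sum(totals.values())
    return sum(totals[c] for c in COQ) / grand

records = [("performance", 550), ("appraisal", 150),
           ("prevention", 120), ("nonconformance", 180)]
print(round(coq_share(records), 2))  # cost of quality as a fraction of project cost
```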

Raytheon's Cost of Quality

• Performance – cost of building it right the first time
• Nonconformance – cost of rework
• Appraisal – cost of testing
• Prevention – cost of preventing nonconformance

Year | CMM level | Performance | Nonconformance | Appraisal | Prevention
1988 | 1 | 34% | 41% | 15% | 7%
1990 | 2 | 55% | 18% | 15% | 12%
1992 | 3 | 66% | 11% | 23% combined
1994 | 4 | 76% | 6% | 18% combined (appraisal 7.5%)

Dion (1993); Haley (1995).

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, plotted 1988–1994 (scale 0–4).]

Lydon (1995).

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, plotted 1988–1994 (scale 0–1.2).]

Lydon (1995).

Raytheon's Cost Predictability

[Chart: percent over-run/under-run of actual cost versus budgeted cost, by quarter from 1988 through 1994 (scale −10% to 50%): early over-runs of up to 40% converge toward budget.]

Haley (1995).

OMRON's Reuse Gains

[Chart: program size in steps (0–3,000) and percentage of reuse (0–100%), 1987–1994.]

Sakamoto et al. (1996).

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." – Watts Humphrey

CMMI's product quality problem:
1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma. Product focus – product improvement – Design for Six Sigma. Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits. Projects descend from huge benefits, to large, good, okay, and small benefits – ultimately we run out of projects with significant enough benefits to continue the program. Now what? What is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded ... testing cannot deliver the level of confidence required at a reasonable cost." – Jackson (2009)

"The correctness of the code is rarely the weakest link." – Jackson (2009)

"... a failure to satisfy a non-functional requirement can be critical, even catastrophic ... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability ... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." – Spinellis (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, combined with application meta-data, yields quality measurements across five health factors: Transferability, Changeability, Robustness, Performance, and Security.

Detected violations include:
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table

Empty CATCH block Uncontrolled data access Poor memory management Opened resource not closed

SQL injection Cross-site scripting Buffer overflow Uncontrolled format string

Unstructured code Misuse of inheritance Lack of comments Violated naming convention

Highly coupled component Duplicated code Index modified in loop High cyclomatic complexity

Language Parsers

Oracle PLSQL Sybase T-SQL SQL Server T-SQL IBM SQLPSM C C++ C Pro C Cobol CICS Visual Basic VBNet ASPNet Java J2EE JSP XML HTML Javascript VBScript PHP PowerBuilder Oracle Forms PeopleSoft SAP ABAP Netweaver Tibco Business Objects Universal Analyzer for other languages

Click to edit Master title style Modern Apps are a Technology Stack Ar

chite

ctur

al C

ompl

ianc

e

Data Flow Transaction Risk

EJB PLSQL

Oracle

SQL Server

DB2

TSQL

Hibernate

Spring

Struts NET

COBOL

Sybase IMS

Messaging

Integration quality Architectural

compliance Risk propagation Application security Resiliency checks Transaction integrity

Function point Effort estimation Data access control SDK versioning Calibration across

technologies IT organization level

System Level 3

Code style amp layout Expression complexity Code documentation Class or program design Basic coding standards Developer level

Unit Level 1

Java Java Java

Java Java

Web Services Java Java Single languagetechnology layer

Intra-technology architecture Intra-layer dependencies Inter-program invocation Security vulnerabilities Development team level

Technology Level 2

JSP ASPNET APIs

Click to edit Master title style Architecturally Complex Defects

Primary cause of operational

problems

20x as many

fixes to correct

Architecturally Complex Defect

A structural flaw involving interactions among multiple components that reside in different application layers

48

52 92

8 Architecturally

Complex Defects

Code unit-level violations

of total app defects

of total repair effort

Most architecturally complex defects involve an architectural hotspot ndash a badly-built component that should be replaced

Click to edit Master title style Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on the its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 50: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Enhanced Maturity Framework

Level 5 Innovating

Innovation management

Level 4 Optimized

Capability management

Level 3 Standardized

Organizational management

Level 2 Stabilized

Work unit management

Level 1 Initial

Inconsistent management

Click to edit Master title style Maturity Level Transformation

Level 5 Opportunistic improvements

Empowered culture

Level 5 Proactive

improvements Agile culture

Organization

Projects

Individual

Level 4 Process and results

managed statistically Precision culture

Level 3 Organization develops

standard processes Common culture

Level 2 Project managers stabilize projects

Local culture

Level 1 Ad Hoc inconsistent

development practices Relationship culture

Trust

Measurement by Level

Level 5: Exploratory data, e.g., ∆t = N × ROI
Level 4: Predictive data, e.g., X̄ ± 1.96σ/√n
Level 3: Process data, e.g., 8.3 defects per review
Level 2: Management data, e.g., Effort = α × LOC^β
Level 1: Haphazard data, e.g., "about a million lines"

A mismatch between measures and maturity slows improvement and creates dysfunction.
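The Level 2 and Level 4 entries are formulas rather than single numbers. A minimal sketch of both, with illustrative, uncalibrated values for α and β and made-up sample statistics:

```python
import math

# Level 2 management data: parametric estimate Effort = alpha * LOC^beta.
# alpha and beta below are illustrative placeholders, not calibrated values.
def estimate_effort(loc, alpha=3.2, beta=1.05):
    return alpha * (loc / 1000.0) ** beta  # person-months, KLOC-style model

# Level 4 predictive data: 95% confidence interval X-bar +/- 1.96 * sigma / sqrt(n).
def ci95(mean, sigma, n):
    half_width = 1.96 * sigma / math.sqrt(n)
    return (mean - half_width, mean + half_width)

effort = estimate_effort(50_000)              # estimate for a 50 KLOC system
low, high = ci95(mean=8.3, sigma=2.0, n=25)   # interval around 8.3 defects/review
```

An organization would fit α and β to its own project history before trusting such an estimate.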

Average Improvement Results

Improvement                 # Orgs   Annual median   Annual range
Productivity growth            4          35%           9-67%
Pre-test defect detection      3          22%           6-25%
Time to market                 2         ↓19%          15-23%
Field error reports            5         ↓39%          10-94%
Return on investment           5        5.0:1      4.1:1 - 8.8:1

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed ÷ budgeted cost of work scheduled; below 1.0 is behind, above is ahead) plotted against CMM maturity levels 1-3, from an Air Force study of contractors. Flowe & Thordahl (1994)]

Cost Performance

[Chart: cost performance index (budgeted cost of work performed ÷ actual cost of work performed; below 1.0 is an overrun, above is an underrun) plotted against CMM maturity levels 1-3, from an Air Force study of contractors. Flowe & Thordahl (1994)]

Crosby's Cost of Quality Model

Project cost splits into performance cost and cost of quality; cost of quality splits into conformance (appraisal and prevention) and nonconformance.

Performance: generating plans and documentation; development (requirements, design, code, integration).
Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; first-time testing; first-time independent verification and validation; audits.
Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.
Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Crosby (1979), Quality Is Free; Dion (1993)
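Crosby's breakdown is straightforward to operationalize: bucket every cost item into one of the four categories, and the cost of quality is everything except performance. A small sketch with invented cost figures:

```python
# Hypothetical project cost ledger, bucketed per Crosby's model (all figures invented).
costs = {
    "performance":    {"development": 600, "plans_and_documentation": 100},
    "appraisal":      {"reviews": 80, "first_time_testing": 120},
    "prevention":     {"training": 30, "root_cause_analysis": 20},
    "nonconformance": {"fixing_defects": 90, "re_tests": 60},
}

# Total spend per Crosby category.
totals = {bucket: sum(items.values()) for bucket, items in costs.items()}

# Cost of quality = conformance (appraisal + prevention) + nonconformance.
cost_of_quality = totals["appraisal"] + totals["prevention"] + totals["nonconformance"]
project_cost = totals["performance"] + cost_of_quality
coq_share = cost_of_quality / project_cost  # fraction of project cost spent on quality
```

Tracking `coq_share` over releases is what makes the Raytheon trend on the next slide measurable.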

Raytheon's Cost of Quality

• Performance: cost of building it right the first time
• Nonconformance: cost of rework
• Appraisal: cost of testing
• Prevention: cost of preventing nonconformance

Year   Level   Performance   Nonconformance   Appraisal   Prevention
1988     1         34%            41%            15%          7%
1990     2         55%            18%            15%         12%
1992     3         66%            11%
1994     4         76%             6%             7%          5%

Cost of quality (nonconformance + appraisal + prevention) fell to 23% by 1992 and 18% by 1994.

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, plotted yearly from 1988 to 1994 on a 0-4× scale. Lydon (1995)]

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, plotted yearly from 1988 to 1994. Lydon (1995)]

Raytheon's Cost Predictability

[Chart: actual cost versus budgeted cost (percent over-run/under-run, from -10% to 50%) by quarter, 1988 through 1994. Haley (1995)]

OMRON's Reuse Gains

[Chart: program size in steps (0-3,000) and percentage of reuse (0-100%), plotted yearly from 1987 to 1994. Sakamoto et al. (1996)]

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." (Watts Humphrey)

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 of the CMM was lost in quantitative process control in CMMI.
4. Software data violate the assumptions underlying control charts.

Six Sigma's Challenge

Process focus: process improvement (Six Sigma). Product focus: product improvement (Design for Six Sigma). A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first come the projects with huge benefits, then large, good, okay, and small benefits. Now what? Ultimately we run out of projects with benefits significant enough to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded… testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." Jackson, D. (2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic… non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability… The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality measurements (health factors): Transferability, Changeability, Robustness, Performance, Security.

Detected violations, for example:
• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, NetWeaver, Tibco, Business Objects, and a Universal Analyzer for other languages.
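Two of the rule violations named above are easy to show concretely. A hedged Python sketch (sqlite3 stands in for any SQL backend; function names are invented) of a SQL-injection violation with its parameterized fix, plus an empty exception handler of the "empty CATCH block" kind:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'ada')")

# Violation ("SQL injection"): user input concatenated into the query text.
def find_user_unsafe(name):
    return conn.execute("SELECT id FROM users WHERE name = '" + name + "'").fetchall()

# Fix: a parameterized query; the driver handles escaping.
def find_user_safe(name):
    return conn.execute("SELECT id FROM users WHERE name = ?", (name,)).fetchall()

# Violation ("empty CATCH block"): the failure is silently swallowed, so a
# caller cannot tell a missing file from an unreadable one.
def read_config(path):
    try:
        with open(path) as f:
            return f.read()
    except OSError:
        pass  # an analyzer flags this empty handler as a robustness defect

leaked = find_user_unsafe("' OR '1'='1")  # the injected clause matches every row
```

The point of rule-based analysis is that both defects are detectable from the code's structure alone, with no test case required.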

Modern Apps are a Technology Stack

Level 1, Unit (developer level): code style and layout, expression complexity, code documentation, class or program design, basic coding standards. Scope: a single language/technology layer.

Level 2, Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3, System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function point and effort estimation, data access control, SDK versioning, calibration across technologies.

The stack of a modern application spans user-interface technologies (JSP, ASP.NET, APIs, web services), frameworks (Hibernate, Spring, Struts, .NET), languages (Java, EJB, COBOL, PL/SQL, T-SQL), databases (Oracle, SQL Server, DB2, Sybase, IMS), and messaging. Data flow, transaction risk, and architectural compliance cut across all of these layers.

Architecturally Complex Defects

An architecturally complex defect is a structural flaw involving interactions among multiple components that reside in different application layers. These defects are the primary cause of operational problems and take 20x as many fixes to correct.

Architecturally complex defects: 8% of total application defects, but 52% of total repair effort.
Code unit-level violations: 92% of total application defects, but 48% of total repair effort.

Most architecturally complex defects involve an architectural hotspot: a badly built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation #284 in component B is very widely propagated throughout the system; it presents very high risk. The impact of violation #342 in component A is only narrowly propagated; it presents low risk. Remediating violation #284 in component B will therefore have the greater impact on reducing risk and cost.
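One plausible way to compute such an index (an illustration, not CAST's actual formula): weight each violation's severity by how many components transitively depend on the component that contains it.

```python
from collections import deque

def impacted_components(callers, start):
    """Count components that transitively depend on `start`
    (breadth-first search over the callers-of graph)."""
    seen, queue = {start}, deque([start])
    while queue:
        for caller in callers.get(queue.popleft(), []):
            if caller not in seen:
                seen.add(caller)
                queue.append(caller)
    return len(seen) - 1  # exclude the starting component itself

def propagated_risk_index(severity, callers, component):
    # Illustrative scoring: severity weighted by breadth of impact.
    return severity * impacted_components(callers, component)

# callers["B"] lists the components that call/depend on B (hypothetical system).
callers = {"B": ["C", "D"], "C": ["E", "F"], "A": ["G"]}
pri_wide = propagated_risk_index(7, callers, "B")    # B impacts C, D, E, F
pri_narrow = propagated_risk_index(7, callers, "A")  # A impacts only G
```

Two violations of equal severity thus rank very differently once impact breadth is factored in.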

Transaction Risk

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path: entry/exit in the user-interface layer, processing in the business-logic layer, and fulfillment in the data-storage layer.

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path. Transactions can be ranked by risk to prioritize their remediation, and the index can be further tuned using the operational frequency of each transaction. In the slide's example of five transactions, transaction 1 poses the greatest risk.
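A minimal sketch of such an index (illustrative, not a standardized definition): sum the severities of violations on every component along a transaction's path, optionally weighted by how often the transaction runs.

```python
def transaction_risk(path, violations, frequency=1.0):
    """Sum violation severities over every component on the path,
    optionally weighted by the transaction's operational frequency."""
    return frequency * sum(sev for comp in path for sev in violations.get(comp, []))

# Hypothetical severities (1-9) per component of an invented system.
violations = {"ui_form": [3], "order_logic": [9, 7], "order_dao": [], "orders_db": [5]}

risks = {
    "checkout":     transaction_risk(["ui_form", "order_logic", "orders_db"], violations),
    "order_lookup": transaction_risk(["ui_form", "order_dao", "orders_db"], violations),
}
ranked = sorted(risks, key=risks.get, reverse=True)  # remediate the riskiest first
```

Passing real execution counts as `frequency` is what the slide means by tuning the index operationally.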

Risk-Based Prioritization

Violations are prioritized by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
  #148      Security             9
  #371      Robustness           9
  #062      Performance          8
  #284      Robustness           7
  #016      Changeability        7
  #241      Security             5
  #173      Transferability      4
  #359      Transferability      3
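This prioritization is, in effect, a multi-key sort. A sketch with hypothetical tie-breaking values (severity first, then propagated risk, then whether the violation sits on a high-risk transaction path; the `pri` and `on_hot_path` values are invented):

```python
# Hypothetical violation records; only the ids, factors, and severities
# come from the slide's table.
violations = [
    {"id": 148, "factor": "Security",        "severity": 9, "pri": 8, "on_hot_path": True},
    {"id": 371, "factor": "Robustness",      "severity": 9, "pri": 6, "on_hot_path": False},
    {"id": 284, "factor": "Robustness",      "severity": 7, "pri": 9, "on_hot_path": True},
    {"id": 173, "factor": "Transferability", "severity": 4, "pri": 2, "on_hot_path": False},
]

# Sort descending by (severity, propagated risk, on a high-risk transaction).
ranked = sorted(
    violations,
    key=lambda v: (v["severity"], v["pri"], v["on_hot_path"]),
    reverse=True,
)
work_queue = [v["id"] for v in ranked]
```

The tuple key makes the secondary criteria break ties only when the primary ones are equal.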

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

In a study of structural quality measures and maintenance effort across 20 customers of a large global system integrator, a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents plotted against the Technical Quality Index for business-critical applications at a large international investment bank; annual losses fall from over $2,000,000 for the lowest-scoring applications to $50,000 and then $5,000 as the index rises.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor                  # apps   Robust   Security   Perform   Change   Transfer
Technology (e.g. Java-EE) 1606     20%       3%        16%       26%       7%
Industry                   677      5%       5%         7%        4%       6%
App type                   510      1%       1%         1%        1%       1%
Source                     580      1%       1%                             4%
Shore                      540
Maturity                    72     28%      25%        15%       24%      12%
Method                     208     10%       6%        14%        6%
Team size                  186      7%       5%         8%        8%
# of users                 262      4%       3%         5%        4%

The Technical Debt Metaphor

Structural quality problems in production code constitute technical debt: the future cost of fixing defects remaining in code at release, a component of the cost of ownership.

Principal: the cost of fixing problems remaining in the code after release that must be remediated.
Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
Liability (business risk): business costs related to outages, breaches, corrupted data, etc.
Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
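A common way to estimate the principal, sketched in the spirit of Curtis et al. (2012) with invented parameter values: for each severity class, multiply the violation count by the share expected to be fixed, the average fix effort, and the labor rate.

```python
def debt_principal(counts, share_to_fix, fix_hours, rate_per_hour):
    """Technical-debt principal in currency units. Inputs are per severity
    class; all figures used below are illustrative, not benchmarks."""
    return sum(
        counts[s] * share_to_fix[s] * fix_hours[s] * rate_per_hour
        for s in counts
    )

counts = {"high": 120, "medium": 480, "low": 2200}          # violations by severity
share_to_fix = {"high": 0.50, "medium": 0.25, "low": 0.10}  # not everything gets fixed
fix_hours = {"high": 2.5, "medium": 1.0, "low": 0.5}        # average effort per fix

principal = debt_principal(counts, share_to_fix, fix_hours, rate_per_hour=75.0)
```

Because the `share_to_fix` and effort parameters dominate the result, published debt figures are only comparable when those assumptions are stated.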

Tech Debt by Quality Factor

Share of technical debt by health factor: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%.

70% of technical debt is in IT cost (Transferability, Changeability); 30% of technical debt is in business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J., & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B., et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M., & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J., & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M., & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D., & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C., & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E., & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D., & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J., & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J., et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B., & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M., & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K., & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A., & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K., & Beedle, M. (2002). Agile Software Development with Scrum. Prentice-Hall.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S., & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort: Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 52: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Measurement by Level

Level 5 — Exploratory data: Δt = N × ROI
Level 4 — Predictive data: X ± (1.96σ / √n)
Level 3 — Process data: 8.3 defects per review
Level 2 — Management data: Effort = α × LOC^β
Level 1 — Haphazard data: "about a million lines"

A measure-maturity mismatch slows improvement and creates dysfunction.
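The Level 2 and Level 4 formulas above can be made concrete in a few lines; a minimal sketch, where α, β, and the sample values are illustrative placeholders rather than calibrated figures:

```python
import math

def effort_estimate(kloc: float, alpha: float = 3.2, beta: float = 1.05) -> float:
    # Level 2 "management data": a COCOMO-style power law, Effort = alpha * KLOC**beta.
    # alpha and beta here are illustrative, not calibrated values.
    return alpha * kloc ** beta

def confidence_interval_95(mean: float, sigma: float, n: int):
    # Level 4 "predictive data": X +/- 1.96 * sigma / sqrt(n).
    half_width = 1.96 * sigma / math.sqrt(n)
    return (mean - half_width, mean + half_width)

# e.g. defects per review, from a sample of 25 reviews
lo, hi = confidence_interval_95(mean=8.3, sigma=2.0, n=25)
```

Note that the power law is superlinear (β > 1): doubling size more than doubles estimated effort.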

Average Improvement Results

Improvement                  Orgs   Annual median   Annual range
Productivity growth            4        35%            9-67%
Pre-test defect detection      3        22%            6-25%
Time to market                 2       ↓19%           15-23%
Field error reports            5       ↓39%           10-94%
Return on investment           5       5.0:1          4:1-8.8:1

Schedule Performance

[Chart: schedule performance index (0.0-1.6, behind vs. ahead) by CMM maturity level (1-3).
Schedule performance index = budgeted cost of work performed / budgeted cost of work scheduled.]

Air Force study of contractors (Flowe & Thordahl, 1994)

Cost Performance

[Chart: cost performance index (0.0-2.5, overrun vs. underrun) by CMM maturity level (1-3).
Cost performance index = budgeted cost of work performed / actual cost of work performed.]

Air Force study of contractors (Flowe & Thordahl, 1994)
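The two earned-value ratios behind these charts compute directly from the budget figures; a minimal sketch, with invented dollar amounts for illustration:

```python
def schedule_performance_index(bcwp: float, bcws: float) -> float:
    # SPI = BCWP / BCWS; above 1.0 is ahead of schedule, below 1.0 is behind.
    return bcwp / bcws

def cost_performance_index(bcwp: float, acwp: float) -> float:
    # CPI = BCWP / ACWP; above 1.0 is a cost underrun, below 1.0 an overrun.
    return bcwp / acwp

# Hypothetical project snapshot: $450K of planned work done, $500K scheduled,
# $600K actually spent.
spi = schedule_performance_index(bcwp=450_000, bcws=500_000)  # behind schedule
cpi = cost_performance_index(bcwp=450_000, acwp=600_000)      # cost overrun
```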

Crosby's Cost of Quality Model

Project Cost = Performance + Cost of Quality; Cost of Quality = Conformance (Prevention + Appraisal) + Nonconformance.

Performance: generating plans and documentation; development (requirements, design, code, integration).

Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); Independent Verification and Validation (first time); audits.

Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.

Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Dion (1993); Crosby (1979), Quality Is Free
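Crosby's categories lend themselves to a simple roll-up; a sketch with an invented cost ledger (all figures and item names are hypothetical):

```python
# Hypothetical cost ledger (thousands of dollars), bucketed by Crosby's categories.
ledger = {
    "prevention":     {"training": 40, "tools": 60},
    "appraisal":      {"reviews": 120, "first_time_testing": 180},
    "nonconformance": {"fixing_defects": 300, "re_testing": 90, "help_desk": 110},
}

def cost_of_quality(ledger: dict) -> dict:
    # Sum each category, then derive conformance (prevention + appraisal)
    # and the total cost of quality (conformance + nonconformance).
    totals = {category: sum(items.values()) for category, items in ledger.items()}
    totals["conformance"] = totals["prevention"] + totals["appraisal"]
    totals["cost_of_quality"] = totals["conformance"] + totals["nonconformance"]
    return totals

coq = cost_of_quality(ledger)
```

A nonconformance share this large relative to conformance is exactly the pattern the Raytheon data below shows being reversed as maturity rises.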

Raytheon's Cost of Quality

• Performance — cost of building it right the first time
• Nonconformance — cost of rework
• Appraisal — cost of testing
• Prevention — cost of preventing nonconformance

Percent of project cost:

Year   Level   Performance   Nonconformance   Appraisal   Prevention
1988     1         34%             41%            15%          7%
1990     2         55%             18%            15%         12%
1992     3         66%             11%              23% combined
1994     4         76%              6%              18% combined

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to a 1989 baseline (y-axis 0-4x) across 1988-1994.]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to a 1989 baseline (y-axis 0-1.2) across 1988-1994.]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual cost vs. budgeted cost, quarterly from 1988 through 1994, with percent
over-run/under-run on a +50% to -10% scale.]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (0-3,000) and percentage of reuse (0-100%) across 1987-1994.]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product? CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it."
— Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in the CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits:
Huge benefits → Large benefits → Good benefits → Okay benefits → Small benefits → Now what?

Ultimately we run out of projects with significant enough benefits to continue the program…
so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded… testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."
— Jackson, D. (2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic… non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability… The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."
— Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality measurements (health factors) with example detected violations:

Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table.
Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed.
Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string.
Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention.
Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity.

Language parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, plus a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: a transaction's data flow crosses user-interface, logic, and data layers built from many
technologies (JSP, ASP.NET, APIs, Java/J2EE, Web Services, .NET, Struts, Spring, Hibernate, EJB,
COBOL, messaging, PL/SQL, T-SQL, Oracle, SQL Server, Sybase, DB2, IMS), so architectural compliance
and transaction risk must be assessed across the whole stack.]

Level 1 — Unit (developer level): code style & layout, expression complexity, code documentation,
class or program design, basic coding standards.

Level 2 — Technology (development team level; single language/technology layer): intra-technology
architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3 — System (IT organization level): integration quality, architectural compliance, risk
propagation, application security, resiliency checks, transaction integrity, function points,
effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

These defects are the primary cause of operational problems and take 20x as many fixes to correct.

Architecturally complex defects: 8% of total app defects, but 52% of total repair effort.
Code (unit-level) violations: 92% of total app defects, but 48% of total repair effort.

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

[Diagram: violation #342 in Component A vs. violation #284 in Component B.]

The impact of violation #284 in Component B is very widely propagated throughout the system – it presents very high risk. The impact of violation #342 in Component A is only narrowly propagated – it presents low risk. Remediating violation #284 in Component B will therefore have the greater impact on reducing risk and cost.
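The idea behind the index (severity weighted by breadth of propagation) can be sketched in a few lines; the formula and the component counts are illustrative assumptions, not CAST's actual computation:

```python
def propagated_risk_index(severity: int, impacted: int, total_components: int) -> float:
    # Illustrative only: weight a violation's severity by the fraction of the
    # system its impact reaches. Wide propagation at equal severity => more risk.
    return severity * impacted / total_components

# Violation #284 (Component B) propagates widely; #342 (Component A) narrowly.
# Severity 7 for #284 matches the prioritization table later in this section.
pri_284 = propagated_risk_index(severity=7, impacted=80, total_components=100)
pri_342 = propagated_risk_index(severity=7, impacted=5, total_components=100)
```

With equal severities, the widely propagated violation dominates the ranking, which is exactly the slide's point about fixing Component B first.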

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path – from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation, and the index can be further tuned using the operational frequency of each transaction.

[Diagram: five transaction paths through the layers; transaction 1 poses the greatest risk.]
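A minimal sketch of ranking transactions this way, with invented violation severities and an optional frequency weight (not CAST's actual index):

```python
def transaction_risk_index(severities, frequency: float = 1.0) -> float:
    # Illustrative only: total severity of violations along the transaction's
    # path, optionally weighted by how often the transaction executes.
    return frequency * sum(severities)

# Hypothetical severities of the violations on each transaction's path.
paths = {"T1": [9, 8, 7, 7], "T2": [5, 4], "T3": [3]}

# Riskiest transaction first.
ranked = sorted(paths, key=lambda t: transaction_risk_index(paths[t]), reverse=True)
```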

Risk-Based Prioritization

Violations are prioritized by health factor, severity, propagated risk index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
  148       Security             9
  371       Robustness           9
  062       Performance          8
  284       Robustness           7
  016       Changeability        7
  241       Security             5
  173       Transferability      4
  359       Transferability      3

Static Analysis in US DoD

John Keane (2013):

Testing finds less than 1/3 of structural defects.
Static analysis before testing is 85% efficient.
With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter plot: production incidents (y-axis 0-180) vs. Technical Quality Index (x-axis 1.00-4.00),
with annotated annual losses of $2,000,000+, $50,000, and $5,000 at different quality levels.
Data: business-critical applications at a large international investment bank.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors
(2017 CRASH Report; request from info@castsoftware.com). Blank cells: no significant effect;
sparse rows list their significant values in the source's order.

Factor       # apps   Robust   Security   Perform   Change   Transfer
Technology    1606      20        3         16        26        7
Java-EE
Industry       677       5        5          7         4        6
App type       510       1        1          1         1        1
Source         580       1        1          4
Shore          540
Maturity        72      28       25         15        24       12
Method         208      10        6         14        6
Team size      186       7        5          8        8
# of users     262       4        3          5        4

The Technical Debt Metaphor

Structural quality problems in production code.

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Principal — cost of fixing problems remaining in the code after release that must be remediated.
Interest — continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business risk (liability from debt):
Liability — business costs related to outages, breaches, corrupted data, etc.
Opportunity cost — benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
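The principal estimate in Curtis et al. (2012) has the shape below: violations judged worth fixing, times effort per fix, times labor rate. Every parameter value here is an illustrative assumption, not the paper's calibration:

```python
def technical_debt_principal(violations: int, share_to_fix: float,
                             hours_per_fix: float, rate_per_hour: float) -> float:
    # Principal = (violations that should be remediated) x (effort per fix)
    #           x (fully loaded labor rate). All inputs below are invented.
    return violations * share_to_fix * hours_per_fix * rate_per_hour

# Hypothetical application: 10,000 detected violations, half worth fixing,
# one hour per fix, $75/hour.
principal = technical_debt_principal(violations=10_000, share_to_fix=0.5,
                                     hours_per_fix=1.0, rate_per_hour=75.0)
```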

Tech Debt by Quality Factor

Transferability: 40%
Changeability: 30%
Robustness: 18%
Security: 7%
Performance: 5%

70% of technical debt is IT cost (transferability, changeability).
30% of technical debt is business risk (robustness, performance, security).
Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

Section: References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Sinha, M. (2006). Performance models – Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort — Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 54: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Schedule Performance

[Chart: schedule performance index (budgeted cost of work performed ÷ budgeted cost of work scheduled; below 1.0 = behind schedule, above 1.0 = ahead) plotted against CMM maturity levels 1-3; index scale 0 to 1.6]

Air Force study of contractors. Flowe & Thordahl (1994)

Cost Performance

[Chart: cost performance index (budgeted cost of work performed ÷ actual cost of work performed; below 1.0 = overrun, above 1.0 = underrun) plotted against CMM maturity levels 1-3; index scale 0 to 2.5]

Air Force study of contractors. Flowe & Thordahl (1994)
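Both slides chart standard earned-value ratios; a minimal sketch of the two computations (function names and example figures are illustrative, not from the Flowe & Thordahl study):

```python
def schedule_performance_index(bcwp, bcws):
    """SPI = budgeted cost of work performed / budgeted cost of work scheduled.
    SPI < 1 means behind schedule; SPI > 1 means ahead."""
    return bcwp / bcws

def cost_performance_index(bcwp, acwp):
    """CPI = budgeted cost of work performed / actual cost of work performed.
    CPI < 1 means overrun; CPI > 1 means underrun."""
    return bcwp / acwp

# A project that earned $800K of planned value against a $1M schedule
# and spent $1.1M doing the work:
spi = schedule_performance_index(800_000, 1_000_000)  # 0.8 -> behind schedule
cpi = cost_performance_index(800_000, 1_100_000)      # ~0.727 -> over budget
```

The study's point is that these ratios converge toward 1.0 at higher maturity levels, i.e., plans become more predictive.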

Crosby's Cost of Quality Model

Project Cost = Performance + Cost of Quality, where Cost of Quality = Conformance (Prevention + Appraisal) + Nonconformance

• Performance: generating plans and documentation; development - requirements, design, code, integration
• Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits
• Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting
• Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks

Dion (1993); Crosby (1979), Quality Is Free
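Crosby's breakdown lends itself to a simple rollup: tag each effort record with one of the categories above and report each category's share of total project cost. A sketch with invented data (the category names follow the slide; none of the figures come from Dion or Crosby):

```python
from collections import defaultdict

# Hypothetical effort log: (activity, hours, cost-of-quality category)
effort_log = [
    ("coding",              400, "performance"),
    ("design review",        60, "appraisal"),
    ("root cause analysis",  25, "prevention"),
    ("defect fixing",       180, "nonconformance"),
    ("re-test",              40, "nonconformance"),
]

def cost_of_quality_mix(log):
    """Return each category's share of total effort, as percentages."""
    totals = defaultdict(float)
    for _activity, hours, category in log:
        totals[category] += hours
    grand = sum(totals.values())
    return {cat: round(100 * h / grand, 1) for cat, h in totals.items()}

mix = cost_of_quality_mix(effort_log)
# nonconformance (rework) share = (180 + 40) / 705 hours -> 31.2%
```

Tracking the mix over time is exactly what the Raytheon slide below reports at the organization level.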

Raytheon's Cost of Quality

• Performance - cost of building it right the first time
• Nonconformance - cost of rework
• Appraisal - cost of testing
• Prevention - cost of preventing nonconformance

Year   Level   Performance   Nonconformance   Appraisal + Prevention
1988   1       34%           41%              15% + 7%
1990   2       55%           18%              15% + 12%
1992   3       66%           11%              23% (combined)
1994   4       76%           6%               18% (combined)

Dion (1993); Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to the 1989 baseline, plotted yearly 1988-1994; scale 0 to 4x]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to the 1989 baseline, plotted yearly 1988-1994; scale 0 to 1.2]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: actual cost vs. budgeted cost by quarter, 1988-1994; cost over-run/under-run scale -10% to 50%]

Haley (1995)

OMRON's Reuse Gains

[Chart: program size in steps (scale 0 to 3,000) and % of reuse (scale 0 to 100%), plotted yearly 1987-1994]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in the CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus - process improvement - Six Sigma
Product focus - product improvement - Design for 6σ

A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first come the huge benefits, then large, good, okay, and small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost." "The correctness of the code is rarely the weakest link." - Jackson, D. (2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." - Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data, rolled up into quality measurements on five health factors: Robustness, Performance, Security, Transferability, Changeability.

Detected violations, by health factor:

• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages

Modern Apps are a Technology Stack

• Level 1 - Unit (developer level): a single language/technology layer. Checks: code style & layout, expression complexity, code documentation, class or program design, basic coding standards.
• Level 2 - Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.
• Level 3 - System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, data flow, transaction risk; plus function points, effort estimation, data access control, SDK versioning, and calibration across technologies.

[Diagram: a stack spanning JSP, ASP.NET, APIs, Java, Web Services, EJB, Hibernate, Spring, Struts, .NET, COBOL, messaging, PL/SQL, T-SQL, Oracle, SQL Server, Sybase, DB2, IMS]

Architecturally Complex Defects

Architecturally complex defect: a structural flaw involving interactions among multiple components that reside in different application layers.

These defects are the primary cause of operational problems and take 20x as many fixes to correct.

[Chart: architecturally complex defects are 8% of total app defects (code unit-level violations: 92%), but account for 52% of total repair effort (unit-level violations: 48%)]

Most architecturally complex defects involve an architectural hotspot - a badly-built component that should be replaced.
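The definition suggests a direct check: a defect is architecturally complex when the components it touches span more than one application layer. A sketch with an invented component-to-layer map (the names are illustrative, not CAST's detection logic):

```python
# Hypothetical mapping of components to application layers.
layer_of = {
    "LoginPage.jsp":    "ui",
    "AuthService.java": "logic",
    "UserDao.java":     "data",
    "users.sql":        "data",
}

def is_architecturally_complex(defect_components):
    """True when the interacting components reside in more than one layer."""
    layers = {layer_of[c] for c in defect_components}
    return len(layers) > 1

# A defect whose fix touches UI, logic, and data layers is architecturally complex:
cross_layer = is_architecturally_complex(
    ["LoginPage.jsp", "AuthService.java", "UserDao.java"]
)
# A defect confined to the data layer is a unit-level violation:
same_layer = is_architecturally_complex(["UserDao.java", "users.sql"])
```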

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

Example: the impact of violation 284 in Component B is very widely propagated throughout the system - it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated - it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
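One way to sketch such an index (the graph, severities, and the severity-times-breadth scoring are illustrative assumptions, not CAST's actual algorithm): propagate each violation's impact over the dependency graph and weight its severity by the number of components reached:

```python
from collections import deque

def reachable(graph, start):
    """Breadth-first set of components a change to `start` can affect."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for dep in graph.get(node, ()):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

def propagated_risk_index(graph, component, severity):
    """Severity weighted by breadth of impact (the component itself excluded)."""
    return severity * (len(reachable(graph, component)) - 1)

# Edges point from a component to the components that depend on it.
graph = {"B": ["X", "Y"], "X": ["Z"], "Y": ["Z"], "A": ["X"]}
pri_b = propagated_risk_index(graph, "B", severity=7)  # reaches X, Y, Z -> 21
pri_a = propagated_risk_index(graph, "A", severity=7)  # reaches X, Z -> 14
```

With equal severities, the violation whose impact propagates more widely scores higher, matching the slide's Component B example.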

Transaction Risk

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path - from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation, and the ranking can be further tuned using the operational frequency of each transaction.

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

[Diagram: five transactions traced through the layers; transaction 1 poses the greatest risk]
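A minimal reading of the index (the scoring formula and data are invented for illustration): sum the severities of the violations on each transaction's path, optionally scale by operational frequency, then rank:

```python
def transaction_risk_index(path_violations, frequency=1):
    """Sum of violation severities along the transaction path,
    optionally tuned by the transaction's operational frequency."""
    return frequency * sum(severity for _violation_id, severity in path_violations)

# Hypothetical transactions: each carries (violation id, severity) pairs
# found along its UI -> business logic -> data storage path.
transactions = {
    "t1": [("148", 9), ("062", 8), ("284", 7)],
    "t2": [("173", 4), ("359", 3)],
}

ranked = sorted(
    transactions,
    key=lambda t: transaction_risk_index(transactions[t]),
    reverse=True,
)
# t1 ranks first: it carries more, and more severe, violations
```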

Risk-Based Prioritization

Violations are prioritized by severity within health factor, then by Propagated Risk Index, membership in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
148         Security          9
371         Robustness        9
062         Performance       8
284         Robustness        7
016         Changeability     7
241         Security          5
173         Transferability   4
359         Transferability   3
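Prioritizing by these criteria is a multi-key sort: severity first, then Propagated Risk Index, membership in a high-risk transaction, and path-execution frequency. A sketch with invented records (the tie-break order and the extra fields are assumptions):

```python
violations = [
    # (id, health_factor, severity, pri, in_high_risk_txn, path_frequency)
    ("148", "Security",        9, 40, True,  0.9),
    ("371", "Robustness",      9, 35, False, 0.4),
    ("062", "Performance",     8, 20, True,  0.7),
    ("173", "Transferability", 4, 10, False, 0.2),
]

prioritized = sorted(
    violations,
    # severity, then PRI, then high-risk-transaction flag, then frequency
    key=lambda v: (v[2], v[3], v[4], v[5]),
    reverse=True,
)
# "148" outranks "371": equal severity, but a higher Propagated Risk Index
```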

Static Analysis in the US DoD

John Keane (2013):

• Testing finds <1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

Study of structural quality measures and maintenance effort across 20 customers of a large global system integrator: a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (scale 0 to 180) vs. Technical Quality Index (scale 100 to 400); annotations: +$2,000,000 annual loss, $50,000 annual loss, $5,000 annual loss]

Large international investment bank; business-critical applications.

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com). Rows with fewer than five values had blank cells in the original table.

Factor (apps):       Robust / Security / Perform / Change / Transfer
Technology (1606):   20 / 3 / 16 / 26 / 7
Java-EE:
Industry (677):      5 / 5 / 7 / 4 / 6
App type (510):      1 / 1 / 1 / 1 / 1
Source (580):        1 / 1 / 4
Shore (540):
Maturity (72):       28 / 25 / 15 / 24 / 12
Method (208):        10 / 6 / 14 / 6
Team size (186):     7 / 5 / 8 / 8
# of users (262):    4 / 3 / 5 / 4

The Technical Debt Metaphor

Technical debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code constitute the debt.

• Principal - the cost of fixing problems remaining in the code after release that must be remediated.
• Interest - continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
• Liability - business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost - benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
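A hedged sketch of how a principal estimate can be rolled up from violation counts, in the spirit of Curtis et al. (2012); every parameter value below is invented for illustration, not a calibrated figure from the paper:

```python
def technical_debt_principal(violation_counts, fix_fraction, hours_per_fix, rate_per_hour):
    """Principal ~= (violations expected to be fixed) x (effort each) x (labor rate),
    summed over severity bands."""
    debt = 0.0
    for severity, count in violation_counts.items():
        debt += count * fix_fraction[severity] * hours_per_fix[severity] * rate_per_hour
    return debt

# Illustrative inputs for one application:
counts       = {"high": 100, "medium": 500, "low": 2000}   # violations per band
fix_fraction = {"high": 0.5, "medium": 0.25, "low": 0.1}   # share actually remediated
hours        = {"high": 2.5, "medium": 1.0,  "low": 0.5}   # effort per fix

principal = technical_debt_principal(counts, fix_fraction, hours, rate_per_hour=75)
# 75 * (100*0.5*2.5 + 500*0.25*1.0 + 2000*0.1*0.5) = 75 * 350 = 26250.0
```

Interest, liability, and opportunity cost would need separate models; as the slide says, only the principal is estimated here.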

Tech Debt by Quality Factor

[Chart: technical debt by health factor - Transferability 40%, Changeability 30%, Robustness 18%, Security 7%]

70% of technical debt is in IT cost (Transferability, Changeability); 30% is in business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Sinha, M. (2006). Performance models: enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: a business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort - the Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3
Page 55: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Cost Performance

Cost performance

index

0

05

1

15

2

25

1 2 3 CMM Maturity Level

Budgeted cost of work performed Actual cost of

work performed

overrun underrun

Air Force study of contractors

Flowe amp Thordahl (1994)

Click to edit Master title style Crosbyrsquos Cost of Quality Model Project Cost

Performance Cost of Quality Generating plans documentation Development - requirements - design - code - integration Prevention Appraisal

Reviews - system - requirements - design - test plan - test scripts Walkthroughs Testing (first time) Independent Verification and Validation (first time) Audits

Training Policies Procedures and methods Tools Planning Quality Improvement Projects Data Gathering and Analysis Root Cause Analysis Quality Reporting

Nonconformance Conformance Fixing Defects Reworking any Document Updating Source Code Re-reviews Re-tests Lab costs for re-tests Patches(Internal and External) Engineering Changes Change Control Boards External Failures and fines Customer Support Help Desks

Dion (1993) Crosby (1979) Quality Is Free

Click to edit Master title style Raytheons Cost of Quality

bull Performance mdash cost of building it right first time bull Nonconformance mdash cost of rework bull Appraisal mdash cost of testing bull Prevention mdash cost of preventing nonconformance

Dion (1993) Haley (1995)

23

18

Year Level Perform Nonconf Appraise Prevent

1988 1 34 41 15 7

1990 2 55 18 15 12

1992 3 66 11

1994 4 76 6 75

Click to edit Master title style Raytheonrsquos Productivity

0

1

2

3

4

1988 1989 1990 1991 1992 1993 1994

Growth relative to 1989 baseline

Lydon (1995)

Click to edit Master title style Raytheonrsquos Cost Reduction

0

02

04

06

08

1

12

1988 1989 1990 1991 1992 1993 1994

Reduction relative to 1989 baseline

Lydon (1995)

Click to edit Master title style

Haley (1995)

1993 1992 1991 1990 1989 1988 1994

50

40

30

20

10

0

-10

Actual cost

Budgeted cost

over-run under-run

4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1

Raytheonrsquos Cost Predictability

Click to edit Master title style OMRONrsquos Reuse gains

Sakamoto et al (1996)

0

5 0 0

1 0 0 0

1 5 0 0

2 0 0 0

2 5 0 0

3 0 0 0

1 9 8 7 1 9 8 8 1 9 8 9 1 9 9 0 1 9 9 1 1 9 9 2 1 9 9 3 1 9 9 4

Steps

100

75

50

25

0

of reuse

Click to edit Master title style Section 4 Section 4 Measuring Quality and

Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

89

Click to edit Master title style What about the Product CMMIrsquos primary assumption

ldquoThe quality of a software system is governed by quality of the process used to develop itrdquo

- Watts Humphrey

CMMIrsquos product quality problem

1 Assessments do not verify product quality

2 Compliance focus undermines CMMIrsquos primary assumption

3 Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI

4 The assumptions underlying control charts are violated by software data

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits:

Huge benefits → Large benefits → Good benefits → Okay benefits → Small benefits → Now what?

Ultimately we run out of projects with benefits significant enough to continue the program… so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded… testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic… non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability… The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

Jackson, D. (2009); Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality Measurements – five health factors, with example detected violations for each:

• Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table.
• Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed.
• Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string.
• Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention.
• Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity.

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, Netweaver, Tibco, Business Objects, plus a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: a multi-layer application – JSP/ASP.NET user interface, Java/.NET/web-services business logic, COBOL and database layers (Oracle, SQL Server, DB2, Sybase, IMS) with frameworks (Hibernate, Spring, Struts), EJB, PL/SQL, T-SQL, and messaging – analyzed for architectural compliance, data flow, and transaction risk.]

Quality measurement operates at three levels:

• Level 1 – Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards.
• Level 2 – Technology (development team level), within a single language/technology layer: intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.
• Level 3 – System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function point and effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

• The primary cause of operational problems.
• Take 20x as many fixes to correct.
• 8% of total application defects (code unit-level violations are the other 92%).
• 52% of total repair effort (unit-level violations account for the other 48%).

Most architecturally complex defects involve an architectural hotspot – a badly built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation #284 in Component B is very widely propagated throughout the system; it presents very high risk. The impact of violation #342 in Component A is only narrowly propagated; it presents low risk. Remediating violation #284 in Component B will therefore have the greater impact on reducing risk and cost.
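One way to sketch such an index is severity times the number of components transitively reachable from the violation's component. The dependency graph, weights, and scoring below are illustrative assumptions, not CAST's actual algorithm:

```python
from collections import deque

def reach(graph, start):
    """Count components transitively impacted from `start` (breadth-first)."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return len(seen) - 1  # exclude the component itself

def propagated_risk_index(violations, graph):
    """Rank violations by severity × breadth of impact, highest risk first."""
    scored = [(v["id"], v["severity"] * reach(graph, v["component"]))
              for v in violations]
    return sorted(scored, key=lambda t: t[1], reverse=True)

# Component B's impact spreads widely; A's does not (hypothetical graph)
deps = {"B": ["C", "D", "E"], "C": ["F"], "A": ["D"]}
violations = [{"id": 342, "component": "A", "severity": 7},
              {"id": 284, "component": "B", "severity": 7}]
print(propagated_risk_index(violations, deps))  # → [(284, 28), (342, 7)]
```

With equal severities, violation #284 still ranks first because its impact reaches four components instead of one.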

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

• Customers care most about the dependability of their transactions.
• The risk of a transaction is determined by the number and severity of violations along its path: entry/exit in the user interface layer, processing in the business logic layer, and fulfillment in the data storage layer.
• Transactions can be ranked by risk to prioritize their remediation.
• The Transaction Risk Index can be further tuned using the operational frequency of each transaction.

[Diagram: five ranked transactions through the stack; transaction 1 poses the greatest risk.]
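The ranking described above can be sketched in a few lines; the data and the frequency weighting are illustrative assumptions, not CAST's actual computation:

```python
def transaction_risk(transactions, severity, frequency=None):
    """Rank transactions by summed violation severity along each path,
    optionally weighted by how often each transaction runs."""
    ranked = []
    for name, path_violations in transactions.items():
        score = sum(severity[v] for v in path_violations)
        if frequency:                       # tune by operational frequency
            score *= frequency.get(name, 1)
        ranked.append((name, score))
    return sorted(ranked, key=lambda t: t[1], reverse=True)

# Hypothetical violation severities and transaction paths
severity = {148: 9, 371: 9, 62: 8, 284: 7}
txns = {"t1": [148, 371, 284], "t2": [62], "t3": [284]}
print(transaction_risk(txns, severity))                   # → [('t1', 25), ('t2', 8), ('t3', 7)]
print(transaction_risk(txns, severity, {"t2": 10}))       # frequency can reorder the ranking
```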

Risk-Based Prioritization

Violations are prioritized by health factor, severity, Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
  #148      Security             9
  #371      Robustness           9
  #062      Performance          8
  #284      Robustness           7
  #016      Changeability        7
  #241      Security             5
  #173      Transferability      4
  #359      Transferability      3
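This prioritization amounts to a multi-key sort. A sketch with illustrative records and an assumed key order (severity first, then propagated-risk rank, then high-risk-transaction membership, then execution frequency):

```python
# Hypothetical violation records; fields mirror the slide's criteria.
violations = [
    {"id": 148, "factor": "Security",        "severity": 9, "pri": 2, "hot_txn": True,  "freq": 0.9},
    {"id": 371, "factor": "Robustness",      "severity": 9, "pri": 5, "hot_txn": False, "freq": 0.4},
    {"id":  62, "factor": "Performance",     "severity": 8, "pri": 1, "hot_txn": True,  "freq": 0.7},
    {"id": 173, "factor": "Transferability", "severity": 4, "pri": 9, "hot_txn": False, "freq": 0.1},
]

# Highest severity first; lower PRI rank, hot-transaction membership,
# and higher execution frequency break ties.
ordered = sorted(violations,
                 key=lambda v: (-v["severity"], v["pri"], -v["hot_txn"], -v["freq"]))
print([v["id"] for v in ordered])  # → [148, 371, 62, 173]
```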

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Chart: production incidents (0–180) vs. Technical Quality Index (100–400) for business-critical applications at a large international investment bank; annotated annual losses range from over $2,000,000 down to $50,000 and $5,000 as the index rises.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request a copy from info@castsoftware.com):

Factor                  # apps   Robust   Security   Perform   Change   Transfer
Technology (Java-EE)     1606      20%       3%        16%       26%       7%
Industry                  677       5%       5%         7%        4%       6%
App type                  510       1%       1%         1%        1%       1%
Source                    580       1%       1%         4%
Shore                     540
Maturity                   72      28%      25%        15%       24%      12%
Method                    208      10%       6%        14%        6%
Team size                 186       7%       5%         8%        8%
# of users                262       4%       3%         5%        4%

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Structural quality problems in production code create technical debt:

• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: the continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

They also create business risk as a liability from the debt:

• Liability: business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
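Curtis et al. (2012) estimate the principal from violation counts, the fraction worth fixing, the effort to fix one violation, and the labor rate. A sketch of that calculation with placeholder parameters (not the paper's calibrated values):

```python
def debt_principal(counts, fix_fraction, hours_per_fix, rate_per_hour):
    """Principal ≈ Σ over severity tiers of
    (violations × share worth fixing × hours per fix × cost per hour)."""
    return sum(counts[tier] * fix_fraction[tier] * hours_per_fix[tier] * rate_per_hour
               for tier in counts)

counts        = {"high": 120, "medium": 400, "low": 1500}   # violations found
fix_fraction  = {"high": 0.5, "medium": 0.25, "low": 0.1}   # share worth fixing
hours_per_fix = {"high": 2.0, "medium": 1.0, "low": 0.5}    # effort per violation
print(f"${debt_principal(counts, fix_fraction, hours_per_fix, 75):,.0f}")  # → $22,125
```

Varying the tier parameters is how such an estimate is calibrated to a particular organization's remediation costs.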

Tech Debt by Quality Factor

[Chart: technical-debt share by health factor – Transferability 40%, Changeability 30%, Robustness 18%, Security 7%.]

• 70% of technical debt is in IT cost (Transferability + Changeability).
• 30% of technical debt is in business risk (Robustness, Performance, Security).
• Health factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort: Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3

Crosby's Cost of Quality Model

Project cost divides into performance (building the product) and the cost of quality; the cost of quality in turn divides into the cost of conformance (appraisal + prevention) and the cost of nonconformance (rework):

• Performance: generating plans and documentation; development (requirements, design, code, integration).
• Appraisal: reviews (system, requirements, design, test plan, test scripts); walkthroughs; testing (first time); independent verification and validation (first time); audits.
• Prevention: training; policies; procedures and methods; tools; planning; quality improvement projects; data gathering and analysis; root cause analysis; quality reporting.
• Nonconformance: fixing defects; reworking any document; updating source code; re-reviews; re-tests; lab costs for re-tests; patches (internal and external); engineering changes; change control boards; external failures and fines; customer support; help desks.

Crosby (1979), Quality Is Free; Dion (1993)

Click to edit Master title style Raytheons Cost of Quality

bull Performance mdash cost of building it right first time bull Nonconformance mdash cost of rework bull Appraisal mdash cost of testing bull Prevention mdash cost of preventing nonconformance

Dion (1993) Haley (1995)

23

18

Year Level Perform Nonconf Appraise Prevent

1988 1 34 41 15 7

1990 2 55 18 15 12

1992 3 66 11

1994 4 76 6 75

Click to edit Master title style Raytheonrsquos Productivity

0

1

2

3

4

1988 1989 1990 1991 1992 1993 1994

Growth relative to 1989 baseline

Lydon (1995)

Click to edit Master title style Raytheonrsquos Cost Reduction

0

02

04

06

08

1

12

1988 1989 1990 1991 1992 1993 1994

Reduction relative to 1989 baseline

Lydon (1995)

Click to edit Master title style

Haley (1995)

1993 1992 1991 1990 1989 1988 1994

50

40

30

20

10

0

-10

Actual cost

Budgeted cost

over-run under-run

4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1

Raytheonrsquos Cost Predictability

Click to edit Master title style OMRONrsquos Reuse gains

Sakamoto et al (1996)

0

5 0 0

1 0 0 0

1 5 0 0

2 0 0 0

2 5 0 0

3 0 0 0

1 9 8 7 1 9 8 8 1 9 8 9 1 9 9 0 1 9 9 1 1 9 9 2 1 9 9 3 1 9 9 4

Steps

100

75

50

25

0

of reuse

Click to edit Master title style Section 4 Section 4 Measuring Quality and

Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

89

Click to edit Master title style What about the Product CMMIrsquos primary assumption

ldquoThe quality of a software system is governed by quality of the process used to develop itrdquo

- Watts Humphrey

CMMIrsquos product quality problem

1 Assessments do not verify product quality

2 Compliance focus undermines CMMIrsquos primary assumption

3 Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI

4 The assumptions underlying control charts are violated by software data

Click to edit Master title style Six Sigmarsquos Challenge

Process focus ndash process improvement ndash Six Sigma Product focus ndash product improvement ndash Design for 6σ

Product focus supplements product focus to unlock even more value from software

Six Sigma projects must have significant benefits Huge benefits

Large benefits

Good benefits

Okay benefits

Small benefits

Now what

Ultimately we run out of projects with significant enough benefits to continue the programhellip

so what is the solution that allows continual improvement

Click to edit Master title style Testing is Not Enough ldquoAs higher levels of assurance are demandedhelliptesting cannot deliver the level of confidence required at a reasonable costrdquo

ldquoThe correctness of the code is rarely the weakest linkrdquo

Jackson D (2009) Spinellis D (2006)

ldquohellipa failure to satisfy a non-functional requirement can be critical even catastrophichellipnon-functional requirements are sometimes difficult to verify We cannot write a test case to verify a systemrsquos reliabilityhellipThe ability to associate code to non-functional properties can be a powerful weapon in a software engineerrsquos arsenalrdquo

Click to edit Master title style CASTrsquos Application Intelligence Platform

Application Analysis

Evaluation of 1200+ coding amp

architectural rules

Application meta-data

Transferability

Changeability

Robustness

Performance

Security

Quality Measurements

Detected Violations

Expensive operation in loop Static vs pooled connections Complex query on big table Large indices on big table

Empty CATCH block Uncontrolled data access Poor memory management Opened resource not closed

SQL injection Cross-site scripting Buffer overflow Uncontrolled format string

Unstructured code Misuse of inheritance Lack of comments Violated naming convention

Highly coupled component Duplicated code Index modified in loop High cyclomatic complexity

Language Parsers

Oracle PLSQL Sybase T-SQL SQL Server T-SQL IBM SQLPSM C C++ C Pro C Cobol CICS Visual Basic VBNet ASPNet Java J2EE JSP XML HTML Javascript VBScript PHP PowerBuilder Oracle Forms PeopleSoft SAP ABAP Netweaver Tibco Business Objects Universal Analyzer for other languages

Click to edit Master title style Modern Apps are a Technology Stack Ar

chite

ctur

al C

ompl

ianc

e

Data Flow Transaction Risk

EJB PLSQL

Oracle

SQL Server

DB2

TSQL

Hibernate

Spring

Struts NET

COBOL

Sybase IMS

Messaging

Integration quality Architectural

compliance Risk propagation Application security Resiliency checks Transaction integrity

Function point Effort estimation Data access control SDK versioning Calibration across

technologies IT organization level

System Level 3

Code style amp layout Expression complexity Code documentation Class or program design Basic coding standards Developer level

Unit Level 1

Java Java Java

Java Java

Web Services Java Java Single languagetechnology layer

Intra-technology architecture Intra-layer dependencies Inter-program invocation Security vulnerabilities Development team level

Technology Level 2

JSP ASPNET APIs

Click to edit Master title style Architecturally Complex Defects

Primary cause of operational

problems

20x as many

fixes to correct

Architecturally Complex Defect

A structural flaw involving interactions among multiple components that reside in different application layers

48

52 92

8 Architecturally

Complex Defects

Code unit-level violations

of total app defects

of total repair effort

Most architecturally complex defects involve an architectural hotspot ndash a badly-built component that should be replaced

Click to edit Master title style Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on the its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse gains
  • Section 4
  • What about the Product
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3
Raytheon's Cost of Quality

• Performance — cost of building it right the first time
• Nonconformance — cost of rework
• Appraisal — cost of testing
• Prevention — cost of preventing nonconformance

Cost-of-quality breakdown by maturity level (values are percentages of project cost):

Year   Level   Perform   Nonconf   Appraise   Prevent
1988     1       34        41        15          7
1990     2       55        18        15         12
1992     3       66        11
1994     4       76         6        7.5

Dion (1993), Haley (1995)

Raytheon's Productivity

[Chart: productivity growth relative to a 1989 baseline, plotted 1988–1994 on a 0–4 scale]

Lydon (1995)

Raytheon's Cost Reduction

[Chart: cost reduction relative to a 1989 baseline, plotted 1988–1994 on a 0–1.2 scale]

Lydon (1995)

Raytheon's Cost Predictability

[Chart: budgeted vs. actual cost by quarter, 1988–1994, on a deviation scale of +50% (over-run) to -10% (under-run), with over-runs converging toward budget as maturity increases]

Haley (1995)

OMRON's Reuse gains

[Chart: program size in steps (0–3,000) and % of reuse (0–100%), plotted 1987–1994]

Sakamoto et al. (1996)

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption: "The quality of a software system is governed by the quality of the process used to develop it." - Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.

2. A compliance focus undermines CMMI's primary assumption.

3. The product quality focus at Level 4 in the CMM was lost in quantitative process control in CMMI.

4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

Product focus supplements process focus to unlock even more value from software.

Six Sigma projects must have significant benefits: first come projects with huge benefits, then large, good, okay, and small benefits. Now what? Ultimately we run out of projects with significant enough benefits to continue the program… so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded… testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

Jackson, D. (2009); Spinellis, D. (2006)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic… non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability… The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules, combined with application meta-data.

Quality Measurements (health factors): Transferability, Changeability, Robustness, Performance, Security.

Detected Violations (examples):
• Expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
• Empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
• SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
• Unstructured code; misuse of inheritance; lack of comments; violated naming convention
• Highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, NetWeaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Diagram: a transaction's data flow crosses a stack of technologies (Java/JSP/ASP.NET user interfaces; Struts, Spring, Hibernate, .NET, EJB, web services, and APIs in the middle tiers; COBOL, PL/SQL, T-SQL, Oracle, SQL Server, DB2, Sybase, IMS, and messaging in the back end), so transaction risk and architectural compliance must be assessed across layers.]

Quality is measured at three levels:

Level 1 (Unit, developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2 (Technology, development team level, within a single language/technology layer): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3 (System, IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; also function point and effort estimation, data access control, SDK versioning, and calibration across technologies.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct. They account for only 8% of total application defects (code unit-level violations are the other 92%), yet 52% of total repair effort (vs. 48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot: a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation #342 in Component A is only narrowly propagated in the system; it presents low risk. The impact of violation #284 in Component B is very widely propagated throughout the system; it presents very high risk. Remediating violation #284 in Component B will therefore have the greater impact on reducing risk and cost.
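The Propagated Risk Index can be illustrated with a small sketch: rank each violation by its severity times how many components its host component can affect through the dependency graph. The function names, graph encoding, and weighting below are illustrative assumptions, not CAST's actual algorithm.

```python
from collections import deque

def impact_breadth(dependents, component):
    """Count components reachable from `component` via the
    'depends on me' edges, i.e. everything a flaw there could affect."""
    seen, queue = {component}, deque([component])
    while queue:
        node = queue.popleft()
        for dep in dependents.get(node, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return len(seen) - 1  # exclude the component itself

def propagated_risk_index(violations, dependents):
    """Rank violations worst-first by severity x breadth of impact."""
    scored = [(v["id"], v["severity"] * impact_breadth(dependents, v["component"]))
              for v in violations]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# dependents graph: A -> [components that depend on A]
deps = {"B": ["C", "D", "E"], "C": ["F"], "A": ["D"]}
violations = [
    {"id": 342, "component": "A", "severity": 7},
    {"id": 284, "component": "B", "severity": 7},
]
print(propagated_risk_index(violations, deps))  # → [(284, 28), (342, 7)]
```

Violation #284 ranks first: the two violations have equal severity, but #284 sits in a component whose impact propagates much more widely.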

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path: entry/exit in the user interface layer, processing in the business logic layer, and fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation; in the slide's example, transaction 1 of the five shown poses the greatest risk. The Transaction Risk Index can be further tuned using the operational frequency of each transaction.
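The ranking just described can be sketched in a few lines: sum the severities of the violations along each transaction's path, optionally weighted by how often the transaction runs. The transaction names, severity lists, and frequency weights are hypothetical, added only to illustrate the idea.

```python
def transaction_risk_index(transactions, frequency=None):
    """Rank transactions worst-first by the summed severity of the
    violations along their paths, optionally weighted by how often
    each transaction runs in operation (illustrative sketch)."""
    frequency = frequency or {}
    scored = [(name, sum(severities) * frequency.get(name, 1))
              for name, severities in transactions.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# severities of the violations found along each transaction's path
txns = {"checkout": [9, 9, 8], "login": [7, 5], "report": [4]}

# weighting by operational frequency promotes the heavily used "login" path
print(transaction_risk_index(txns, frequency={"login": 10}))
# → [('login', 120), ('checkout', 26), ('report', 4)]
```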

Risk-Based Prioritization

Violations are prioritized by severity within health factor, together with each one's Propagated Risk Index, whether it lies in a high-risk transaction, and the frequency of path execution:

Violation   Health Factor     Severity
   148      Security             9
   371      Robustness           9
   062      Performance          8
   284      Robustness           7
   016      Changeability        7
   241      Security             5
   173      Transferability      4
   359      Transferability      3
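A prioritization like the table above falls out of a composite sort over the ranking signals. The records below reuse a few of the table's violation IDs, but the "pri" (Propagated Risk Index) values are invented for illustration.

```python
# hypothetical violation records; "pri" values are invented for this sketch
violations = [
    {"id": 359, "factor": "Transferability", "severity": 3, "pri": 0.2},
    {"id": 148, "factor": "Security",        "severity": 9, "pri": 0.9},
    {"id": 284, "factor": "Robustness",      "severity": 7, "pri": 0.8},
    {"id": 371, "factor": "Robustness",      "severity": 9, "pri": 0.6},
]

# worst-first: severity, with ties broken by breadth of propagation
ranked = sorted(violations, key=lambda v: (v["severity"], v["pri"]), reverse=True)
print([v["id"] for v in ranked])  # → [148, 371, 284, 359]
```

Extra signals, such as presence in a high-risk transaction or path-execution frequency, can simply be appended to the sort key.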

Static Analysis in US DoD

• Testing finds less than 1/3 of structural defects.
• Static analysis before testing is 85% efficient.
• With testing and static analysis combined, test schedules are reduced by 50%.

John Keane (2013)

Reducing Operational Incidents & Costs

In a study of structural quality measures and maintenance effort across 20 customers of a large global system integrator, a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter chart: production incidents (0–180) vs. Technical Quality Index (100–400) for business-critical applications at a large international investment bank; annotated annual losses fall from over $2,000,000 for the lowest-quality applications to $50,000 and $5,000 as TQI rises.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com):

Factor       #apps   Robust   Security   Perform   Change   Transfer
Technology    1606     20        3         16        26        7
 (Java-EE)
Industry       677      5        5          7         4        6
App type       510      1        1          1         1        1
Source         580      1        1          4
Shore          540
Maturity        72     28       25         15        24       12
Method         208     10        6         14         6
Team size      186      7        5          8         8
# of users     262      4        3          5         4

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code constitute the debt.

• Principal: the cost of fixing problems remaining in the code after release that must be remediated.
• Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

The business risk is the liability from the debt:

• Liability: business costs related to outages, breaches, corrupted data, etc.
• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
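A sketch of estimating the principal, loosely following the structure of the formula in Curtis et al. (2012): count the violations remaining at release, assume only a fraction will actually be fixed, and multiply by repair hours and labor rate. All parameter values here (hours per rule, fix fraction, hourly rate) are illustrative assumptions, not the paper's calibrated figures.

```python
def technical_debt_principal(violations, hours_per_rule,
                             fix_fraction=0.5, cost_per_hour=75.0):
    """Estimate the principal: the cost of fixing the violations that
    remain in the code after release (all parameters illustrative)."""
    hours = sum(hours_per_rule.get(v["rule"], 1.0) for v in violations)
    return hours * fix_fraction * cost_per_hour

# 20 low-effort and 3 high-effort violations left in the code at release
violations = [{"rule": "empty-catch"}] * 20 + [{"rule": "sql-injection"}] * 3
hours_per_rule = {"empty-catch": 0.5, "sql-injection": 4.0}

print(technical_debt_principal(violations, hours_per_rule))  # → 825.0
```

Interest, liability, and opportunity cost would be modeled separately, since they accrue over time rather than being a one-time repair cost.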

Tech Debt by Quality Factor

[Pie chart of technical debt by health factor: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%]

70% of technical debt is in IT cost (Transferability, Changeability). 30% of technical debt is in business risk (Robustness, Performance, Security). Health Factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice-Hall.

Sinha, M. (2006). Performance models – Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 58: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Raytheonrsquos Productivity

0

1

2

3

4

1988 1989 1990 1991 1992 1993 1994

Growth relative to 1989 baseline

Lydon (1995)

Click to edit Master title style Raytheonrsquos Cost Reduction

0

02

04

06

08

1

12

1988 1989 1990 1991 1992 1993 1994

Reduction relative to 1989 baseline

Lydon (1995)

Click to edit Master title style

Haley (1995)

1993 1992 1991 1990 1989 1988 1994

50

40

30

20

10

0

-10

Actual cost

Budgeted cost

over-run under-run

4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1

Raytheonrsquos Cost Predictability

Click to edit Master title style OMRONrsquos Reuse gains

Sakamoto et al (1996)

0

5 0 0

1 0 0 0

1 5 0 0

2 0 0 0

2 5 0 0

3 0 0 0

1 9 8 7 1 9 8 8 1 9 8 9 1 9 9 0 1 9 9 1 1 9 9 2 1 9 9 3 1 9 9 4

Steps

100

75

50

25

0

of reuse

Click to edit Master title style Section 4 Section 4 Measuring Quality and

Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

89

Click to edit Master title style What about the Product CMMIrsquos primary assumption

ldquoThe quality of a software system is governed by quality of the process used to develop itrdquo

- Watts Humphrey

CMMIrsquos product quality problem

1 Assessments do not verify product quality

2 Compliance focus undermines CMMIrsquos primary assumption

3 Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI

4 The assumptions underlying control charts are violated by software data

Click to edit Master title style Six Sigmarsquos Challenge

Process focus ndash process improvement ndash Six Sigma Product focus ndash product improvement ndash Design for 6σ

Product focus supplements product focus to unlock even more value from software

Six Sigma projects must have significant benefits Huge benefits

Large benefits

Good benefits

Okay benefits

Small benefits

Now what

Ultimately we run out of projects with significant enough benefits to continue the programhellip

so what is the solution that allows continual improvement

Click to edit Master title style Testing is Not Enough ldquoAs higher levels of assurance are demandedhelliptesting cannot deliver the level of confidence required at a reasonable costrdquo

ldquoThe correctness of the code is rarely the weakest linkrdquo

Jackson D (2009) Spinellis D (2006)

ldquohellipa failure to satisfy a non-functional requirement can be critical even catastrophichellipnon-functional requirements are sometimes difficult to verify We cannot write a test case to verify a systemrsquos reliabilityhellipThe ability to associate code to non-functional properties can be a powerful weapon in a software engineerrsquos arsenalrdquo

Click to edit Master title style CASTrsquos Application Intelligence Platform

Application Analysis

Evaluation of 1200+ coding amp

architectural rules

Application meta-data

Transferability

Changeability

Robustness

Performance

Security

Quality Measurements

Detected Violations

Expensive operation in loop Static vs pooled connections Complex query on big table Large indices on big table

Empty CATCH block Uncontrolled data access Poor memory management Opened resource not closed

SQL injection Cross-site scripting Buffer overflow Uncontrolled format string

Unstructured code Misuse of inheritance Lack of comments Violated naming convention

Highly coupled component Duplicated code Index modified in loop High cyclomatic complexity

Language Parsers

Oracle PLSQL Sybase T-SQL SQL Server T-SQL IBM SQLPSM C C++ C Pro C Cobol CICS Visual Basic VBNet ASPNet Java J2EE JSP XML HTML Javascript VBScript PHP PowerBuilder Oracle Forms PeopleSoft SAP ABAP Netweaver Tibco Business Objects Universal Analyzer for other languages

Click to edit Master title style Modern Apps are a Technology Stack Ar

chite

ctur

al C

ompl

ianc

e

Data Flow Transaction Risk

EJB PLSQL

Oracle

SQL Server

DB2

TSQL

Hibernate

Spring

Struts NET

COBOL

Sybase IMS

Messaging

Integration quality Architectural

compliance Risk propagation Application security Resiliency checks Transaction integrity

Function point Effort estimation Data access control SDK versioning Calibration across

technologies IT organization level

System Level 3

Code style amp layout Expression complexity Code documentation Class or program design Basic coding standards Developer level

Unit Level 1

Java Java Java

Java Java

Web Services Java Java Single languagetechnology layer

Intra-technology architecture Intra-layer dependencies Inter-program invocation Security vulnerabilities Development team level

Technology Level 2

JSP ASPNET APIs

Click to edit Master title style Architecturally Complex Defects

Primary cause of operational

problems

20x as many

fixes to correct

Architecturally Complex Defect

A structural flaw involving interactions among multiple components that reside in different application layers

48

52 92

8 Architecturally

Complex Defects

Code unit-level violations

of total app defects

of total repair effort

Most architecturally complex defects involve an architectural hotspot ndash a badly-built component that should be replaced

Click to edit Master title style Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on the its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort — Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3
Raytheon's Cost Reduction

[Figure: software development cost, 1988-1994, plotted as a reduction relative to the 1989 baseline (y-axis 0 to 1.2). Source: Lydon (1995).]

Raytheon's Cost Predictability

[Figure: quarterly actual vs. budgeted cost, 1988-1994, plotted as percent over-run/under-run (+50% to -10%). Source: Haley (1995).]

OMRON's Reuse gains

[Figure: program size in steps (0-3,000) and % of reuse (0-100%), 1987-1994. Source: Sakamoto et al. (1996).]

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement
2) Technical debt
3) Consortium for IT Software Quality (CISQ)

What about the Product?

CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it."
- Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.
2. A compliance focus undermines CMMI's primary assumption.
3. The product quality focus at Level 4 in the CMM was lost in quantitative process control in CMMI.
4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

A product focus supplements the process focus to unlock even more value from software.

Six Sigma projects must have significant benefits:

Huge benefits
Large benefits
Good benefits
Okay benefits
Small benefits
Now what?

Ultimately we run out of projects with significant enough benefits to continue the program…
so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded… testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."

- Jackson, D. (2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic… non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability… The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."

- Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality Measurements: Transferability, Changeability, Robustness, Performance, Security.

Detected Violations (examples):
- Expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
- Empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
- SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
- Unstructured code; misuse of inheritance; lack of comments; violated naming convention
- Highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.Net, ASP.Net, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

[Figure: a multi-layer application stack: JSP, ASP.NET, and API front ends; Java, .NET, COBOL, and Web Services components wired through EJB, Hibernate, Spring, and Struts; data layers in Oracle, SQL Server, DB2, Sybase, and IMS reached via PL/SQL and T-SQL; plus messaging. Data flow and transaction risk cut across layers, and architectural compliance spans the whole stack.]

Measurement operates at three levels:

Unit Level 1 (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards.

Technology Level 2 (development team level; single language/technology layer): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

System Level 3 (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function point and effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct. They account for only 8% of total application defects (unit-level code violations account for the other 92%), yet consume 52% of total repair effort (vs. 48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system – it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated – it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
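The Propagated Risk Index described above can be sketched as a small graph computation. This is an illustrative reconstruction, not CAST's actual formula: each violation's severity is scaled by the number of components that transitively depend on the flawed component. All names, IDs, and weights below are assumptions.

```python
# Sketch of a Propagated Risk Index: severity x breadth of propagated impact.
from collections import deque

def impacted_components(callers, start):
    """All components that directly or transitively depend on `start`."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for caller in callers.get(node, ()):
            if caller not in seen:
                seen.add(caller)
                queue.append(caller)
    return seen

def propagated_risk_index(violations, callers):
    """Rank violations by severity times (1 + number of impacted components)."""
    ranked = [
        (v["id"], v["severity"] * (1 + len(impacted_components(callers, v["component"]))))
        for v in violations
    ]
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

# callers[X] = components that depend directly on X (illustrative system)
callers = {"B": ["C", "D"], "C": ["E"], "A": []}
violations = [
    {"id": 342, "component": "A", "severity": 7},  # narrowly propagated
    {"id": 284, "component": "B", "severity": 7},  # widely propagated
]
ranking = propagated_risk_index(violations, callers)
```

With equal severities, violation 284 ranks first purely because its impact reaches more of the system, which is the slide's point.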

Transaction Risk

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path – from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation, and the Transaction Risk Index can be further tuned using the operational frequency of each transaction.

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

[Figure: five transactions ranked by risk; Transaction 1 poses the greatest risk.]
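A Transaction Risk Index of this kind can be sketched as the severity-weighted count of violations along a transaction's path, scaled by how often the transaction runs. This is an illustrative sketch under assumed names; the component names, violation data, and frequency weighting are not from the slide.

```python
# Sketch of a Transaction Risk Index: sum of violation severities along the
# transaction path, optionally scaled by the transaction's operational frequency.

def transaction_risk_index(path, violations_by_component, frequency=1.0):
    """Severity-weighted violation count along a path, scaled by frequency."""
    raw = sum(
        v["severity"]
        for component in path
        for v in violations_by_component.get(component, ())
    )
    return raw * frequency

# Illustrative violations per component, one component per application layer.
violations_by_component = {
    "ui_login":  [{"id": 148, "severity": 9}],
    "logic_pay": [{"id": 371, "severity": 9}, {"id": 62, "severity": 8}],
    "db_orders": [{"id": 284, "severity": 7}],
}

# Transaction 1 crosses all three layers; transaction 2 touches only the UI.
t1 = transaction_risk_index(["ui_login", "logic_pay", "db_orders"], violations_by_component)
t2 = transaction_risk_index(["ui_login"], violations_by_component)
```

Ranking transactions by this score surfaces the path that accumulates the most severe violations first.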

Risk-Based Prioritization

Violations are ordered by severity and Propagated Risk Index, and flagged by whether they sit in a high-risk transaction and by the frequency of path execution:

Violation   Health Factor     Severity
148         Security          9
371         Robustness        9
062         Performance       8
284         Robustness        7
016         Changeability     7
241         Security          5
173         Transferability   4
359         Transferability   3
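The prioritization above amounts to a multi-key sort. A minimal sketch under assumed field names: `in_high_risk_txn` is a hypothetical flag standing in for the slide's high-risk-transaction column, and the records are illustrative.

```python
# Sketch of risk-based prioritization: sort by severity, then break ties by
# whether the violation lies in a high-risk transaction.
violations = [
    {"id": 148, "factor": "Security",        "severity": 9, "in_high_risk_txn": True},
    {"id": 371, "factor": "Robustness",      "severity": 9, "in_high_risk_txn": False},
    {"id":  62, "factor": "Performance",     "severity": 8, "in_high_risk_txn": True},
    {"id": 284, "factor": "Robustness",      "severity": 7, "in_high_risk_txn": True},
    {"id": 173, "factor": "Transferability", "severity": 4, "in_high_risk_txn": False},
]

# Highest severity first; within a severity tier, high-risk transactions first.
worklist = sorted(
    violations,
    key=lambda v: (v["severity"], v["in_high_risk_txn"]),
    reverse=True,
)
order = [v["id"] for v in worklist]
```

Further tie-breakers (Propagated Risk Index, frequency of path execution) would simply extend the sort key tuple.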

Static Analysis in US DoD

John Keane (2013):

Testing finds <1/3 of structural defects. Static analysis before testing is 85% efficient. With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Figure: production incidents (0-180) vs. Technical Quality Index (100-400) for business-critical applications at a large international investment bank; data points are annotated with annual losses ranging from $2,000,000+ down to $50,000 and $5,000.]

Factors Affecting Structural Quality

[Table: percentage of variation in structural quality scores (Robustness, Security, Performance, Changeability, Transferability) accounted for by various factors. Technology (1,606 apps; e.g., Java EE) accounted for 20/3/16/26/7 percent and maturity (72 apps) for 28/25/15/24/12 percent across the five factors; industry (677 apps), app type (510), source (580), shore (540), method (208), team size (186), and number of users (262) each accounted for at most 14 percent. Source: 2017 CRASH Report, available on request from info@castsoftware.com.]

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code constitute the debt.

Principal: the cost of fixing problems remaining in the code after release that must be remediated.

Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business risk (liability from the debt):

Liability: business costs related to outages, breaches, corrupted data, etc.

Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
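In the spirit of Curtis et al. (2012), the principal can be estimated by tiering violations by severity and pricing the fixes expected to be made. The fractions, hours, and hourly rate below are illustrative assumptions, not the published calibration.

```python
# Sketch of a technical-debt principal estimate:
# principal = sum over severity tiers of (count x fraction fixed x hours per fix x rate).

def technical_debt_principal(violation_counts, fix_fraction, hours_per_fix, rate_per_hour):
    """Price the remediation work expected for the violations left at release."""
    return sum(
        violation_counts[tier] * fix_fraction[tier] * hours_per_fix[tier] * rate_per_hour
        for tier in violation_counts
    )

# Illustrative inputs for one application.
counts   = {"high": 100, "medium": 400, "low": 1500}   # violations by severity
fraction = {"high": 0.5, "medium": 0.25, "low": 0.1}   # share expected to be fixed
hours    = {"high": 2.0, "medium": 1.0,  "low": 0.5}   # effort per fix (hours)

principal = technical_debt_principal(counts, fraction, hours, rate_per_hour=75)
```

Because only a fraction of low-severity violations is ever remediated, the high-severity tier dominates the principal even when it is vastly outnumbered.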

Tech Debt by Quality Factor

Transferability: 40%
Changeability: 30%
Robustness: 18%
Security: 7%

70% of technical debt is in IT cost (Transferability, Changeability); 30% is in business risk (Robustness, Performance, Security). Health Factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Upper Saddle River, NJ: Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models – Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development Using Scrum. Upper Saddle River, NJ: Prentice-Hall.

Spinellis, D. (2006). Code Quality. Boston: Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 60: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style

Haley (1995)

1993 1992 1991 1990 1989 1988 1994

50

40

30

20

10

0

-10

Actual cost

Budgeted cost

over-run under-run

4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1

Raytheonrsquos Cost Predictability

Click to edit Master title style OMRONrsquos Reuse gains

Sakamoto et al (1996)

0

5 0 0

1 0 0 0

1 5 0 0

2 0 0 0

2 5 0 0

3 0 0 0

1 9 8 7 1 9 8 8 1 9 8 9 1 9 9 0 1 9 9 1 1 9 9 2 1 9 9 3 1 9 9 4

Steps

100

75

50

25

0

of reuse

Click to edit Master title style Section 4 Section 4 Measuring Quality and

Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

89

Click to edit Master title style What about the Product CMMIrsquos primary assumption

ldquoThe quality of a software system is governed by quality of the process used to develop itrdquo

- Watts Humphrey

CMMIrsquos product quality problem

1 Assessments do not verify product quality

2 Compliance focus undermines CMMIrsquos primary assumption

3 Product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI

4 The assumptions underlying control charts are violated by software data

Click to edit Master title style Six Sigmarsquos Challenge

Process focus ndash process improvement ndash Six Sigma Product focus ndash product improvement ndash Design for 6σ

Product focus supplements product focus to unlock even more value from software

Six Sigma projects must have significant benefits Huge benefits

Large benefits

Good benefits

Okay benefits

Small benefits

Now what

Ultimately we run out of projects with significant enough benefits to continue the programhellip

so what is the solution that allows continual improvement

Click to edit Master title style Testing is Not Enough ldquoAs higher levels of assurance are demandedhelliptesting cannot deliver the level of confidence required at a reasonable costrdquo

ldquoThe correctness of the code is rarely the weakest linkrdquo

Jackson D (2009) Spinellis D (2006)

ldquohellipa failure to satisfy a non-functional requirement can be critical even catastrophichellipnon-functional requirements are sometimes difficult to verify We cannot write a test case to verify a systemrsquos reliabilityhellipThe ability to associate code to non-functional properties can be a powerful weapon in a software engineerrsquos arsenalrdquo

Click to edit Master title style CASTrsquos Application Intelligence Platform

Application Analysis

Evaluation of 1200+ coding amp

architectural rules

Application meta-data

Transferability

Changeability

Robustness

Performance

Security

Quality Measurements

Detected Violations

Expensive operation in loop Static vs pooled connections Complex query on big table Large indices on big table

Empty CATCH block Uncontrolled data access Poor memory management Opened resource not closed

SQL injection Cross-site scripting Buffer overflow Uncontrolled format string

Unstructured code Misuse of inheritance Lack of comments Violated naming convention

Highly coupled component Duplicated code Index modified in loop High cyclomatic complexity

Language Parsers

Oracle PLSQL Sybase T-SQL SQL Server T-SQL IBM SQLPSM C C++ C Pro C Cobol CICS Visual Basic VBNet ASPNet Java J2EE JSP XML HTML Javascript VBScript PHP PowerBuilder Oracle Forms PeopleSoft SAP ABAP Netweaver Tibco Business Objects Universal Analyzer for other languages

Click to edit Master title style Modern Apps are a Technology Stack Ar

chite

ctur

al C

ompl

ianc

e

Data Flow Transaction Risk

EJB PLSQL

Oracle

SQL Server

DB2

TSQL

Hibernate

Spring

Struts NET

COBOL

Sybase IMS

Messaging

Integration quality Architectural

compliance Risk propagation Application security Resiliency checks Transaction integrity

Function point Effort estimation Data access control SDK versioning Calibration across

technologies IT organization level

System Level 3

Code style amp layout Expression complexity Code documentation Class or program design Basic coding standards Developer level

Unit Level 1

Java Java Java

Java Java

Web Services Java Java Single languagetechnology layer

Intra-technology architecture Intra-layer dependencies Inter-program invocation Security vulnerabilities Development team level

Technology Level 2

JSP ASPNET APIs

Click to edit Master title style Architecturally Complex Defects

Primary cause of operational

problems

20x as many

fixes to correct

Architecturally Complex Defect

A structural flaw involving interactions among multiple components that reside in different application layers

48

52 92

8 Architecturally

Complex Defects

Code unit-level violations

of total app defects

of total repair effort

Most architecturally complex defects involve an architectural hotspot ndash a badly-built component that should be replaced

Click to edit Master title style Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on the its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort — Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3
OMRON's Reuse Gains

Sakamoto et al. (1996)

[Chart: annual results from 1987 to 1994, plotting program size in steps (0 to 3,000) against percentage of reuse (0% to 100%).]

Section 4: Measuring Quality and Technical Debt

1) Structural quality measurement

2) Technical debt

3) Consortium for IT Software Quality (CISQ)

89

What about the Product?

CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it." — Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.

2. A compliance focus undermines CMMI's primary assumption.

3. The product quality focus at Level 4 in CMM was lost in quantitative process control in CMMI.

4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

Six Sigma projects must have significant benefits: first come the projects with huge benefits, then large, good, okay, and finally small benefits. Now what? Ultimately we run out of projects with benefits significant enough to continue the program... so what is the solution that allows continual improvement?

A product focus supplements the process focus to unlock even more value from software.

Testing is Not Enough

"As higher levels of assurance are demanded... testing cannot deliver the level of confidence required at a reasonable cost." ... "The correctness of the code is rarely the weakest link." — Jackson, D. (2009)

"...a failure to satisfy a non-functional requirement can be critical, even catastrophic... non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability... The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal." — Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality Measurements: Transferability, Changeability, Robustness, Performance, Security.

Detected Violations (examples):

  • Expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table
  • Empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed
  • SQL injection; cross-site scripting; buffer overflow; uncontrolled format string
  • Unstructured code; misuse of inheritance; lack of comments; violated naming convention
  • Highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, Cobol, CICS, Visual Basic, VB.NET, ASP.NET, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, and a Universal Analyzer for other languages.
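
CAST's rule set is proprietary, but the flavor of a structural-quality check is easy to illustrate. Below is a minimal, hypothetical analog of the "empty CATCH block" rule above, written in Python against its own `ast` module (Python's `except` is the analog of CATCH); it is a sketch, not the platform's implementation:

```python
import ast
import textwrap

def find_empty_handlers(source: str):
    """Return (line, message) pairs for exception handlers that swallow
    errors silently -- the Python analog of an 'empty CATCH block'."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ExceptHandler):
            # A handler whose body is a single `pass` hides the failure.
            if len(node.body) == 1 and isinstance(node.body[0], ast.Pass):
                findings.append((node.lineno, "empty except block"))
    return findings

code = textwrap.dedent("""
    try:
        risky()
    except ValueError:
        pass
""")
print(find_empty_handlers(code))  # [(4, 'empty except block')]
```

A real analyzer layers hundreds of such rules over parsed representations of every language in the stack.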

Modern Apps are a Technology Stack

[Diagram: a multi-layer application — JSP, ASP.NET, APIs; Java and Web Services; EJB, PL/SQL, Hibernate, Spring, Struts, .NET, COBOL, and messaging; Oracle, SQL Server, DB2, T-SQL, Sybase, IMS — analyzed for architectural compliance, data flow, and transaction risk across three levels.]

Level 1 — Unit (developer level), single language/technology layer: code style & layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2 — Technology (development team level): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3 — System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity; function point and effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct.

Architecturally complex defects: 8% of total app defects, but 52% of total repair effort.
Code unit-level violations: 92% of total app defects, but 48% of total repair effort.

Most architecturally complex defects involve an architectural hotspot — a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation #284 in Component B is very widely propagated throughout the system — it presents very high risk. The impact of violation #342 in Component A is only narrowly propagated — it presents low risk. Remediating violation #284 in Component B will therefore have the greater impact on reducing risk and cost.
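
As a sketch of the idea (not CAST's proprietary formula), a Propagated Risk Index can be approximated as severity times the number of components transitively reachable from the defective component. The dependency graph, component names, and violation numbers below are hypothetical:

```python
from collections import deque

# Hypothetical dependency graph: component -> components that depend on it.
DEPENDENTS = {
    "B": ["ServiceX", "ServiceY"],
    "ServiceX": ["UI1", "UI2", "Batch"],
    "ServiceY": ["UI2"],
    "A": ["Report"],
}

def impact_set(component):
    """All components transitively affected by a defect in `component`."""
    seen, queue = set(), deque([component])
    while queue:
        for dep in DEPENDENTS.get(queue.popleft(), ()):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

def propagated_risk_index(violation):
    # Risk grows with both the rule's severity and the breadth of impact.
    return violation["severity"] * len(impact_set(violation["component"]))

violations = [
    {"id": 342, "component": "A", "severity": 7},
    {"id": 284, "component": "B", "severity": 7},
]
ranked = sorted(violations, key=propagated_risk_index, reverse=True)
print([v["id"] for v in ranked])  # [284, 342]: violation 284 propagates widely
```

Equal-severity violations thus rank differently once the breadth of propagation is taken into account, which is exactly the point of the slide.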

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path — from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation (in the example, transaction 1 poses the greatest risk), and the Transaction Risk Index can be further tuned using the operational frequency of each transaction.
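
In the same hedged spirit, a Transaction Risk Index can be sketched as the severity-weighted count of violations along a transaction's path, optionally scaled by how often the transaction runs in production. All transaction names and numbers below are illustrative:

```python
def transaction_risk_index(violations, frequency=1.0):
    """Risk of one transaction: number and severity of the violations on
    its path, optionally weighted by operational frequency."""
    return frequency * sum(v["severity"] for v in violations)

# Hypothetical transactions spanning the UI, business logic, and data layers.
transactions = {
    "checkout":  {"violations": [{"severity": 9}, {"severity": 7}, {"severity": 8}],
                  "freq": 500},
    "view_cart": {"violations": [{"severity": 5}], "freq": 2000},
}
ranked = sorted(
    transactions,
    key=lambda name: transaction_risk_index(transactions[name]["violations"],
                                            transactions[name]["freq"]),
    reverse=True,
)
print(ranked)  # ['checkout', 'view_cart']
```

Note how the frequency weighting can promote a lightly flawed but heavily used transaction; here "checkout" still ranks first because its path carries more, and more severe, violations.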

Risk-Based Prioritization

Violations ranked for remediation by health factor, severity, Propagated Risk Index, membership in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
#148        Security          9
#371        Robustness        9
#062        Performance       8
#284        Robustness        7
#016        Changeability     7
#241        Security          5
#173        Transferability   4
#359        Transferability   3
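
The ordering above can be expressed as a composite sort key: severity first, then breadth of propagation, then whether the violation lies on a high-risk transaction, then how often that path executes. The PRI, hot-path, and frequency values below are hypothetical placeholders added for the sketch:

```python
violations = [
    {"id": "148", "factor": "Security",    "severity": 9, "pri": 0.92, "hot_path": True,  "freq": 0.8},
    {"id": "371", "factor": "Robustness",  "severity": 9, "pri": 0.66, "hot_path": True,  "freq": 0.4},
    {"id": "062", "factor": "Performance", "severity": 8, "pri": 0.71, "hot_path": False, "freq": 0.9},
    {"id": "284", "factor": "Robustness",  "severity": 7, "pri": 0.95, "hot_path": True,  "freq": 0.2},
]

# Tuples compare element by element, so this sorts on severity, breaking
# ties with PRI, then hot-path membership, then execution frequency.
ordered = sorted(
    violations,
    key=lambda v: (v["severity"], v["pri"], v["hot_path"], v["freq"]),
    reverse=True,
)
print([v["id"] for v in ordered])  # ['148', '371', '062', '284']
```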

Static Analysis in the US DoD

John Keane (2013):

  • Testing finds less than 1/3 of structural defects.
  • Static analysis before testing is 85% efficient.
  • With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Scatter plot: production incidents (0 to 180) against Technical Quality Index (100 to 400) for business-critical applications at a large international investment bank. The lowest-scoring applications incurred over $2,000,000 in annual losses, mid-range applications about $50,000, and the highest-scoring applications about $5,000.]

Factors Affecting Structural Quality

2017 CRASH Report — request from info@castsoftware.com

Percentage of variation in structural quality scores accounted for by various factors (blank cells: no significant effect):

Factor                  # apps  Robust  Security  Perform  Change  Transfer
Technology (e.g.,
  Java-EE)                1606      20         3       16      26         7
Industry                   677       5         5        7       4         6
App type                   510       1         1        1       1         1
Source                     580       1         1        4
Shore                      540
Maturity                    72      28        25       15      24        12
Method                     208      10         6       14       6
Team size                  186       7         5        8       8
# of users                 262       4         3        5       4

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Structural quality problems in production code create Technical Debt (principal borrowed, interest on the debt) and Business Risk (liability from the debt, opportunity cost):

  • Principal: the cost of fixing problems remaining in the code after release that must be remediated.
  • Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.
  • Liability: business costs related to outages, breaches, corrupted data, etc.
  • Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software
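
Curtis et al. (2012) estimate the principal by multiplying the violations expected to be fixed by the effort and cost of fixing them. The sketch below follows that structure only; every parameter value is illustrative, not CAST's published calibrations:

```python
def technical_debt_principal(violation_counts, pct_to_fix, hours_per_fix, rate_per_hour):
    """Principal = (violations expected to be fixed) x (effort per fix)
    x (labor rate), summed over severity bands. Structure follows
    Curtis et al. (2012); all parameters are caller-supplied assumptions."""
    debt = 0.0
    for severity, count in violation_counts.items():
        debt += count * pct_to_fix[severity] * hours_per_fix[severity] * rate_per_hour
    return debt

# Illustrative inputs for one application.
counts = {"high": 120, "medium": 480, "low": 2200}   # detected violations
pct    = {"high": 0.50, "medium": 0.25, "low": 0.10} # share slated for repair
hours  = {"high": 2.5,  "medium": 1.0,  "low": 0.5}  # average fix effort
print(round(technical_debt_principal(counts, pct, hours, rate_per_hour=75), 2))  # 28500.0
```

Changing the assumed fix percentages or labor rate moves the estimate substantially, which is why published technical-debt figures always state their parameters.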

Tech Debt by Quality Factor

Distribution of technical debt by health factor: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%.

70% of Technical Debt is in IT Cost (Transferability, Changeability); 30% of Technical Debt is in Business Risk (Robustness, Performance, Security). Health Factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software

Join CISQ: www.it-cisq.org

References

115

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

116

References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

117

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Prentice-Hall.

Sinha, M. (2006). Performance models: enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: a business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

118

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort – Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product?
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3

What about the Product?

CMMI's primary assumption:

"The quality of a software system is governed by the quality of the process used to develop it."
– Watts Humphrey

CMMI's product quality problem:

1. Assessments do not verify product quality.

2. A compliance focus undermines CMMI's primary assumption.

3. The product quality focus at Level 4 in the CMM was lost in quantitative process control in CMMI.

4. The assumptions underlying control charts are violated by software data.

Six Sigma's Challenge

Process focus – process improvement – Six Sigma
Product focus – product improvement – Design for 6σ

A product focus supplements the process focus, unlocking even more value from software.

Six Sigma projects must have significant benefits: huge benefits, then large benefits, then good benefits, then okay benefits, then small benefits… now what?

Ultimately we run out of projects with significant enough benefits to continue the program, so what is the solution that allows continual improvement?

Testing is Not Enough

"As higher levels of assurance are demanded… testing cannot deliver the level of confidence required at a reasonable cost."

"The correctness of the code is rarely the weakest link."
– Jackson, D. (2009)

"…a failure to satisfy a non-functional requirement can be critical, even catastrophic… non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability… The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."
– Spinellis, D. (2006)

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding and architectural rules, plus application meta-data.

Quality Measurements (health factors): Robustness, Performance, Security, Transferability, Changeability.

Detected Violations (examples): expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table; empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed; SQL injection; cross-site scripting; buffer overflow; uncontrolled format string; unstructured code; misuse of inheritance; lack of comments; violated naming convention; highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity.

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/Netweaver, Tibco, Business Objects, plus a Universal Analyzer for other languages.
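To make the rule examples concrete, here is a toy sketch of how a single rule such as "empty CATCH block" might be flagged. This is purely illustrative: a real analyzer such as CAST parses the code rather than pattern-matching it, and the regex and function below are assumptions, not part of the platform.

```python
import re

# Toy check for one coding rule: an empty CATCH block (a swallowed exception).
# A naive regex stands in for real parsing; it only handles simple cases.
EMPTY_CATCH = re.compile(r"catch\s*\([^)]*\)\s*\{\s*\}")

def empty_catch_blocks(java_source: str) -> int:
    """Count catch blocks whose body is empty or whitespace-only."""
    return len(EMPTY_CATCH.findall(java_source))

snippet = """
try { risky(); }
catch (IOException e) { }          // violation: exception silently swallowed
catch (SQLException e) { log(e); } // compliant: the exception is handled
"""
violations = empty_catch_blocks(snippet)  # 1 violation in this snippet
```

The gap between this sketch and a production rule engine (nested blocks, comments inside the catch body, language variants) is exactly why platforms build full parsers per language.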

Modern Apps are a Technology Stack

[Figure: a multi-technology application stack – JSP and ASP.NET user-interface layers; Java, .NET, COBOL, and web-services business logic on frameworks such as EJB, Hibernate, Spring, and Struts; Oracle, SQL Server, DB2, Sybase, and IMS data layers reached via PL/SQL and T-SQL; plus messaging – with data flow, architectural compliance, and transaction risk cutting across layers.]

Quality must be assessed at three levels:

Unit – Level 1 (developer level): code style and layout, expression complexity, code documentation, class or program design, basic coding standards.

Technology – Level 2 (development team level, single language/technology layer): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

System – Level 3 (IT organization level, across technologies): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function point and effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and take 20x as many fixes to correct. They account for only 8% of total application defects (code unit-level violations make up the other 92%), yet they consume 52% of total repair effort (versus 48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot – a badly-built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 342 in Component A is only narrowly propagated in the system – it presents low risk. The impact of violation 284 in Component B is very widely propagated throughout the system – it presents very high risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
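The idea behind such an index can be sketched in a few lines of Python. The dependency graph, severity scale, and the severity-times-reach scoring below are illustrative assumptions; the deck does not publish CAST's actual formula.

```python
from collections import deque

def reach(graph, start):
    """Count components impacted from `start` via dependency edges (BFS)."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph.get(queue.popleft(), ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return len(seen) - 1  # exclude the component holding the violation

def propagated_risk_index(violations, graph):
    """violations: (id, component, severity 1-9); rank by severity x reach."""
    scored = [(vid, comp, sev * reach(graph, comp))
              for vid, comp, sev in violations]
    return sorted(scored, key=lambda v: v[2], reverse=True)

# Violation 284 sits in a widely-depended-on component, 342 in a narrow one.
graph = {"B": ["C", "D"], "C": ["E", "F"], "D": ["G"], "A": ["H"]}
ranked = propagated_risk_index([("342", "A", 7), ("284", "B", 7)], graph)
# ranked[0] is ("284", "B", 35): equal severity, far wider propagation
```

Even with identical severities, the violation whose component reaches more of the system ranks first, which is the slide's point.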

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path: from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation (in the slide's example of five transactions, transaction 1 poses the greatest risk), and the Transaction Risk Index can be further tuned using the operational frequency of each transaction.
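A minimal sketch of such a ranking, assuming each transaction is reduced to the list of violation severities along its path; the scoring (sum of severities, optionally weighted by execution frequency) is an illustrative assumption, not the published index.

```python
def transaction_risk_index(transactions, frequency=None):
    """
    transactions: {name: [violation severities along the path]}
    frequency: optional {name: operational executions} used as a weight.
    Returns (name, score) pairs, most risky first.
    """
    freq = frequency or {}
    scores = {name: sum(sevs) * freq.get(name, 1)
              for name, sevs in transactions.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

paths = {"T1": [9, 7, 5], "T2": [4, 3], "T3": [8]}
ranked = transaction_risk_index(paths)             # T1 first: score 21
tuned = transaction_risk_index(paths, {"T2": 10})  # heavy use promotes T2
```

The second call shows the "further tuned" step from the slide: a transaction with modest violations but very high operational frequency can outrank a rarely executed one.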

Risk-Based Prioritization

Violations are prioritized by severity, then by Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
148         Security          9
371         Robustness        9
062         Performance       8
284         Robustness        7
016         Changeability     7
241         Security          5
173         Transferability   4
359         Transferability   3
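The table's ordering can be reproduced with a multi-key sort. Only the violation ids, health factors, and severities come from the slide; the PRI values, transaction flags, and path frequencies below are invented tie-breakers for illustration.

```python
# Hypothetical violation records; fields beyond id/factor/severity are made up.
violations = [
    {"id": "359", "factor": "Transferability", "severity": 3, "pri": 1, "hot_tx": False, "freq": 0.1},
    {"id": "148", "factor": "Security",        "severity": 9, "pri": 9, "hot_tx": True,  "freq": 0.9},
    {"id": "016", "factor": "Changeability",   "severity": 7, "pri": 4, "hot_tx": False, "freq": 0.2},
    {"id": "371", "factor": "Robustness",      "severity": 9, "pri": 8, "hot_tx": True,  "freq": 0.6},
    {"id": "241", "factor": "Security",        "severity": 5, "pri": 5, "hot_tx": False, "freq": 0.3},
    {"id": "062", "factor": "Performance",     "severity": 8, "pri": 6, "hot_tx": True,  "freq": 0.5},
    {"id": "173", "factor": "Transferability", "severity": 4, "pri": 2, "hot_tx": False, "freq": 0.1},
    {"id": "284", "factor": "Robustness",      "severity": 7, "pri": 9, "hot_tx": True,  "freq": 0.8},
]

def priority_key(v):
    # Higher severity first, then wider propagation, then violations that sit
    # in a high-risk transaction, then more frequently executed paths.
    return (-v["severity"], -v["pri"], not v["hot_tx"], -v["freq"])

prioritized = sorted(violations, key=priority_key)
order = [v["id"] for v in prioritized]
```

With these sample tie-breakers, `order` matches the slide's table top to bottom.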

Static Analysis in US DoD

John Keane (2013):

• Testing finds less than 1/3 of structural defects.

• Static analysis before testing is 85% efficient.

• With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

[Figure: scatter plot of production incidents (0–180) against Technical Quality Index (100–400) for business-critical applications at a large international investment bank. The lowest-quality application incurred over $2,000,000 in annual losses, versus $50,000 and $5,000 for higher-quality applications.]

Factors Affecting Structural Quality

Percentage of variation in structural quality scores accounted for by various factors (2017 CRASH Report; request from info@castsoftware.com). Values per factor are listed in the order Robustness, Security, Performance, Changeability, Transferability; blank cells in the original table did not survive extraction:

• Technology (1,606 apps; largest group Java-EE): 20, 3, 16, 26, 7
• Industry (677 apps): 5, 5, 7, 4, 6
• App type (510 apps): 1, 1, 1, 1, 1
• Source (580 apps): 1, 1, 4
• Shore (540 apps): no values extracted
• Maturity (72 apps): 28, 25, 15, 24, 12
• Method (208 apps): 10, 6, 14, 6
• Team size (186 apps): 7, 5, 8, 8
• # of users (262 apps): 4, 3, 5, 4

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership. Structural quality problems in production code make up the debt, and the debt creates business risk as a liability.

• Principal: the cost of fixing problems remaining in the code after release that must be remediated.

• Interest: continuing IT costs attributable to the violations causing technical debt, i.e. higher maintenance costs, greater resource usage, etc.

• Liability: business costs related to outages, breaches, corrupted data, etc.

• Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal. (Curtis et al., 2012, IEEE Software)
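Curtis et al. (2012) estimate the principal from violation counts, the fraction judged worth fixing, fix effort, and labor cost. The sketch below follows that general shape, but the tier structure and every input number are invented for illustration, not figures from the paper.

```python
def technical_debt_principal(tiers):
    """Principal = sum over severity tiers of
    count x fraction-to-fix x hours-per-fix x hourly labor rate."""
    return sum(t["count"] * t["fix_fraction"] * t["hours_per_fix"] * t["rate"]
               for t in tiers)

# Invented inputs: violation counts by severity tier for one application,
# with a blended labor rate of $75/hour.
tiers = [
    {"count": 500,  "fix_fraction": 0.50, "hours_per_fix": 2.5, "rate": 75},  # high severity
    {"count": 2000, "fix_fraction": 0.25, "hours_per_fix": 1.0, "rate": 75},  # medium
    {"count": 8000, "fix_fraction": 0.10, "hours_per_fix": 0.5, "rate": 75},  # low
]
principal = technical_debt_principal(tiers)  # $114,375 for these inputs
```

The point of the structure is that low-severity violations dominate by count but, once weighted by fix fraction and effort, the principal concentrates in the severe tiers.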

Tech Debt by Quality Factor

Distribution of technical debt across health factors: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%, with Performance making up the remainder. Thus 70% of technical debt is IT cost (Transferability, Changeability) and 30% is business risk (Robustness, Performance, Security). Health factor proportions are mostly consistent across technologies. (Curtis et al., 2012, IEEE Software)

Join CISQ: www.it-cisq.org

References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society. Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.


References 2

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).


References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Prentice Hall.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.


  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 64: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Six Sigmarsquos Challenge

Process focus ndash process improvement ndash Six Sigma Product focus ndash product improvement ndash Design for 6σ

Product focus supplements product focus to unlock even more value from software

Six Sigma projects must have significant benefits Huge benefits

Large benefits

Good benefits

Okay benefits

Small benefits

Now what

Ultimately we run out of projects with significant enough benefits to continue the programhellip

so what is the solution that allows continual improvement

Click to edit Master title style Testing is Not Enough ldquoAs higher levels of assurance are demandedhelliptesting cannot deliver the level of confidence required at a reasonable costrdquo

ldquoThe correctness of the code is rarely the weakest linkrdquo

Jackson D (2009) Spinellis D (2006)

ldquohellipa failure to satisfy a non-functional requirement can be critical even catastrophichellipnon-functional requirements are sometimes difficult to verify We cannot write a test case to verify a systemrsquos reliabilityhellipThe ability to associate code to non-functional properties can be a powerful weapon in a software engineerrsquos arsenalrdquo

Click to edit Master title style CASTrsquos Application Intelligence Platform

Application Analysis

Evaluation of 1200+ coding amp

architectural rules

Application meta-data

Transferability

Changeability

Robustness

Performance

Security

Quality Measurements

Detected Violations

Expensive operation in loop Static vs pooled connections Complex query on big table Large indices on big table

Empty CATCH block Uncontrolled data access Poor memory management Opened resource not closed

SQL injection Cross-site scripting Buffer overflow Uncontrolled format string

Unstructured code Misuse of inheritance Lack of comments Violated naming convention

Highly coupled component Duplicated code Index modified in loop High cyclomatic complexity

Language Parsers

Oracle PLSQL Sybase T-SQL SQL Server T-SQL IBM SQLPSM C C++ C Pro C Cobol CICS Visual Basic VBNet ASPNet Java J2EE JSP XML HTML Javascript VBScript PHP PowerBuilder Oracle Forms PeopleSoft SAP ABAP Netweaver Tibco Business Objects Universal Analyzer for other languages

Click to edit Master title style Modern Apps are a Technology Stack Ar

chite

ctur

al C

ompl

ianc

e

Data Flow Transaction Risk

EJB PLSQL

Oracle

SQL Server

DB2

TSQL

Hibernate

Spring

Struts NET

COBOL

Sybase IMS

Messaging

Integration quality Architectural

compliance Risk propagation Application security Resiliency checks Transaction integrity

Function point Effort estimation Data access control SDK versioning Calibration across

technologies IT organization level

System Level 3

Code style amp layout Expression complexity Code documentation Class or program design Basic coding standards Developer level

Unit Level 1

Java Java Java

Java Java

Web Services Java Java Single languagetechnology layer

Intra-technology architecture Intra-layer dependencies Inter-program invocation Security vulnerabilities Development team level

Technology Level 2

JSP ASPNET APIs

Click to edit Master title style Architecturally Complex Defects

Primary cause of operational

problems

20x as many

fixes to correct

Architecturally Complex Defect

A structural flaw involving interactions among multiple components that reside in different application layers

48

52 92

8 Architecturally

Complex Defects

Code unit-level violations

of total app defects

of total repair effort

Most architecturally complex defects involve an architectural hotspot ndash a badly-built component that should be replaced

Click to edit Master title style Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on the its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 65: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Testing is Not Enough ldquoAs higher levels of assurance are demandedhelliptesting cannot deliver the level of confidence required at a reasonable costrdquo

ldquoThe correctness of the code is rarely the weakest linkrdquo

Jackson D (2009) Spinellis D (2006)

ldquohellipa failure to satisfy a non-functional requirement can be critical even catastrophichellipnon-functional requirements are sometimes difficult to verify We cannot write a test case to verify a systemrsquos reliabilityhellipThe ability to associate code to non-functional properties can be a powerful weapon in a software engineerrsquos arsenalrdquo

CAST's Application Intelligence Platform

Application Analysis: evaluation of 1,200+ coding & architectural rules, plus application meta-data.

Quality Measurements (health factors): Transferability, Changeability, Robustness, Performance, Security.

Detected Violations, by health factor:

Performance: expensive operation in loop; static vs. pooled connections; complex query on big table; large indices on big table.

Robustness: empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed.

Security: SQL injection; cross-site scripting; buffer overflow; uncontrolled format string.

Transferability: unstructured code; misuse of inheritance; lack of comments; violated naming convention.

Changeability: highly coupled component; duplicated code; index modified in loop; high cyclomatic complexity.

Language Parsers: Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.NET, ASP.NET, Java, J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP, NetWeaver, Tibco, Business Objects, and a Universal Analyzer for other languages.

Modern Apps are a Technology Stack

Diagram: architectural compliance, data flow, and transaction risk traced across a stack of technologies (JSP, ASP.NET, APIs, Java, Web Services, EJB, PL/SQL, T-SQL, Hibernate, Spring, Struts, .NET, COBOL, messaging, and databases such as Oracle, SQL Server, DB2, Sybase, and IMS). Structural quality is assessed at three levels:

Level 1 – Unit (developer level): code style & layout, expression complexity, code documentation, class or program design, basic coding standards.

Level 2 – Technology (development team level): a single language/technology layer; intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

Level 3 – System (IT organization level): integration quality, architectural compliance, risk propagation, application security, resiliency checks, transaction integrity, function points, effort estimation, data access control, SDK versioning, calibration across technologies.

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

These defects are the primary cause of operational problems and require roughly 20x as many fixes to correct. Although architecturally complex defects account for only 8% of total application defects (code unit-level violations account for the other 92%), they consume 52% of total repair effort (versus 48% for unit-level violations).

Most architecturally complex defects involve an architectural hotspot – a badly built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system, so it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated, so it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
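The idea behind a propagated-risk ranking can be sketched in a few lines: a violation's score grows with its severity and with the number of components that transitively depend on the violating component. This is a minimal illustration under stated assumptions, not CAST's implementation; the dependency graph and component names below are hypothetical.

```python
from collections import deque

def impact_breadth(deps, component):
    """Count components that directly or transitively depend on `component`.

    `deps` maps each component to the components it calls; we walk the
    reverse edges breadth-first to find everything the flaw can affect.
    """
    reverse = {}
    for src, targets in deps.items():
        for t in targets:
            reverse.setdefault(t, set()).add(src)
    seen, queue = set(), deque([component])
    while queue:
        node = queue.popleft()
        for caller in reverse.get(node, ()):
            if caller not in seen:
                seen.add(caller)
                queue.append(caller)
    return len(seen)

def propagated_risk_index(severity, deps, component):
    # Risk grows with both the severity of the violation and the
    # breadth of its propagation through the system.
    return severity * impact_breadth(deps, component)

# Hypothetical system: B is called from two places, A from none.
deps = {"UI": ["Logic"], "Logic": ["B"], "Batch": ["B"], "B": [], "A": []}
print(propagated_risk_index(7, deps, "B"))  # → 21 (breadth 3)
print(propagated_risk_index(7, deps, "A"))  # → 0  (breadth 0)
```

A real index would also weight the kinds of dependencies (calls, data access, inheritance), but the ordering logic is the same: wide propagation outranks narrow propagation at equal severity.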

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. A transaction's risk is determined by the number and severity of violations along its path: from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation; in the example, transaction 1 of the five shown poses the greatest risk. The Transaction Risk Index can be further tuned using the operational frequency of each transaction.
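The ranking rule described above can be sketched as a sum of violation severities along a transaction's path, optionally weighted by how often the transaction runs. A minimal illustration, with hypothetical component names and severities:

```python
def transaction_risk_index(path, violations, frequency=1.0):
    """Sum the severities of violations along a transaction's path,
    optionally weighted by the transaction's operational frequency."""
    base = sum(sev for comp in path for sev in violations.get(comp, []))
    return base * frequency

# Hypothetical components spanning UI, business logic, and data layers,
# each mapped to the severities of violations found in it.
violations = {"login.jsp": [9], "AcctLogic": [7, 5], "ACCOUNTS": [8]}
t1 = transaction_risk_index(["login.jsp", "AcctLogic", "ACCOUNTS"], violations)
t2 = transaction_risk_index(["help.jsp"], violations)
print(sorted([("T1", t1), ("T2", t2)], key=lambda t: t[1], reverse=True))
# → [('T1', 29.0), ('T2', 0.0)]
```

Passing a measured execution frequency for each path is the "further tuning" the slide mentions: a moderately risky transaction that runs thousands of times a day can outrank a riskier one that runs monthly.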

Risk-Based Prioritization

Each violation is ranked by its severity together with its Propagated Risk Index, whether it lies in a high-risk transaction, and the frequency of path execution:

Violation   Health Factor    Severity
148         Security         9
371         Robustness       9
062         Performance      8
284         Robustness       7
016         Changeability    7
241         Security         5
173         Transferability  4
359         Transferability  3
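One simple way to realize this prioritization is a lexicographic sort over the columns the slide lists. The records below are hypothetical examples, not CAST data; only the severity values echo the table above.

```python
# Hypothetical violation records: (id, health_factor, severity,
# propagated_risk, in_high_risk_txn, path_frequency).
violations = [
    ("148", "Security",        9, 40, True,  0.9),
    ("062", "Performance",     8, 12, False, 0.4),
    ("284", "Robustness",      7, 55, True,  0.7),
    ("359", "Transferability", 3,  2, False, 0.1),
]

# Order the remediation backlog: severity first, then propagated risk,
# then presence in a high-risk transaction, then execution frequency.
prioritized = sorted(
    violations,
    key=lambda v: (v[2], v[3], v[4], v[5]),
    reverse=True,
)
print([v[0] for v in prioritized])  # → ['148', '062', '284', '359']
```

A weighted composite score could replace the lexicographic key if an organization wants, say, propagated risk to outweigh raw severity; the sort machinery stays the same.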

Static Analysis in US DoD

John Keane (2013):

Testing finds less than 1/3 of structural defects.

Static analysis before testing is 85% efficient.

With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Reducing Operational Losses

Chart: production incidents (0–180) plotted against Technical Quality Index (100–400) for business-critical applications at a large international investment bank. Annotated annual losses range from more than $2,000,000 for the lowest-quality applications, through $50,000, down to $5,000 for the highest-quality applications.

Factors Affecting Structural Quality

2017 CRASH Report; request from info@castsoftware.com.

Percentage of variation in structural quality scores accounted for by each factor (Robustness / Security / Performance / Changeability / Transferability):

Technology, e.g. Java-EE (1,606 apps): 20 / 3 / 16 / 26 / 7
Industry (677 apps): 5 / 5 / 7 / 4 / 6
App type (510 apps): 1 / 1 / 1 / 1 / 1
Source (580 apps): 1 / 1 / 4
Shore (540 apps): –
Maturity (72 apps): 28 / 25 / 15 / 24 / 12
Method (208 apps): 10 / 6 / 14 / 6
Team size (186 apps): 7 / 5 / 8 / 8
Number of users (262 apps): 4 / 3 / 5 / 4

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Structural quality problems in production code constitute the debt:

Principal: the cost of fixing problems remaining in the code after release that must be remediated.

Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

The debt also creates business risk:

Liability: business costs related to outages, breaches, corrupted data, etc.

Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software.
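In the spirit of Curtis et al. (2012), the principal can be estimated by multiplying violation counts by the fraction expected to be remediated, the hours to fix each, and a labor rate. The parameters below are illustrative assumptions, not the paper's published calibration:

```python
# Illustrative parameters (assumed, not CAST's calibration): hours to fix
# a violation and the fraction expected to be remediated, by severity.
HOURS = {"high": 2.0, "medium": 1.0, "low": 0.5}
FIX_FRACTION = {"high": 0.5, "medium": 0.25, "low": 0.1}
COST_PER_HOUR = 75.0

def technical_debt_principal(violation_counts):
    """Principal: the cost of fixing problems left in the code at release."""
    return sum(
        count * FIX_FRACTION[sev] * HOURS[sev] * COST_PER_HOUR
        for sev, count in violation_counts.items()
    )

# An application with 1,500 outstanding violations by severity.
print(technical_debt_principal({"high": 100, "medium": 400, "low": 1000}))
# → 18750.0
```

Interest would be modeled separately, as the recurring maintenance and resource cost incurred each period the violations remain unfixed.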

Tech Debt by Quality Factor

Distribution of technical debt: Transferability 40%, Changeability 30%, Robustness 18%, Security 7%.

70% of technical debt is in IT cost (Transferability, Changeability); 30% is in business risk (Robustness, Performance, Security). Health Factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software.

Join CISQ: www.it-cisq.org

Section: References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 66: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style CASTrsquos Application Intelligence Platform

Application Analysis

Evaluation of 1200+ coding amp

architectural rules

Application meta-data

Transferability

Changeability

Robustness

Performance

Security

Quality Measurements

Detected Violations

Expensive operation in loop Static vs pooled connections Complex query on big table Large indices on big table

Empty CATCH block Uncontrolled data access Poor memory management Opened resource not closed

SQL injection Cross-site scripting Buffer overflow Uncontrolled format string

Unstructured code Misuse of inheritance Lack of comments Violated naming convention

Highly coupled component Duplicated code Index modified in loop High cyclomatic complexity

Language Parsers

Oracle PLSQL Sybase T-SQL SQL Server T-SQL IBM SQLPSM C C++ C Pro C Cobol CICS Visual Basic VBNet ASPNet Java J2EE JSP XML HTML Javascript VBScript PHP PowerBuilder Oracle Forms PeopleSoft SAP ABAP Netweaver Tibco Business Objects Universal Analyzer for other languages

Click to edit Master title style Modern Apps are a Technology Stack Ar

chite

ctur

al C

ompl

ianc

e

Data Flow Transaction Risk

EJB PLSQL

Oracle

SQL Server

DB2

TSQL

Hibernate

Spring

Struts NET

COBOL

Sybase IMS

Messaging

Integration quality Architectural

compliance Risk propagation Application security Resiliency checks Transaction integrity

Function point Effort estimation Data access control SDK versioning Calibration across

technologies IT organization level

System Level 3

Code style amp layout Expression complexity Code documentation Class or program design Basic coding standards Developer level

Unit Level 1

Java Java Java

Java Java

Web Services Java Java Single languagetechnology layer

Intra-technology architecture Intra-layer dependencies Inter-program invocation Security vulnerabilities Development team level

Technology Level 2

JSP ASPNET APIs

Click to edit Master title style Architecturally Complex Defects

Primary cause of operational

problems

20x as many

fixes to correct

Architecturally Complex Defect

A structural flaw involving interactions among multiple components that reside in different application layers

48

52 92

8 Architecturally

Complex Defects

Code unit-level violations

of total app defects

of total repair effort

Most architecturally complex defects involve an architectural hotspot ndash a badly-built component that should be replaced

Click to edit Master title style Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on the its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 67: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Modern Apps are a Technology Stack Ar

chite

ctur

al C

ompl

ianc

e

Data Flow Transaction Risk

EJB PLSQL

Oracle

SQL Server

DB2

TSQL

Hibernate

Spring

Struts NET

COBOL

Sybase IMS

Messaging

Integration quality Architectural

compliance Risk propagation Application security Resiliency checks Transaction integrity

Function point Effort estimation Data access control SDK versioning Calibration across

technologies IT organization level

System Level 3

Code style amp layout Expression complexity Code documentation Class or program design Basic coding standards Developer level

Unit Level 1

Java Java Java

Java Java

Web Services Java Java Single languagetechnology layer

Intra-technology architecture Intra-layer dependencies Inter-program invocation Security vulnerabilities Development team level

Technology Level 2

JSP ASPNET APIs

Click to edit Master title style Architecturally Complex Defects

Primary cause of operational

problems

20x as many

fixes to correct

Architecturally Complex Defect

A structural flaw involving interactions among multiple components that reside in different application layers

48

52 92

8 Architecturally

Complex Defects

Code unit-level violations

of total app defects

of total repair effort

Most architecturally complex defects involve an architectural hotspot ndash a badly-built component that should be replaced

Click to edit Master title style Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on the its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Join CISQ: www.it-cisq.org

Section: References

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J. & Rombach, D. (2014). Aligning Organizations Through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice Hall.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: a business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor: Dr. Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort - Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What About the Product?
  • Six Sigma's Challenge
  • Testing Is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps Are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents & Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3

Architecturally Complex Defects

Architecturally Complex Defect: a structural flaw involving interactions among multiple components that reside in different application layers.

Architecturally complex defects are the primary cause of operational problems and require 20x as many fixes to correct. They account for 8% of total application defects but 52% of total repair effort; code unit-level violations account for the remaining 92% of defects and 48% of repair effort.

Most architecturally complex defects involve an architectural hotspot: a badly built component that should be replaced.

Propagated Risk

Propagated Risk Index: a ranking of the risk created by each violation, based on its severity and the breadth of its impact in the system.

The impact of violation 284 in Component B is very widely propagated throughout the system: it presents very high risk. The impact of violation 342 in Component A is only narrowly propagated: it presents low risk. Remediating violation 284 in Component B will therefore have the greater impact on reducing risk and cost.
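An index of this kind can be sketched as severity weighted by how far a violation's impact reaches through the dependency graph. The traversal below is an illustrative reconstruction, not CAST's implementation; the component names, graph, and scores are invented:

```python
# Illustrative Propagated Risk Index: severity x breadth of impact, where
# breadth = number of components that depend, directly or transitively,
# on the component containing the violation. Example data is invented.
from collections import deque

def impact_breadth(dependents, component):
    """Count components reachable through reverse dependencies (BFS)."""
    seen, queue = set(), deque([component])
    while queue:
        node = queue.popleft()
        for caller in dependents.get(node, ()):
            if caller not in seen:
                seen.add(caller)
                queue.append(caller)
    return len(seen)

def rank_violations(violations, dependents):
    """Return (score, violation id) pairs, highest propagated risk first."""
    scored = [(sev * impact_breadth(dependents, comp), vid)
              for vid, comp, sev in violations]
    return sorted(scored, reverse=True)

# Component B is depended on widely; Component A only narrowly.
dependents = {"A": ["X"], "B": ["X", "Y", "Z"], "X": ["Y"], "Z": ["W"]}
violations = [("342", "A", 7), ("284", "B", 7)]  # (id, component, severity)
print(rank_violations(violations, dependents))   # violation 284 ranks first
```

Even with equal severities, the violation in the widely-depended-on component dominates the ranking, which is exactly the point of propagating risk through the system structure.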

Transaction Risk

Transaction Risk Index: a ranking of transactions based on the risk created by the number and severity of violations along the transaction path.

Customers care most about the dependability of their transactions. The risk of a transaction is determined by the number and severity of violations along its path, from entry/exit in the user interface layer, through processing in the business logic layer, to fulfillment in the data storage layer. Transactions can be ranked by risk to prioritize their remediation, and the index can be further tuned using the operational frequency of each transaction. In the slide's example of five transactions, Transaction 1 poses the greatest risk.
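That computation can be sketched directly. This is a hypothetical reconstruction: the transaction names, violation lists, and frequencies below are invented, with severities on the same 1-9 scale the slides use:

```python
# Illustrative Transaction Risk Index: frequency-weighted sum of the
# severities of the violations along each transaction's path.

def transaction_risk(path_violations, frequency=1.0):
    """Risk score for one transaction path."""
    return frequency * sum(severity for _, severity in path_violations)

transactions = {
    # name: (violations on path as (id, severity), operational frequency)
    "T1": ([("148", 9), ("371", 9), ("062", 8)], 0.5),
    "T2": ([("241", 5)], 0.9),
}
ranked = sorted(transactions,
                key=lambda name: transaction_risk(*transactions[name]),
                reverse=True)
print(ranked)  # ['T1', 'T2']: T1 carries more severity-weighted risk
```

Leaving `frequency` at its default gives the plain index; passing the measured execution frequency implements the tuning step described above.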

Risk-Based Prioritization

Violations are prioritized by Health Factor and severity, together with their Propagated Risk Index, presence in a high-risk transaction, and frequency of path execution:

Violation   Health Factor     Severity
148         Security             9
371         Robustness           9
062         Performance          8
284         Robustness           7
016         Changeability        7
241         Security             5
173         Transferability      4
359         Transferability      3

Static Analysis in US DoD

John Keane (2013): testing finds less than 1/3 of structural defects, while static analysis before testing is 85% efficient. With testing and static analysis combined, test schedules are reduced by 50%.

Reducing Operational Incidents & Costs

A study of structural quality measures and maintenance effort across 20 customers of a large global system integrator found that a Total Quality Index (TQI) increase of 24% decreased corrective maintenance effort by 50%.

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 69: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Propagated Risk

The impact of violation 284 in Component B is very widely propagated throughout the system ndash

it presents very high risk

The impact of violation 342 in Component A is only narrowly propagated in the system

ndash it presents low risk

Propagated Risk Index

A ranking of the risk created by each violation based on the its severity and the breadth of its impact in the system

Violation 342 in component A

Violation 284 in component B

Remediating violation 284 in Component B will have

the greater impact on reducing risk and cost

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 70: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Transaction Risk

Customers care most about the dependability of their transactions

The risk of a transaction is determined by the number and severity of violations along its path

Transactions can be ranked by risk to prioritize their remediation

The Transaction Risk Index can be further tuned using the operational frequency of each transaction

Transaction Risk Index

A ranking of transactions based on the risk created by the number and severity of violations along the transaction path

Transaction entryexit in the user interface layer

Transaction fulfillment in the data storage layer

1 3 2 4 5

Transaction 1 poses greatest risk

Transaction processing in the business logic layer

Click to edit Master title style Risk-Based Prioritization

Violation Health Factor Severity Propagated Risk Index

In a high risk transaction

Frequency of path execution

148 Security 9

371 Robustness 9

062 Performance 8

284 Robustness 7

016 Changeability 7

241 Security 5

173 Transferability 4

359 Transferability 3

Click to edit Master title style Static Analysis in US DoD

John Keane (2013)

Testing finds lt13 of structural defects

Static analysis before testing is 85 efficient

With testing and static analysis combined test schedules are reduced by 50

Click to edit Master title style Reducing Operational Incidents amp Costs

Study of structural quality measures and maintenance effort across 20 customers in a large global system integrator

TQI increase of 24 decreased corrective maintenance effort by 50

Total Quality Index

Click to edit Master title style Reducing Operational Losses

0

20

40

60

80

100

120

140

160

180

100 150 200 250 300 350 400

Technical Quality Index

Pro

du

ctio

n In

cid

ents

+$2000000 Annual Loss

$50000 Annual Loss

$5000 Annual Loss

Large international investment bank Business critical applications

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

The Technical Debt Metaphor

Technical Debt: the future cost of fixing defects remaining in code at release; a component of the cost of ownership.

Structural quality problems in production code make up the debt:

  • Principal: the cost of fixing problems remaining in the code after release that must be remediated.
  • Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Business Risk: the liability arising from the debt:

  • Liability: business costs related to outages, breaches, corrupted data, etc.
  • Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software.
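Curtis et al. (2012) estimate the principal by tiering violations by severity and multiplying each tier's count by the fraction expected to be fixed, the hours per fix, and a labor rate. The sketch below follows that shape; every parameter value here is a placeholder for illustration, not the paper's calibrated figure.

```python
# Principal ≈ sum over severity tiers of
#   violations * fraction_remediated * hours_per_fix * cost_per_hour
# All numbers below are illustrative placeholders.
TIERS = {
    "high":   {"violations": 120, "fix_fraction": 0.50, "hours_per_fix": 2.5},
    "medium": {"violations": 460, "fix_fraction": 0.25, "hours_per_fix": 1.0},
    "low":    {"violations": 900, "fix_fraction": 0.10, "hours_per_fix": 0.5},
}
COST_PER_HOUR = 75.0  # loaded hourly rate, placeholder

def technical_debt_principal(tiers=TIERS, rate=COST_PER_HOUR) -> float:
    """Dollar estimate of the remediation cost remaining in the code."""
    return sum(
        t["violations"] * t["fix_fraction"] * t["hours_per_fix"] * rate
        for t in tiers.values()
    )
```

The fix fractions encode the policy that not every violation is worth remediating; raising them (or the rate) scales the principal linearly, which makes the estimate easy to stress-test.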

Tech Debt by Quality Factor

Distribution of technical debt across health factors:

  • Transferability: 40%
  • Changeability: 30%
  • Robustness: 18%
  • Security: 7%
  • Performance: 5%

70% of Technical Debt is in IT Cost (Transferability, Changeability).
30% of Technical Debt is in Business Risk (Robustness, Performance, Security).

Health Factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software.

Join CISQ: www.it-cisq.org

References

115

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Munch, J. & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20 (9), 43-57.

Boehm, B. et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M. & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J. & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29 (6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10 (4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

116

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D. & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J. & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52 (4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16 (4).

117

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J. et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B. & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K. & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort: the Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 74: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Reducing Operational Losses

[Chart: Production Incidents (y-axis, 0-180) plotted against Technical Quality Index (x-axis, 100-400), with applications annotated at +$2,000,000, $50,000, and $5,000 annual loss.]

Large international investment bank; business-critical applications.

Factors Affecting Structural Quality

2017 CRASH Report (request from info@castsoftware.com)

Factor        # apps   Robustness  Security  Performance  Changeability  Transferability
Technology     1,606           20         3           16             26                7
  (Java-EE)
Industry         677            5         5            7              4                6
App type         510            1         1            1              1                1
Source           580            1         1            4
Shore            540
Maturity          72           28        25           15             24               12
Method           208           10         6           14              6
Team size        186            7         5            8              8
# of users       262            4         3            5              4

Percentage of variation in structural quality scores accounted for by various factors.

The Technical Debt Metaphor

Structural quality problems in production code constitute Technical Debt (principal borrowed, interest on the debt) and Business Risk (liability from the debt, opportunity cost).

Interest: continuing IT costs attributable to the violations causing technical debt, i.e., higher maintenance costs, greater resource usage, etc.

Principal: the cost of fixing problems remaining in the code after release that must be remediated.

Opportunity cost: benefits that could have been achieved had resources been put on new capability rather than retiring technical debt.

Liability: business costs related to outages, breaches, corrupted data, etc.

Technical Debt is thus the future cost of fixing defects remaining in code at release, a component of the cost of ownership. Today's talk focuses on the principal.

Curtis et al. (2012), IEEE Software.
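In this formulation the principal can be estimated by pricing out the effort to remediate the violations left in the code. The sketch below is a minimal illustration of that idea; the violation counts, fix-hours, remediation fractions, and hourly rate are all hypothetical figures, not the calibrated values from the paper:

```python
# Sketch: technical-debt principal = cost of fixing structural
# violations remaining in code at release.
# All figures below are hypothetical illustrations.

violations = {
    # severity: (count of violations, hours to fix each)
    "high":   (120, 2.5),
    "medium": (480, 1.0),
    "low":    (900, 0.5),
}

COST_PER_HOUR = 75  # assumed blended developer rate, USD/hour
# Assumed share of violations in each severity that would actually be fixed.
FIX_FRACTION = {"high": 1.0, "medium": 0.5, "low": 0.1}

def principal(violations, fix_fraction, rate):
    """Sum over severities: count * fraction fixed * hours per fix * rate."""
    return sum(
        count * fix_fraction[sev] * hours * rate
        for sev, (count, hours) in violations.items()
    )

debt = principal(violations, FIX_FRACTION, COST_PER_HOUR)
print(f"Estimated technical-debt principal: ${debt:,.0f}")  # -> $43,875
```

Varying the remediation fractions and rate is how such a model is tuned to a given organization; the structure of the computation stays the same.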

Tech Debt by Quality Factor

Transferability: 40%
Changeability: 30%
Robustness: 18%
Security: 7%

70% of Technical Debt is in IT Cost (Transferability, Changeability).
30% of Technical Debt is in Business Risk (Robustness, Performance, Security).

Health Factor proportions are mostly consistent across technologies.

Curtis et al. (2012), IEEE Software.
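The IT Cost vs. Business Risk split is just a regrouping of the per-factor shares. Performance's share is not listed on the slide, so it is inferred here as the remainder (100 - 40 - 30 - 18 - 7 = 5), which is consistent with the stated 30% Business Risk total:

```python
# Shares of technical-debt principal by health factor (Curtis et al., 2012).
# Performance's share is inferred as the remainder so the shares sum to 100.
shares = {
    "Transferability": 40,
    "Changeability": 30,
    "Robustness": 18,
    "Security": 7,
    "Performance": 5,  # inferred: 100 - (40 + 30 + 18 + 7)
}

# Regroup into the two categories from the slide.
it_cost = shares["Transferability"] + shares["Changeability"]
business_risk = shares["Robustness"] + shares["Performance"] + shares["Security"]

print(f"IT cost: {it_cost}%")              # prints "IT cost: 70%"
print(f"Business risk: {business_risk}%")  # prints "Business risk: 30%"
```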

Join CISQ: www.it-cisq.org

Section: References

115

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J., & Rombach, D. (2014). Aligning Organizations through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20(9), 43-57.

Boehm, B., et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M., & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J., & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29(6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10(4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M. & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

116

References 2

Haley, T., Ireland, B., Wojtaszek, E., Nash, D., & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Ebert, C. & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E. & Davis, N. (2005). TSP/PSP at Intuit. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D. & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. Proceedings of TSP 2005. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J., & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52(4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16(4).

117

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J., et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B., & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M. & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K., & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A. & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: enabling organizations' prediction capability. Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K. & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice Hall.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S. & Marshall, R. (2012). Using TSP at the end of the development cycle. Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: a business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort - Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1. Evaluate Demographics
  • 2. Segment Baselines
  • 3. Beware Sampling Effects
  • 4. Understand Variation
  • 5. Investigate Distributions
  • 6. Account for Maturity Effects
  • 7. Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3
Page 75: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Factors Affecting Structural Quality

2017 CRASH Report Request from infocastsoftwarecom

Factor apps Robust Security Perform Change Transfer

Technology 1606 20 3 16 26 7

Java-EE

Industry 677 5 5 7 4 6

App type 510 1 1 1 1 1

Source 580 1 1 4

Shore 540

Maturity 72 28 25 15 24 12

Method 208 10 6 14 6

Team size 186 7 5 8 8

of users 262 4 3 5 4

Percentage of variation in structural quality scores accounted for by various factors

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 76: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style The Technical Debt Metaphor

Structural quality problems in production code

Technical Debt

Principal borrowed

Interest on the debt

Business Risk Liability from debt

Opportunity cost

Interestmdashcontinuing IT costs attributable to the violations causing technical debt ie higher maintenance costs greater resource usage etc

Principalcost of fixing problems remaining in the code after release that must be remediated

Opportunity costmdashbenefits that could have been achieved had resources been put on new capability rather than retiring technical debt

Liabilitymdashbusiness costs related to outages breaches corrupted data etc

Technical Debt the future cost of fixing defects remaining in code at release a component of the cost of ownership

Todayrsquos talk focuses on the principal

Curtis et al (2012) IEEE Software

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 77: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Tech Debt by Quality Factor

Transferability

40

Changeability

30

Security 7

Robustness

18

70 of Technical Debt is in IT Cost (Transferability Changeability)

30 of Technical Debt is in Business Risk

(Robustness Performance Security)

Health Factor proportions are mostly

consistent across technologies

Curtis et al (2012) IEEE Software

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

References 1

Anderson, D.J. (2010). Kanban. Sequim, WA: Blue Hole Press.

Basili, V., Trendowicz, A., Kowalczyk, M., Heidrich, J., Seaman, C., Münch, J., & Rombach, D. (2014). Aligning Organizations Through Measurement. Heidelberg: Springer.

Boehm, B. (1987). Increasing software productivity. IEEE Computer, 20(9), 43-57.

Boehm, B., et al. (2000). Software Cost Estimation with COCOMO II. Prentice-Hall.

Chrissis, M.B., Konrad, M., & Shrum, S. (2011). CMMI, Third Edition: Guidelines for Process Integration and Product Improvement. Reading, MA: Addison-Wesley.

Cohn, M. (2006). Agile Estimating and Planning. Upper Saddle River, NJ: Prentice-Hall.

Crosby, P. (1979). Quality Is Free. New York: McGraw-Hill.

Curtis, B., Sappidi, J., & Szynkarski, A. (2012). Estimating the principal of an application's technical debt. IEEE Software, 29(6), 34-42.

Dion, R. (1993). Process improvement and the corporate balance sheet. IEEE Software, 10(4), 28-35.

Duncker, R. (1992). Presentation at the 25th Annual Conference of the Singapore Computer Society, Singapore, November 1992.

Dyne, K. (1999). ESSI benefits from continuous software improvements. In Proceedings of ESEPG'99. Milton Keynes, UK: ESPI Foundation.

Flowe, R.M., & Thordahl, J.B. (1994). A Correlational Study of the SEI's Capability Maturity Model and Software Development Performance in DoD Contracts (AFIT/GSS/LAR/94D-2). Dayton, OH: Air Force Institute of Technology, Wright-Patterson Air Force Base.

References 2

Ebert, C., & Dumke, R. (2007). Software Measurement. Berlin: Springer-Verlag.

Fagan, E., & Davis, N. (2005). TSP/PSP at Intuit. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Fenton, N. (2006). New directions for software metrics. CIO Symposium on Software Best Practices.

Garmus, D., & Herron, D. (2001). Function Point Analysis. Boston: Addison-Wesley.

George, E. (2005). Project management with TSP. In Proceedings of TSP 2005. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Haley, T., Ireland, B., Wojtaszek, E., Nash, D., & Dion, R. (1995). Raytheon Electronic Systems Experience in Software Process Improvement (Tech. Rep. CMU/SEI-95-TR-017). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Herbsleb, J., Carleton, A., Rozum, J., Siegel, J., & Zubrow, D. (1994). Benefits of CMM-Based Software Process Improvement: Initial Results (Tech. Rep. CMU/SEI-94-TR-13). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Humphrey, W. (1997). Introduction to the Personal Software Process. Boston: Addison-Wesley.

Humphrey, W. (2000). Introduction to the Team Software Process. Boston: Addison-Wesley.

Humphrey, W. (2006). TSP: Leading a Development Team. Boston: Addison-Wesley.

Jackson, D. (2009). A direct path to dependable software. Communications of the ACM, 52(4).

Kan, S.H. (1995). Metrics and Models in Software Quality Engineering. Reading, MA: Addison-Wesley.

Keeni, B. (2000). IEEE Software, 16(4).

References 3

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J., et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B., & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M., & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K., & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A., & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Schwaber, K., & Beedle, M. (2002). Agile Software Development Using Scrum.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG'06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Spinellis, D. (2006). Code Quality. Addison-Wesley.

Van Epps, S., & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort — Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosby's Cost of Quality Model
  • Raytheon's Cost of Quality
  • Raytheon's Productivity
  • Raytheon's Cost Reduction
  • Raytheon's Cost Predictability
  • OMRON's Reuse Gains
  • Section 4
  • What about the Product
  • Six Sigma's Challenge
  • Testing is Not Enough
  • CAST's Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ: www.it-cisq.org
  • Section
  • References 1
  • References 2
  • References 3
Page 78: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Join CISQ wwwit-cisqorg

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 79: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style Section

References

115

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 80: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style References 1 Anderson DJ (2010) Kanban Sequim WA Blue Hole Press

Basili V Trendowicz A Kowalczyk M Heidrich J Seaman C Munch J amp Rombach D (2014) Aligning Organizations through Measurement Heidelberg Springer

Boehm B (1987) Increasing software productivity IEEE Computer 20 (9) 43-57

Boehm B et al (2000) Software Cost Estimation with COCOMO II Prentice-Hall

Chrissis MB Konrad M amp Shrum S (2011) CMMI Third Edition Guidelines for Process Integration and Product Improvement Reading MA Addison Wesley

Cohn M (2006) Agile Estimating and Planning Upper Saddle River NJ Prentice-Hall

Crosby P (1979) Quality Is Free New York McGraw-Hill

Curtis B Sapiddi J amp Szynkarski A (2012) Estimating the principle of an applicationrsquos technical debt IEEE Software 29 (6) 34-42

Dion R (1993) Process improvement and the corporate balance sheet IEEE Software 10 (4) 28-35

Duncker R (1992) Presentation at the 25th Annual Conference of the Singapore Computer Society Singapore November 1992

Dyne K (1999) ESSI Benefits from Continuous Software Improvements In Proceedings of ESEPGrsquo99 Milton Keyes UK ESPI Foundation

Flowe RM amp Thordahl JB (1994) A Correlational Study of the SEIrsquos Capability Maturity Model and Software Development Performance in DoD Contracts (AFITGSSLAR94D-2) Dayton OH Air Force Institute of Technology Wright Patterson Air Force Base

116

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 81: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

Click to edit Master title style References 2 Haley T Ireland B Wojtaszek E Nash D amp Dion R (1995) Raytheon Electronic Systems

Experience in Software Process Improvement (Tech Rep CMUSEI-95-TR-017) ) Pittsburgh Software Engineering Institute Carnegie Mellon University

Ebert C amp Dumke R (2007) Software Measurement Berlin Springer-Verlag

Fagan E amp Davis N (2005) TSP PSP at Intuit Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Fenton N (2006) New directions for Software Metrics CIO Symposium on Software Best Practices

Garmus D amp Herron D (2001) Function Point Analysis Boston Addison-Wesley

George E (2005) Project management with TSP Proceedings of the TSP 2005 Pittsburgh PA Software Engineering Institute Carnegie Mellon University

Herbsleb J Carleton A Rozum J Siegel J amp Zubrow D (1994) Benefits of CMM-Based Software Process Improvement Initial Results (Tech Rep CMUSEI-94-TR-13) Pittsburgh Software Engineering Institute Carnegie Mellon University

Humphrey W (1997) Introduction to the Personal Software Process Boston Addison Wesley

Humphrey W (2000) Introduction to the Team Software Process Boston Addison Wesley

Humphrey W (2006) TSP Leading a Development Team Boston Addison Wesley

Jackson D (2009) A direct path to dependable software Communications of the ACM 52 (4)

Kan SH (1995) Metrics and Models in Software Quality Engineering Reading MA Addison-Wesley

Keeni B (2000) IEEE Software 16 (4)

117

Click to edit Master title style References 2 Larman C (2004) Agile and Iterative Development Boston Addison Wesley

Lydon T (1995) Productivity drivers Process and capital In Proceedings of the 1995 SEPG Conference Pittsburgh Software Engineering Institute Carnegie Mellon University

McGarry J et al (2002) Practical Software Measurement Reading MA Addison Wesley

Paulk M Weber C Curtis B amp Chrissis M (1995) The Capability Maturity Model Guidelines for Improving the Software Process Reading MA Addison-Wesley

Peterman W (2000) IEEE Software JulyAugust issue

Poppendieck M amp Poppendieck T (2007) Implementing Lean Software Development Boston Addison-Wesley

Sakamoto K Kishida K amp Nakakoji K (1996) Cultural adaptation of the CMM In Fuggetta A amp Wolf A (Eds) Software Process Chichester UK Wiley 137-154

Sinha M (2006) Performance models ndash Enabling organizations prediction capability Proceedings of SEPGrsquo06 Pittsburgh Software Engineering Institute Carnegie Mellon University

Schwaber K amp Beedle M (2002) Agile Software Development Using Scrum

Spinellis D (2006) Code Quality Addison-Wesley

Van Epps S amp Marshall R (2012) Using TSP at the end of the development cycle Proceedings of TSP 2012 Pittsburgh Software Engineering Institute Carnegie Mellon University

Vu JD (1996) Software process improvement A business case In Proceedings of the European SEPG Conference Milton Keynes UK European Software Process Improvement Foundation

118

  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code
  • Function Point Estimation
  • Effort mdash Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit
  • TSP Benefits
  • Section 3
  • CMMI Maturity Transitions
  • Enhanced Maturity Framework
  • Maturity Level Transformation
  • Measurement by Level
  • Average Improvement Results
  • Schedule Performance
  • Cost Performance
  • Crosbyrsquos Cost of Quality Model
  • Raytheons Cost of Quality
  • Raytheonrsquos Productivity
  • Raytheonrsquos Cost Reduction
  • Raytheonrsquos Cost Predictability
  • OMRONrsquos Reuse gains
  • Section 4
  • What about the Product
  • Six Sigmarsquos Challenge
  • Testing is Not Enough
  • CASTrsquos Application Intelligence Platform
  • Modern Apps are a Technology Stack
  • Architecturally Complex Defects
  • Propagated Risk
  • Transaction Risk
  • Risk-Based Prioritization
  • Static Analysis in US DoD
  • Reducing Operational Incidents amp Costs
  • Reducing Operational Losses
  • Slide Number 70
  • The Technical Debt Metaphor
  • Tech Debt by Quality Factor
  • Join CISQ wwwit-cisqorg
  • Section
  • References 1
  • References 2
  • References 2
Page 82: Best Practices for Software Process and Product Measurement · Software Process and Product Measurement Dr. Bill Curtis ... measurement best practice • Best guidance for starting

References 2

Larman, C. (2004). Agile and Iterative Development. Boston: Addison-Wesley.

Lydon, T. (1995). Productivity drivers: Process and capital. In Proceedings of the 1995 SEPG Conference. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

McGarry, J., et al. (2002). Practical Software Measurement. Reading, MA: Addison-Wesley.

Paulk, M., Weber, C., Curtis, B., & Chrissis, M. (1995). The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Peterman, W. (2000). IEEE Software, July/August issue.

Poppendieck, M., & Poppendieck, T. (2007). Implementing Lean Software Development. Boston: Addison-Wesley.

Sakamoto, K., Kishida, K., & Nakakoji, K. (1996). Cultural adaptation of the CMM. In Fuggetta, A., & Wolf, A. (Eds.), Software Process. Chichester, UK: Wiley, 137-154.

Sinha, M. (2006). Performance models: Enabling organizations' prediction capability. In Proceedings of SEPG '06. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Schwaber, K., & Beedle, M. (2002). Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice Hall.

Spinellis, D. (2006). Code Quality: The Open Source Perspective. Boston: Addison-Wesley.

Van Epps, S., & Marshall, R. (2012). Using TSP at the end of the development cycle. In Proceedings of TSP 2012. Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Vu, J.D. (1996). Software process improvement: A business case. In Proceedings of the European SEPG Conference. Milton Keynes, UK: European Software Process Improvement Foundation.


  • Best Practices for Software Process and Product Measurement
  • Instructor Dr Bill Curtis
  • Agenda
  • Section 1
  • Why Does Measurement Languish?
  • Measurement Process Guidance
  • Measurement Categories
  • Productivity Analysis Objectives
  • Productivity Analysis Measures
  • Software Productivity
  • Software Size Measures
  • How Many Lines of Code?
  • Function Point Estimation
  • Effort — Weakest Measure
  • How Quality Affects Productivity
  • Carry-forward Rework
  • Example of Quality Impact
  • Quality-Adjusted Productivity
  • Best Practices for Analysis
  • 1 Evaluate Demographics
  • 2 Segment Baselines
  • 3 Beware Sampling Effects
  • 4 Understand Variation
  • 5 Investigate Distributions
  • 6 Account for Maturity Effects
  • 7 Caution for External Data Sets
  • Slide Number 27
  • Measuring Project Progress
  • Requirements Change Impact
  • Phase Defect Removal Model
  • SCRUM Board
  • Burndown Chart and Velocity
  • Analyzing Velocity
  • Measuring and Managing Flow
  • Trends Over Sprints
  • Sources and Measures
  • Recommended Books
  • Personal Software Process (PSP)
  • Team Software Process (TSP)
  • Aggregated Personal Data Best
  • Defect Detection at Intuit