
Page 1

Some Topics In Measurement/Quantitative Analysis

John E. Gaffney, Jr., [email protected] 301-509-8552

November 3, 2010

Page 2

Topics

• Some Topics In Measurement/Quantitative Analysis:
  – Function Points
  – Risk and Cost Estimation, COSYSMOR
  – Defect Analysis, Software Reliability

• Some Observations

Page 3

Measurement/Quantitative Management

• Measurement is all about supporting good management: knowing where you want to go; having the resources in place to get there; determining whether you are there or are likely to get there; and taking action if you are diverging from your goals

• Quantitative Management is a "closed-loop" process (sketched in code below): 1. Establish goals, 2. Compare actuals with goals, 3. Take action if appropriate (e.g., a metric is out of its expected range; don't act if O.K.)

• The closed-loop (feedback) approach tends to compensate for “noise” in the system, analogous to feedback in control systems (electrical eng.)

• Quantitatively characterize process, product, performance, personnel, tools/methodology
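As a small illustration of the compare-and-act step in this closed-loop view, the sketch below checks metrics against goal ranges and flags action only when a value falls outside its range. The metric names and thresholds are hypothetical, not taken from the slides.

```python
# Hypothetical illustration of the closed-loop quantitative management step:
# establish a goal range for a metric, compare the actual value, and flag
# action only when the metric falls outside its expected range.

def check_metric(name, actual, low, high):
    """Return an action message if 'actual' is outside [low, high], else None."""
    if actual < low or actual > high:
        return f"{name} = {actual} outside expected range [{low}, {high}]: investigate and act"
    return None  # within goals: no action needed

goals = {"defects_per_ksloc": (2.0, 6.0), "productivity_sloc_per_hr": (8.0, 15.0)}
actuals = {"defects_per_ksloc": 7.3, "productivity_sloc_per_hr": 10.2}

for metric, (low, high) in goals.items():
    action = check_metric(metric, actuals[metric], low, high)
    if action:
        print(action)
```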

Page 4

Function Points: Some Observations

• Invented by Allan J. Albrecht, IBM, in the 1970s.
• Developed to capture "...user/customer requirements in a way that is more easily understood by the user/customer than SLOC."*
• Originally used to estimate development costs for a particular IBM organization for business applications; the measure absorbed both size and productivity.
• Later, in more broadly based usage, FPs were used as the metric for application size; development effort was then based on this "universal" size measure and the particular organization's productivity.
• The usage of FPs spread very quickly.
• For many, the use of FPs in lieu of SLOC became virtually a religious matter.
• There are many variants of function points, e.g., feature points (Capers Jones) and simplified function points (Gaffney).
  – Some related measures are story points and Halstead's "unique I/O count".
• Function points may not be particularly well suited to highly calculation-intensive, "engineering"-type software.
• SLOC count and function point count/value are typically highly correlated. (A rough sketch of how a function point count is formed from component counts follows the footnote below.)

* Albrecht and Gaffney, IEEE Transactions on Software Engineering, Nov. 1983
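The plots that follow relate work hours to function point counts and to their components (inputs, outputs, files, inquiries). As a rough illustration of how such a count is formed, the sketch below computes an unadjusted function point count as a weighted sum of component counts. The weights are the commonly quoted average-complexity values and are an assumption for illustration; they are not taken from this presentation, and real counting rules are richer.

```python
# Rough sketch of an unadjusted function point count as a weighted sum of
# component counts. Weights are the commonly quoted "average complexity"
# values (an assumption for illustration; actual counting rules are richer).

WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "inquiries": 4,
    "internal_files": 10,
    "external_interfaces": 7,
}

def unadjusted_function_points(counts):
    """counts: dict mapping component name -> number of that component."""
    return sum(WEIGHTS[k] * counts.get(k, 0) for k in WEIGHTS)

example = {"external_inputs": 30, "external_outputs": 42,
           "inquiries": 12, "internal_files": 8, "external_interfaces": 5}
print(unadjusted_function_points(example))  # 493 for this made-up system
```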

Page 5

Plot: Kwrkhrs. vs. KLOC (from Albrecht/Gaffney data)
Linear fit: y = 0.3793x - 2.9133, R² = 0.7459
Quadratic fit: y = -0.0008x² + 0.6231x - 12.074, R² = 0.7736
X-axis: KLOC; Y-axis: Kwrkhrs. (thousands of work hours)
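The fits shown in this and the next few plots are ordinary least-squares linear and quadratic regressions. A minimal sketch of how such fits and their R² values can be reproduced is below; the data arrays are placeholders, not the Albrecht/Gaffney project values.

```python
# Sketch: fit linear and quadratic least-squares models of effort (thousands
# of work hours) vs. a size measure (e.g., KLOC) and report R-squared.
# The data below are placeholders, not the Albrecht/Gaffney project values.
import numpy as np

kloc = np.array([30.0, 60.0, 90.0, 150.0, 220.0, 300.0])
kwrkhrs = np.array([8.0, 20.0, 31.0, 55.0, 78.0, 110.0])

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

for degree in (1, 2):                       # linear and quadratic fits
    coeffs = np.polyfit(kloc, kwrkhrs, degree)
    y_hat = np.polyval(coeffs, kloc)
    print(degree, coeffs, r_squared(kwrkhrs, y_hat))
```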

Page 6

Plot: Kwrkhrs. vs. function points (from Albrecht/Gaffney data)
Linear fit: y = 0.0578x - 13.189, R² = 0.8948
Quadratic fit: y = 4E-05x² - 0.0146x + 9.0905, R² = 0.9579
X-axis: function points (fp); Y-axis: Kwrkhrs. (thousands of work hours)

Page 7

Plot: Kwrkhrs. vs. I+P+Files+Inq (from Albrecht/Gaffney data)
Linear fit: y = 0.3142x - 14.711, R² = 0.9134
Quadratic fit: y = 0.0003x² + 0.1817x - 6.2328, R² = 0.9226
X-axis: I+P+F+INQ (inputs + outputs + files + inquiries); Y-axis: Kwrkhrs. (thousands of work hours)

Page 8

Plot: Kwrkhrs. vs. inputs + outputs (from Albrecht/Gaffney data)
Linear fit: y = 0.4326x - 13.29, R² = 0.8157
Quadratic fit: y = 0.0001x² + 0.3886x - 11.289, R² = 0.8165
X-axis: inputs + outputs (I+P); Y-axis: Kwrkhrs. (thousands of work hours)

Page 9

Defect Management Process

• Defect management is best executed as a "closed-loop" process
  – On many projects it is not, however, and defect content, etc., are treated as "free variables," with no goals established for their values
• Defect management should be viewed as just as much a part of management as are cost and schedule management
  – Defect detection, analysis (including root cause analysis), correction, and avoidance have cost and schedule implications
• We need to know where we are going (goals); measure progress against those goals and determine whether we are likely to achieve each goal or already have done so (measure and compare); and take corrective action as necessary

• A good measurement program is key to a successful Defect Management Process

• Active Defect Management provides potential early indication (headlight metrics) of success and/or of problems

• Software Reliability estimation: based on estimates of latent (delivered, relevant) defects and prospective rate of discovery
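One simple way to turn an estimated latent defect count and a prospective per-defect discovery (failure) rate into a reliability figure is an exponential model with a Poisson failure-arrival assumption. This is an illustrative assumption of mine, not necessarily the exact method behind these slides or the tools named later; all numbers are made up.

```python
# Hedged sketch: estimate reliability from latent defects and an assumed
# per-defect failure rate, treating failures as a Poisson process. Illustrative
# only; not necessarily the method used with STEER/SWEEP/DEFT.
import math

latent_defects = 45.0          # estimated defects remaining at delivery (made up)
rate_per_defect = 0.0001       # assumed failures per defect per operating hour
mission_hours = 100.0          # interval over which reliability is evaluated

failure_rate = latent_defects * rate_per_defect        # total failures per hour
reliability = math.exp(-failure_rate * mission_hours)  # Prob(no failure in interval)
print(f"R({mission_hours:.0f} h) = {reliability:.3f}")
```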

Page 10

Defect Tools Overview

• Purpose of the tools: fit defect discovery data to a curve in order to predict discovery later in the development/testing process from data obtained earlier; track progress against goals, with early course correction as required
  – Weibull curves are used, e.g., Rayleigh, exponential (see the fitting sketch after this list)
• The tools are a key to implementing the software defect management process
• Two types of tools: activity-based and time-based
• Provide "headlight" metrics; e.g., they can indicate that defect discovery and content objectives are not likely to be realized, and can provide better mid-course cost/schedule predictions
• Activity-based tools fit data obtained from development activity verification, e.g., code inspection, and from testing
  – A key value add: early prediction of testing results and latent defect content before testing has started and during testing; this can help minimize rework costs
• Tools (Gaffney) evolution history: STEER I (IBM, 1985; key point: made an activity-based tool; before then, time-based fits only, to JEG's knowledge); SWEEP (Software Productivity Consortium, 1988 et seq.); STEER II (Lockheed Martin, 1997 et seq.); DEFT (post-Lockheed, 2010)
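The sketch below illustrates the general curve-fitting idea: fit a two-parameter Weibull cumulative curve to cumulative defect discovery counts and project the total (and hence latent) defect content. The tools named above are proprietary; the data and parameter choices here are illustrative assumptions only.

```python
# Sketch (illustrative data, not from the slides): fit a two-parameter Weibull
# cumulative curve to cumulative defect discovery counts and project the total
# defect content; latent defects are the projected total minus those found.
import numpy as np
from scipy.optimize import curve_fit

def weibull_cum(t, total, scale, shape=2.0):
    """Cumulative defects by 'time' t; shape=2 gives the Rayleigh special case."""
    return total * (1.0 - np.exp(-(t / scale) ** shape))

# cumulative defects observed at the end of successive activities or months
t_obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
cum_defects = np.array([40.0, 130.0, 240.0, 330.0, 390.0])

(total, scale), _ = curve_fit(weibull_cum, t_obs, cum_defects, p0=[500.0, 4.0])
discovered = cum_defects[-1]
print(f"projected total defects ~ {total:.0f}, latent ~ {total - discovered:.0f}")
```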

Page 11

Plot (27-Oct-10): Defect Data and Weibull 2 Fit For Example Activity-Based Defect Discovery
X-axis: Activity (Reqm'ts, Design, Code/UT, SWIT, SIT, SAT, UAT, Latent); Y-axis: Defects Per KSLOC
Series: Data; Fit To Data/Projection

Page 12

Plot: Incremental Relative Error (%) vs. Weibull Model Number
X-axis: Weibull Model Number (1-4); Y-axis: Avg. Rel. % Fit Error (0%-80%)

Page 13

Plot (27-Oct-10): Defect Data and Weibull 2 Fit For Example; PTRs Opened In Month, Data and Fit
X-axis: Month (Feb-08 through Oct-08, plus Latent); Y-axis: Cumulative Defect Counts
Series: Cumulative Fit To Data; Cumulative Data

Page 14

Some Aspects of Uncertainty Management and Risk Modeling

• Better management of uncertainty starts with recognizing its existence and with estimating cost/schedule/quality and other "risks"
• Use the risk information to make better management decisions during both business acquisition and program execution
  – Serves as a basis for discussion of alternatives within the project as well as with the customer
  – A key aspect of "affordability"
• Recognize that the actual project outcomes, e.g., size, cost, duration/schedule, quality, are uncertain, as are many of the key determining parameters, e.g., productivity, size
• Thus, the risk of not achieving desired objectives should be quantified to the degree possible
• "Risk" where smaller values are desirable, e.g., development effort, as used in COSYSMOR:
  Risk = Prob[Actual Effort Will Be > Value]; Confidence = 100% - Risk
• "Risk" where larger values are desirable, e.g., MTBF and reliability:
  Risk = Prob[Actual Reliability Will Be < Desired Value]; Confidence = 100% - Risk
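A minimal numerical sketch of these risk/confidence definitions: given a set of possible effort outcomes with probabilities (as a COSYSMOR-style model might produce), the risk at a target value is the probability that actual effort exceeds it, and confidence is the complement. The outcome values and probabilities below are illustrative assumptions.

```python
# Minimal sketch of Risk = Prob[actual effort > target] and
# Confidence = 100% - Risk for a discrete outcome distribution.
# All values are illustrative, not COSYSMOR output.

effort_values = [30000, 40000, 46904, 55000, 70000, 90000]   # person hours
probabilities = [0.10, 0.20, 0.30, 0.20, 0.15, 0.05]          # sums to 1.0

def risk_of_exceeding(target, values, probs):
    """Prob[actual effort > target] for a discrete outcome distribution."""
    return sum(p for v, p in zip(values, probs) if v > target)

target = 46904
risk = risk_of_exceeding(target, effort_values, probabilities)
print(f"risk = {risk:.0%}, confidence = {1 - risk:.0%}")
```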

Page 15

Plot (COSYSMOR): Systems Engineering Person Hours Risk (larger values of effort are less desirable)
Fitted curve: y = -1E-15x³ + 3E-10x² - 3E-05x + 1.1888, R² = 0.9635
X-axis: Person Hours (with Target marked); Y-axis: Person Hours Risk (= Prob. that actual PH will exceed the X-axis value)

Page 16

Plot (COSYSMOR): Person Hours Overrun Risk for Target Person Hours = 46904; tail of the previous plot (larger values of effort overrun are less desirable)
Fitted curve: y = -1E-15x³ + 2E-10x² - 9E-06x + 0.2486, R² = 0.9635
X-axis: Person Hours Overrun; Y-axis: Person Hours Overrun Risk (= Prob. that the target PH value will be exceeded)

Page 17

Plot: Example Reliability Risk
X-axis: Reliability (0.00-1.00); Y-axis: Reliability Risk (= Prob. that actual reliability will be < the X-axis value)
Reliability: probability of no failures during some time interval, starting at some point in time after the system/software/item goes into service and the clock starts (smaller values are less desirable)

Page 18

Sample Project: Original COSYSMOR Parameter Inputs
Low, Likely, and High values for each parameter define a probability distribution using the Keefer and Bodily (1983) non-parametric method.

SIZE PARAMETERS FOR SYSTEM OF INTEREST (Easy / Nominal / Difficult counts)

                                       Low            Likely*        High
# of System Requirements               9 / 10 / 1     10 / 11 / 2    11 / 12 / 1
# of System Interfaces                 2 / 10 / 3     2 / 11 / 4     2 / 13 / 5
# of Algorithms                        3 / 9 / 2      4 / 10 / 3     5 / 11 / 7
# of Operational Scenarios             4 / 5 / 4      2 / 5 / 5      4 / 6 / 6
Equivalent Total Requirements Size     351.90         402.90         520.40
Equivalent New Requirements Size       299.58         341.39         437.77

COST PARAMETERS FOR SYSTEM OF INTEREST (rating and effort multiplier)

                                            Low        Likely*    High
Requirements Understanding                  VH 0.60    N 1.00     L 1.36
Architecture Understanding                  H 0.81     N 1.00     L 1.27
Level of Service Requirements               L 0.79     N 1.00     N 1.00
Migration Complexity                        N 1.00     N 1.00     EH 1.92
Technology Risk                             L 0.84     N 1.00     H 1.32
Documentation                               L 0.91     N 1.00     VH 1.28
# and diversity of installations/platforms  N 1.00     N 1.00     EH 1.86
# of recursive levels in the design         VL 0.80    N 1.00     H 1.21
Stakeholder team cohesion                   H 0.81     N 1.00     VL 1.50
Personnel/team capability                   H 0.81     N 1.00     VL 1.48
Personnel experience/continuity             VH 0.67    N 1.00     L 1.21
Process capability                          VH 0.77    N 1.00     L 1.21
Multisite coordination                      VH 0.80    N 1.00     L 1.15
Tool support                                H 0.85     N 1.00     L 1.16
Composite effort multipliers                0.05       1.00       54.58

* Note: If you do not wish to use the range or uncertainty (i.e., Low, Likely, High) values for the size and cost parameters, but wish to enter a single value (e.g., the nominal estimate), then enter such values in the "Likely" size parameter and cost parameter columns. Then either simply ignore the range estimates produced or make the Low and High values for each parameter equal to the Likely value.

** You can specify the proportions of each of the four requirements categories (e.g., system interfaces) that are new, modified, reused, or deleted, as well as their cost per requirement relative to that for a new requirement. You can determine the proportions using the "Reuse" sheet, or you can enter your estimate in the "Requirements Types" table. Using that table, you provide the percent of each requirements category, e.g., "System Requirements," that is "Modified," "Reused," and "Deleted"; the percent that is "New" results from these three entries automatically. Also enter the "Relative Cost" for the "Modified," "Reused," and "Deleted" portions of each requirements category; this is the unit cost relative to that of the "New" requirements type. For example, a figure of 10% means that that type, e.g., "Reused," costs 10% per requirement of what a new requirement of that category costs. The count of "Deleted" is expected to be zero for initial estimates; however, estimates revised during the course of a project might well have some requirements "deleted," which would have to be reflected in a revised EAC (Estimate-At-Completion). If the % of "Modified," "Reused," or "Deleted" differs among the "Easy," "Nominal," and "Difficult" counts in a category, e.g., "# of System Requirements," then enter the average of the three percentages.

N.B.: COSYSMOR requires that the Low, Likely, and High values for any parameter satisfy Low Value ≤ Likely Value ≤ High Value. This means that the "VH," "H," or "N" ratings for cost drivers 1, 2, and 9-14 (those indicated in pink) are entered in the "Low" column using the drop-down menus, while the "H," "VH," and "EH" ratings for the other cost drivers are entered in the "High" column.

COSYSMO MODEL PARAMETERS                   Low       Likely    High      Nominal*
Equivalent Size, S (= Equivalent New)      318       338       396       341
Unit Effort Constant, A: Baseline          38.550    38.550    38.550    38.55
Unit Effort Constant, A: User**            49.136    49.136    49.136    38.55
Unit Effort Constant, A: Selected          49.136    49.136    49.136    38.55
Size Exponent, E                           1.000     1.060     1.100     1.06
Cost Parameter Product, D                  0.364     1.202     3.752     1.00
SYSTEMS ENGINEERING PERSON MONTHS          37.4      186.6     874.7     122.9
SYSTEMS ENGINEERING PERSON HOURS           5679      28361     132952    18675

COSYSMO MODEL FORM: PM, PH = A·(S^E)·D, where PM = Person Months and PH = Person Hours
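A minimal sketch of evaluating the effort form shown above with the Likely-column values from this example. The 152 hours-per-person-month conversion is inferred from the table's PM and PH rows and is an assumption here, not stated on the slide.

```python
# Sketch of the COSYSMO/COSYSMOR effort form shown above: PH = A * S**E * D,
# evaluated with the "Likely" values from the example table. The 152 h/PM
# conversion is inferred from the table's numbers (an assumption).

A = 49.136     # unit effort constant (user-calibrated value from the table)
S = 338.0      # equivalent size (equivalent new requirements)
E = 1.060      # size exponent
D = 1.202      # product of cost-driver effort multipliers

person_hours = A * (S ** E) * D
person_months = person_hours / 152.0
print(f"PH ~ {person_hours:,.0f}, PM ~ {person_months:.1f}")
```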

Page 19

Some Observations - 1

• Expect your work life to offer you alternatives (and thus real possibilities for personal and professional growth) that you cannot plan for
• Your education is never complete; look for opportunities to learn (formal and informal)
• Don't be afraid to challenge "accepted wisdom" if you believe that you are correct
  – Perhaps no one else has done so, or others are afraid to do so
  – Ask questions of "ancient worthies"; just because the boss or someone else has more experience or seemingly greater credentials doesn't necessarily make him or her correct (but please do be polite!)
  – Ask questions of yourself, of your assumptions and conclusions

Page 20

Some Observations-2

• Determine the use for the metrics to be collected/developed: what information are they going to provide, what questions are they going to answer or help to answer, who wants the information, and what decisions will they make using that information?

• Some keys to getting good metrics: use a development process that has well-defined activities, artifacts produced by each, and well-defined entry/exit criteria; good definitions for the metrics; good collection and analysis processes; training of all personnel involved; strong commitment by both management and team members; and recognition by project team members that the metrics are used to help manage the project and to make a product that has predictable attributes using a process that also has predictable attributes

• Observation 1: There is often a lack of good historical and current data with which to characterize processes, products, and performance, and thence to support bases of estimates

• Observation 2: Mathematics/statistics skills are often found in project personnel at a lesser-than-desirable level; this can lead to not knowing what to expect, over-confidence, or no confidence

Page 21

Backup

Page 22

Plot (27-Oct-10): Defect Data and Weibull 2 Fit For Example Activity-Based Defect Discovery (cumulative)
X-axis: Activity (Reqm'ts, Design, Code/UT, SWIT, SIT, SAT, UAT, Latent); Y-axis: Cumulative Defects Per KSLOC
Series: Cumulative Fit To Data; Cumulative Data

Page 23

Plot and data table (27-Oct-10): Defect Data and Weibull 2 Fit For Example; PTRs Opened In Month, Data and Fit
X-axis: Month; Y-axis: Defect Counts

Month                   Feb-08  Mar-08  Apr-08  May-08  Jun-08  Jul-08  Aug-08  Sep-08  Oct-08  Latent
Data                    22      76      92      310     218     102     118
Fit To Data             30      85      127     151     156     145     123     97      71      118
Lower Range Fit Value   12      33      49      58      60      56      47      37      27      45
Upper Range Fit Value   47      134     201     238     246     228     194     152     111     186

Page 24

Additional Functions Provided By COSYSMOR

COSYSMOR provides four major additional functions beyond those provided by Academic COSYSMO:

1. Estimation of Cost/Effort and Schedule Uncertainties/Risk and Confidence: Provides quantification of the impacts of uncertainties in the values of key model parameters. Provides multiple cost and schedule values with associated probabilities.
   Risk = Prob[Actual Effort Will Be > Estimated Effort]; Confidence = 100% - Risk

2. Representation of Multiple Types of Size Drivers: Provides for entering counts of new, modified, reused, and deleted types for each of the four size driver categories.

3. Labor Scheduling: Provides the spread of systems engineering labor for the five systems engineering activities and across the four development phases (time).

4. Labor Allocation: Provides for the user to select the percentage allocations of the twenty activity/phase pairs or effort elements.

Page 25

Affordability

• "Affordability" is a measure of a system's effectiveness
• "Affordability" means that a given set of needs (performance requirements) can be met within stated cost and schedule constraints
• "Affordability" can also be defined as the probability (confidence) of achieving a stated set of needs at a stated cost and schedule (effort)
• The associated "risk" is determined (estimated) on the basis of the capability of the organization to meet this set of needs
  – "Risk" equals 100% minus "Confidence"

From presentation by Cole, Gaffney and Schimmoller at the 2009 PSM Conference

Page 26

Keefer and Bodily Three-Point Approximation To A Continuous Probability Distribution*

0.05 Fractile ↔ 0.185 Probability
0.50 Fractile ↔ 0.630 Probability
0.95 Fractile ↔ 0.185 Probability

* D. L. Keefer and S. E. Bodily, "Three-Point Approximations for Continuous Random Variables," Management Science, 29(5), 1983, 595-609
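A minimal sketch of applying the three-point weights above: treating the Low/Likely/High inputs as the 0.05/0.50/0.95 fractiles (as the COSYSMOR slide suggests), they are weighted 0.185, 0.630, and 0.185 to approximate the moments of the underlying distribution. The person-hour values below are illustrative assumptions, not COSYSMOR output.

```python
# Sketch of the Keefer-Bodily (extended Pearson-Tukey) three-point
# approximation shown above: weight the 0.05, 0.50, and 0.95 fractile
# estimates by 0.185, 0.630, and 0.185 to approximate the moments of the
# continuous distribution. The person-hour values are illustrative only.

fractile_values = {0.05: 37_400.0, 0.50: 46_904.0, 0.95: 87_500.0}  # effort in PH
weights = {0.05: 0.185, 0.50: 0.630, 0.95: 0.185}

mean = sum(weights[f] * v for f, v in fractile_values.items())
variance = sum(weights[f] * (v - mean) ** 2 for f, v in fractile_values.items())
print(f"approx mean = {mean:,.0f} PH, std dev = {variance ** 0.5:,.0f} PH")
```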