Agile Metrics: Modeling, Simulation, and Data Mining


DESCRIPTION

Modeling, Simulation & Data Mining: Answering Tough Cost, Date & Staff Forecast Questions

TRANSCRIPT

Modeling, Simulation & Data Mining: Answering Tough Cost, Date & Staff Forecast Questions

Troy Magennis (Focused Objective)

Larry Maccherone (Rally)

Pain Point: My Boss “Needs” A Date…

Getting Quantitative Evidence

Assessing & Communicating Risk / Uncertainty

My Mission: arm my teams (and yours) with the tools and techniques to solve these problems

2 Minutes About Larry

• Larry is a Pisces who enjoys skiing, reading and wine (red, or white in an outdoor setting)

• We have a lot in common… over to Larry!

Metrics & Measurement

Why measure?

• Feedback
• Diagnostics
• Forecasting
• Lever

When to NOT take a shot

Good players?

• Monta Ellis

– 9th highest scorer (8th last season)

• Carmelo Anthony (Melo)

– 8th highest scorer (3rd last season)

The ODIM Framework

better Measurement → better Insight → better Decisions → better Outcomes

What is normal?

Cumulative probability at -3σ through +3σ: 0.1%, 2.3%, 15.9%, 50.0%, 84.1%, 97.7%, 99.9% (roughly 85% of values fall below +1σ and 98% below +2σ)
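These are just the standard normal CDF evaluated at whole-sigma boundaries; a quick check in Python (a minimal sketch using scipy, not part of the original deck):

```python
from scipy.stats import norm

# Cumulative probability of a normal distribution at -3 to +3 standard deviations.
for k in range(-3, 4):
    print(f"{k:+d} sigma: {norm.cdf(k):.1%}")
# -3: 0.1%, -2: 2.3%, -1: 15.9%, 0: 50.0%, +1: 84.1%, +2: 97.7%, +3: 99.9%
```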

Are you normal?

You will be wrong by…

• 3x-10x when assuming Normal distribution

• 2.5x-5x when assuming Poisson distribution

• 7x-20x if you use Shewhart’s method

Heavy tail phenomena are not incomprehensible… but they cannot be understood with traditional statistical tools. Using the wrong tools is incomprehensible.

~ Roger Cooke and Daan Nieboer

Bad application of control chart

“Control is an illusion, you infantile egomaniac. Nobody knows what's gonna happen next: not on a freeway, not in an airplane, not inside our own bodies and certainly not on a racetrack with 40 other infantile egomaniacs.”

~ Days of Thunder

Time in Process (TIP) Chart: a good alternative to the control chart
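As a sketch of the idea (my construction, not code from the talk): for each day, plot how long each in-progress item has been in process, so aging work stands out without control-chart assumptions.

```python
from datetime import date

# Hypothetical work items: (id, start date, finish date or None if still in process).
items = [
    ("A-1", date(2013, 6, 3), date(2013, 6, 10)),
    ("A-2", date(2013, 6, 4), None),
    ("A-3", date(2013, 6, 7), date(2013, 6, 21)),
]

def time_in_process(items, as_of):
    """Days each item has spent in process as of a given date (the TIP chart's y-axis)."""
    tips = {}
    for item_id, start, finish in items:
        end = finish if finish and finish <= as_of else as_of
        if start <= as_of:
            tips[item_id] = (end - start).days
    return tips

print(time_in_process(items, date(2013, 6, 14)))
# {'A-1': 7, 'A-2': 10, 'A-3': 7}
```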

Collection

• Perceived cost is high

• Little need for explicit collection activities

• Use a 1-question NPS survey for customer and employee satisfaction

• Plenty to learn in passive data from ALM and other tools

• How you use the tools will drive your use of metrics from them

Summary of how to make good metric choices

• Start with outcomes and use ODIM to make metrics choices.

• Make sure your metrics are balanced so you don’t over-emphasize one at the cost of others.

• Be careful in your analysis. The TIP chart is a good alternative to control chart. Troy’s approach is excellent for forecasting. We’ve shown that there are many out there that are not so good.

• Consider collection costs. Get maximal value out of passively gathered data.

Data visualization is like photography. Impact is a function of perspective, illumination, and focus.

~Larry Maccherone

Flaw of Averages, Risk & Monte Carlo Simulation

A model is a tool used to mimic a real-world process: a tool for low-cost experimentation.

What is Monte Carlo Simulation?

Performing a simulation of a model multiple times using random input conditions and recording the frequency of each result occurrence.

Scrum

[Diagram: Scrum flow from Backlog through This Iteration to Deployed, annotated 2, 5, 8]

Run   Sim Total Iterations
1     3
2     2
3     5
4     3
5     4
6     2
…     …
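A minimal Monte Carlo sketch of a Scrum model like this one, assuming (my reading of the diagram; the deck's exact distribution isn't recoverable from the transcript) a backlog of 8 stories and 2 to 5 stories completed per iteration:

```python
import random
from collections import Counter

def iterations_to_finish(backlog=8, lo=2, hi=5):
    """One simulation run: iterate until the backlog is empty, counting iterations."""
    iterations = 0
    while backlog > 0:
        backlog -= random.randint(lo, hi)  # stories completed this iteration
        iterations += 1
    return iterations

# Run the model many times and record the frequency of each result.
results = Counter(iterations_to_finish() for _ in range(1000))
print(sorted(results.items()))  # e.g. [(2, ...), (3, ...), (4, ...)]
```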

Kanban

[Diagram: Kanban flow from Backlog through Design (1-2 days), Develop (1-5 days) and Test (1-2 days) to Deployed, annotated 2]

Run   Total Time (days)
1     5
2     4
3     3
4     9
5     5
6     6
…     …
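The Kanban model can be simulated the same way; a sketch assuming each stage's duration is drawn uniformly from the range shown (queuing and WIP limits ignored for brevity):

```python
import random
from collections import Counter

# Stage duration ranges in days, taken from the diagram above.
STAGES = {"Design": (1, 2), "Develop": (1, 5), "Test": (1, 2)}

def total_time():
    """One run: total days for a card to flow Design -> Develop -> Test."""
    return sum(random.randint(lo, hi) for lo, hi in STAGES.values())

results = Counter(total_time() for _ in range(1000))
for days in sorted(results):
    print(f"{days} days: {results[days]} runs")
```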

[Chart: Result versus Frequency (50 runs); x-axis: result values (for example, days); y-axis: frequency of result, from less often to more often]

[Chart: Result versus Frequency (250 runs); same axes]

[Chart: Result versus Frequency (1000+ runs); same axes]

Key Point

There is NO single forecast result. There will always be many possible results, some more likely than others.

[Chart: likelihood versus Time to Complete Backlog; the average splits the distribution into 50% of possible outcomes on either side]

When pressed for a single number, we often give the average.

[Chart: likelihood versus Time to Complete Backlog; a 95th-percentile date puts 95% of outcomes before it and 5% after]

Monte Carlo simulation yields more information; quoting the 95% likelihood date is common.
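Once the result frequencies are recorded, any likelihood can be read off directly, which is why a percentile answer beats the average; a sketch using totals like the Kanban ones above:

```python
import random

random.seed(7)
# Simulated completion times (days) from a toy model like the Kanban sketch above.
totals = sorted(random.randint(1, 2) + random.randint(1, 5) + random.randint(1, 2)
                for _ in range(1000))

average = sum(totals) / len(totals)
p50 = totals[int(0.50 * len(totals))]  # half the runs finish by this value
p95 = totals[int(0.95 * len(totals))]  # 95% of runs finish by this value
print(f"average {average:.1f} days, 50th percentile {p50}, 95th percentile {p95}")
```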

Key Point

“Average” is NEVER an option. WARNING: regression lines are most often “average”.

But, I.T. gets worse

[Chart: likelihood versus Time to Delivery; risk events (1: planned backlog, 2: performance issues, 3: vendor delay) push outcomes beyond the promised date to a new average, with 50% of possible outcomes still later than that]

Key Point

Risks play a BIG role in forecasts. Yes, more than backlog.

Velocity is NOT linear; nor is defect rate, scope creep, story expertise requirements, team skill, etc.

[Chart: date for each likelihood, 0-100%]

Key Point: forecasts should be presented with the right amount of uncertainty

PAIN POINT Demo: Forecasting… My Boss “Needs” a Date…

In this demo

• Basic Scrum and Kanban Modeling

• How to build a simple model

– SimML Modeling Language

– Visual checking of models

– Forecasting Date and Cost

– The “Law of Large Numbers” (see the sketch after this list)
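On the “Law of Large Numbers” point: with too few runs the forecast percentile jumps around, and it steadies as runs increase. A toy sketch (not the SimML tooling):

```python
import random

def p95_estimate(runs):
    """95th-percentile completion time from `runs` Monte Carlo runs of a toy model."""
    totals = sorted(random.randint(1, 2) + random.randint(1, 5) + random.randint(1, 2)
                    for _ in range(runs))
    return totals[int(0.95 * runs)]

for runs in (50, 250, 1000, 10000):
    print(f"{runs:>5} runs -> 95th percentile: {p95_estimate(runs)} days")
# The estimate is noisy at 50 runs and settles as the run count grows.
```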

Demo: Finding What Matters Most (Cost of Defects & Staff Analysis)

[Sensitivity Report: inputs split into “Actively Manage” versus “Ignore for the moment”]

[Staff Skill Impact Report: explore what staff changes have the greatest impact]

Key Point

Modeling helps find what matters. Fewer estimates are required.

In this demo

• Finding what matters most

– Manual experiments

– Sensitivity Testing (see the sketch after this list)

• Finding the next best 3 staff skill hires

• Minimizing and simplifying estimation

– Grouping backlog

– Range Estimates

– Deleting unimportant model elements
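A sketch of the sensitivity-testing idea, as I'd illustrate it (not the tool's algorithm): stress one input at a time, hold the others fixed, and compare the swing in the forecast.

```python
import random

BASE = {"Design": (1, 2), "Develop": (1, 5), "Test": (1, 2)}  # days, as above

def p95(stages, runs=5000):
    """95th-percentile total time for a model with the given stage ranges."""
    totals = sorted(sum(random.randint(lo, hi) for lo, hi in stages.values())
                    for _ in range(runs))
    return totals[int(0.95 * runs)]

baseline = p95(BASE)
for name, (lo, hi) in BASE.items():
    stressed = dict(BASE)
    stressed[name] = (lo, hi * 2)  # widen one input's range at a time
    print(f"{name}: 95th percentile {baseline} -> {p95(stressed)} days")
# Inputs with the biggest swing are the ones worth estimating in detail;
# the rest can keep coarse range estimates (or be deleted from the model).
```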

Demo: Finding the Cost / Benefit of Outsourcing

Outsourcing Cost & Benefits

• Outsourcing is often controversial

– Often fails when pursued for cost savings alone

– Doesn’t always reduce local employment

– An important tool to remain competitive

– I.Q. has no geographic boundaries

• Many models

– Entire project

– Augmentation of local team

Build Date & Cost Matrix

Estimates: 1x, 1.5x, 2x
Staff: 1x (best case), 1.5x (midpoint), 2x (worst case)

Benefit = (Baseline Dev Cost - New Dev Cost) - Cost of Delay + Local Staff Cost Savings
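The formula itself is simple arithmetic; a sketch with hypothetical numbers (none of these figures come from the deck):

```python
def outsourcing_benefit(baseline_dev_cost, new_dev_cost,
                        cost_of_delay, local_staff_savings):
    """Benefit = (Baseline Dev Cost - New Dev Cost) - Cost of Delay + Local Staff Cost Savings."""
    return (baseline_dev_cost - new_dev_cost) - cost_of_delay + local_staff_savings

# Hypothetical midpoint scenario (1.5x estimates, 1.5x staff):
print(outsourcing_benefit(baseline_dev_cost=500_000, new_dev_cost=400_000,
                          cost_of_delay=75_000, local_staff_savings=25_000))
# 50000: a positive benefit, but note how a larger cost of delay flips the sign.
```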

[Chart: outsourcing benefit from $(150,000) to $150,000 against staff multiplier (1, 1.5, 2), one line per estimate multiplier (1x, 1.5x, 2x): NOT LINEAR & NOT YOUR PROJECT]

In this demo

• Model the impact of various outsourcing models

New Project Rules of Thumb…

• Cost of Delay plays a significant role

– High cost-of-delay projects are poor candidates

– Increased staffing offers some compensation

• Knowledge transfer and ramp-up time are critical

– Complex products are poor candidates

– Captive teams are better choices for these projects

• It is NEVER as simple as direct lower costs!

Assessing and Communicating Risk

Speaking Risk To Executives

• Buy them a copy of “Flaw of Averages”

• Show them you are tracking & managing risk

• Do

– “We are 95% certain of hitting date x”

– “With 1 week of analysis, that may drop to date y”

– “We identified risks x, y & z that we will track weekly”

• Don't

– Give them a date without likelihood

• “February 29th 2013”

– Give them a date without risk factors considered

• “To do the backlog of features, February 29th, 2013”

We spend all our time estimating here (1: the plan), but major risk events (2: performance issues, 3: external vendor delay) play the predominant role in deciding when delivery actually occurs.

Risk likelihood changes constantly

[Chart sequence: as risks 1, 2 and 3 fire or fade over time, the 95th-percentile confidence interval shifts with them]

Key Points

• There is no single release date forecast

• Never use Average as a quoted forecast

• Risk factors play a major role (not just backlog)

• Data has shape: beware of Non-Normal data

• Measurement → Insight → Decisions → Outcomes: Work Backwards!

• Communicate Risk early with executive peers

Call to action

• Read these books

• Download the software FocusedObjective.com

• Follow @AgileSimulation

• Follow @LMaccherone

Please Submit an Eval Form!

We want to learn too!

BEST PRACTICES

The Model Creation Cycle: Model (a little) → Visually Test → Monte-Carlo Test → Sensitivity Test

The Experiment Cycle: Baseline → Make Single Change → Compare Results → Make Informed Decision(s)

Best Practice 1

Start simple and add ONE input condition at a time. Visually / Monte-Carlo test each input to verify it works.

Best Practice 2

Find the likelihood of major events and estimate their delay, e.g. vendor dependencies, performance/memory issues, third-party component failures.
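One way to fold such an event into a model (a sketch; the probabilities and delay ranges below are placeholders, not recommendations): give each risk a likelihood of firing and a delay range, and sample both per run.

```python
import random

def risk_delay(probability, delay_range):
    """Sample one risk event: with the given probability it fires, adding a random delay."""
    if random.random() < probability:
        return random.randint(*delay_range)
    return 0

def run(base_days=30):
    # Placeholder risks in the spirit of Best Practice 2.
    return (base_days
            + risk_delay(0.30, (5, 20))    # vendor dependency slips
            + risk_delay(0.10, (10, 40)))  # performance/memory rework

totals = sorted(run() for _ in range(1000))
print("50th percentile:", totals[500], "days; 95th percentile:", totals[950], "days")
```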

Best Practice 3

Only obtain and add detailed estimates and opinion to a model if Sensitivity Analysis says that input is material.

Best Practice 4

Use a uniform random input distribution UNTIL sensitivity analysis says that input is influencing the output.

Best Practice 5

Educate your managers about risk. They will still want a “single” date for planning, but let them decide on the 75th or 95th confidence level (average is NEVER an option).

SIMULATION EXAMPLES

[Backup slides, each linking back to the main presentation: a forecast likelihood scale (unlikely → certain), a forecast distribution split into 50% / 50% possible outcomes, the Sensitivity Report (Actively Manage versus Ignore for the moment), and the Staff Skill Impact Report (explore what staff changes have the greatest impact)]

Focused Objective

• Risk Tools for Software Dev

• Scrum/Agile Simulation

• Kanban/Lean Simulation

• Forecasting Staff, Date & Cost

• Automated Sensitivity Analysis

• Data Reverse Engineering

• Consulting / Training

• Book

We Use & Recommend: EasyFit

• MathWave.com

• Invaluable for

– Analyzing data

– Fitting Distributions

– Generating Random Numbers

– Determining Percentiles
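EasyFit is a GUI tool; for readers who prefer code, the same workflow looks roughly like this in Python with scipy.stats (the lognormal choice and the sample data are illustrative assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
cycle_times = rng.lognormal(mean=2.0, sigma=0.5, size=200)  # stand-in for real data

# Fit a candidate distribution to the data.
shape, loc, scale = stats.lognorm.fit(cycle_times, floc=0)
fitted = stats.lognorm(shape, loc=loc, scale=scale)

# Determine percentiles and generate random numbers from the fit.
print("85th percentile:", fitted.ppf(0.85))
print("5 random draws:", fitted.rvs(size=5, random_state=rng))

# Goodness-of-fit check (Kolmogorov-Smirnov).
print(stats.kstest(cycle_times, fitted.cdf))
```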
