TRANSCRIPT
DoD Systemic Root Cause Analysis
Dave Castellano, Deputy Director, Assessments and Support
Laura M. Dwinnell, Systemic Analysis Team Leader
SYSTEMS & SOFTWARE ENGINEERING
Office of the Deputy Under Secretary of Defense for Acquisition and Technology
23 October 2007
NDIA SE 07
VERSION 1.01
Page 2 of 32, NDIA-SRCA v1.01
Systems and Software Engineering… What are we all about?
• Help shape portfolio solutions and promote early corporate planning
• Promote the application of sound systems and software engineering, developmental test and evaluation, and related technical disciplines across the Department's acquisition community and programs
• Raise awareness of the importance of effective systems and software engineering, and drive the state-of-the-practice into program planning and execution
• Establish policy, guidance, best practices, education, and training in collaboration with academia, industry, and government communities
• Provide technical insight to the leadership to support effective and efficient decision making
Based on USD(AT&L) 2004 Imperative…“Provide context within which I can make decisions about individual programs.”
Acquisition Program Excellence through sound systems and software engineering…
Providing Value Added Oversight & Support
[Diagram: value-added oversight and support flow]
- Inputs from the DoD Acquisition Community: Policy/Guidance, Education & Training, Best Practices, Other Processes (JCIDS, etc.), Oversight (DABS/ITAB), Execution (staffing)
- Assessments and Support (AS) results flow to PEOs & PMs (the "a" level) and to Acquisition Leadership (the "A" level) as recommendations
- Improved program execution through program-unique recommendations: PSR, AOTR, SEP, TEMP, DAES
- Improved acquisition decision making through greater program transparency and acquisition insight, supporting tactical, program and portfolio management as well as strategic management
- Systemic Issues & Risks and Systemic Strengths & Indicators feed acquisition leadership
- Achieved through open communication/debate, insight & information sharing, understanding of consequences, and data-driven, fact-based information synthesis
- Result: improved acquisition support to the warfighter
Systemic Analysis: Data Model Rev1
(Steps 1a, 1b, 2-4 underway…)
[Diagram: systemic analysis data model]
- Inputs: Program Review findings (1a), SEP findings (1b), DAES findings (1c), T&E findings (1d), other findings (1e, TBD)
- Findings yield program causes-effects & root causes, leading to program-unique solutions (value-added oversight)
- Steps 2-4: Systemic Issues (2) → Systemic Root Causes (3) → Corrective Actions (4)
- Outputs to the DoD Acquisition Community: Policy/Guidance, Education & Training, Best Practices, Other Processes (JCIDS, etc.), Oversight (DABS/ITAB), Execution (staffing)
- Supports tactical, program and portfolio management, and strategic management
Current Focus of Systemic Analysis
Program Support Review (PSR) Taxonomy of Classifications
[Diagram: classification taxonomy]
- Findings are classified as Positive, Neutral, or Negative
- Negative findings are further classified as Issues or Risks; Neutral findings may also carry potential Risk
- Each Issue or Risk is documented with Root Cause(s), Impact(s), and Recommendation(s)
- A finding may be a candidate for a Best Practice or for a Process Improvement Recommendation
~3700 Findings from Program Reviews
Top 10 Emerging Systemic Issues (from 52 Program Reviews since Mar 04)
Major contributors to poor program performance:
1. Management: IPT roles, responsibilities, authority; poor communication; inexperienced staff, lack of technical expertise
2. Requirements: creep/stability; tangible, measurable, testable
3. Systems Engineering: lack of a rigorous approach, technical expertise; process compliance
4. Staffing: inadequate Government program office staff
5. Reliability: ambitious growth curves, unrealistic requirements; inadequate “test time” for statistical calculations
6. Acquisition Strategy: competing budget priorities, schedule-driven; contracting issues, poor technical assumptions
7. Schedule: realism, compression
8. Test Planning: breadth, depth, resources
9. Software: architecture, design/development discipline; staffing/skill levels, organizational competency (process)
10. Maintainability/Logistics: sustainment costs not fully considered (short-sighted); supportability considerations traded
Observations Since Last Year
• Programs fail because we don’t…
  – Start them right
  – Manage them right
…We Don’t Start Them Right
• Requirements creep/stability – not tangible, measurable, testable, defined
• Acquisition strategies based on poor technical assumptions, competing budget priorities, and unrealistic expectations
• Budget not properly phased
• Lack of rigorous systems engineering approach
• Schedule realism – success oriented, concurrent, poor estimation and/or planning
• Inadequate test planning – breadth, depth, resources
• Optimistic/unrealistic reliability growth – not a priority during development
• Inadequate software architectures, design/development discipline, and organizational competencies
• Sustainment/life-cycle costs not fully considered (short-sighted)
…We Don’t Manage Them Right
• Insufficient trade space – resources, schedule, performance, requirements
• Inadequate IMP, IMS, EVMS
• Insufficient risk management
• Concurrent test program
• Inadequate government PMO staff
• Inexperienced and/or limited staffing
• Poorly defined IPT roles, responsibilities and authority
• Poor communications
Root Cause Effects Model
[Diagram: systemic root causes produce symptoms observed as systemic issues]
- Systemic Issues (symptoms): Management, Requirements, Systems Engineering, Staffing, Acquisition Strategy, Schedule, Test Planning, Software, Maintainability & Logistics
- Systemic Root Causes: Technical Process, Management Process, Acquisition Practices, Requirements Process, Competing Priorities, Staff, Communication, Program Realism, Contract Structure & Execution
- Example effects: increased program execution risk; potential schedule and cost breach; shared engineering functions not given proper attention; rework; insufficient system performance information to make an informed milestone decision; potential for lower readiness levels and higher maintainer workload; etc.
- Solution Set: Policy/Guidance, Education & Training, Best Practices, Governance
- Who's Affected: Component Acq Exec, Component Rep, PEO, PM
Recommendations Must Address Root Causes at Their Source
Systemic Analysis Milestones
Oct 06: Develop & pilot root cause terms
  • Categorized root cause textual descriptions
  • Terminology developed by small team, limited
  • Pilot effort proved that terms lacked proper structure and definition
Jan 07: Conduct SRCA Workshop (Part I)
  • Redefined Root Cause Type: 3 tiers
  • Terminology developed by workshop participants representing DoD and industry
  • RCT structure informally tested on 4 programs from different domain areas
Feb–Jul 07: Apply root cause structure to program findings
  • Pilot RCT on program reviews: past effort and go-forward
  • Definitions enhanced, terminology revised
  • Analysis of trends and applicability
Aug 07: Analyze preliminary results
  • Expand analysis to complete data set
  • Establish NDIA Working Group on SRCA
Sep 07: Conduct SRCA Workshop (Part II)
  • Validate pilot on root cause method/structure
  • Formulate systemic root cause recommendations
  • Feedback on SA model and root cause methodology
Coming up:
  Oct 07: Present results to SE community (NDIA-SE Conference)
  Nov 07: Present results to acquisition community (PEO SYSCOM)
  Dec 07: Formalize and standardize methodology
  Mar 08: Incorporate other data sources (SEP, Triage, etc.)
Root Cause Types: Recap of Part I Results
• Root Cause Types needed to categorize and discuss root causes
• Root Cause Type structure defined
  – Tier 1: Root Cause
    » Textual description; documented by PSR team
    » Perceived program root cause
  – Tier 2: Systemic Root Cause
    » From pre-defined list; assigned by PSR team
    » Can be "A" or "a": conditions outside the PMO but below the Defense/Service Acquisition Executive level, including lateral activities such as Service staff functions (OPNAV, Air Staff, etc.) and the system commands
  – Tier 3: Core Root Cause
    » From pre-defined list; assigned by PSR team
    » At the "A" level: something at the DAE level (3-star level and above); issues resolved through DAE coordination with Congress, DoD, Services, Industry, etc.
Root Cause Analysis is the Crux of Systemic Solutions
(Pilot underway)
Root Cause Type Structure
• Systemic Root Cause (Tier 2)
  1. Ineffective communication
  2. Competing priorities
  3. CONOPs change
  4. Definition of enterprise
  5. Engagement of supply base in SE process
  6. Expectations not defined
  7. Inadequate baseline management
  8. Inadequate contract structure and execution
  9. Inadequate cost metrics (e.g., EVMS)
  10. Lack of accountability
  11. Lack of capital investment
  12. Lack of enterprise-wide perspective
  13. Lack of appropriate staff
  14. Lack of trade space/constraints
  15. Lack of trust and willingness to share information
  16. Obfuscating bad news
  17. Ineffective organization
  18. Poorly defined roles/responsibilities
  19. Process - Management
  20. Process - Production
  21. Process - Requirements
  22. Process - Technical
  23. Program realism
  24. Responsibility w/o authority
  25. Poor acquisition practices
• Core Root Cause (Tier 3)
  1. Acq Reform: Loss of govt. capital investment
  2. Acq Reform: Loss of MS A requirement
  3. Acq Reform: Transferred authority
  4. Enabling infrastructure
  5. Budget POM process (PPBE)
  6. Culture
  7. Rotations / continuity
  8. Inadequate JCIDS process
  9. Pool of clearable skilled people
  10. External influences
  11. Poor business practices
Note: “Other” and “Unknown” are viable selections
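The three-tier Root Cause Type structure above can be sketched as a simple data model. This is an illustrative sketch only; the class and list names are assumptions, not the actual SADB schema, and the pre-defined lists are abbreviated from the slide.

```python
from dataclasses import dataclass

# Abbreviated Tier 2 / Tier 3 pick-lists from the slide; "Other" and
# "Unknown" are viable selections per the briefing.
SYSTEMIC_ROOT_CAUSES = {
    "Ineffective communication", "Competing priorities",
    "Process - Technical", "Process - Management",
    "Program realism", "Other", "Unknown",
}
CORE_ROOT_CAUSES = {
    "Poor business practices", "Culture", "External influences",
    "Inadequate JCIDS process", "Other", "Unknown",
}

@dataclass
class RootCauseType:
    """One finding's three-tier root cause categorization."""
    root_cause: str  # Tier 1: free text documented by the PSR team
    systemic: str    # Tier 2: must come from the pre-defined SRC list
    core: str        # Tier 3: must come from the pre-defined CRC list

    def __post_init__(self):
        # Tiers 2 and 3 are constrained to pre-defined lists,
        # mirroring the "assigned from pre-defined list" rule.
        if self.systemic not in SYSTEMIC_ROOT_CAUSES:
            raise ValueError(f"unknown systemic root cause: {self.systemic}")
        if self.core not in CORE_ROOT_CAUSES:
            raise ValueError(f"unknown core root cause: {self.core}")

finding = RootCauseType(
    root_cause="Schedule built around success-oriented test assumptions",
    systemic="Program realism",
    core="Culture",
)
print(finding.systemic)  # Program realism
```

The validation step reflects the design choice the workshop made: Tier 1 stays free text, while Tiers 2 and 3 are controlled vocabularies so findings can be aggregated across programs.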
SADB Features
• Relational
• Web-enabled
• Excel output
• Embedded charts
• Search wizards
• Data query
• Data quality reporting
Database contains ~3700 program findings from 52 reviews
Systemic Root Cause Analysis Preliminary Results
• Analysis performed on 44 program reviews
• SRCA applied to negative findings: ~48% of total set, ~1500 findings
• Trends shown by:
  – (1) Systemic Root Cause (SRC)
  – (2) DAPS areas related to leading SRC
  – (3) Core Root Cause (CRC)
  – (4) SRCs as related to:
    » CRC = Poor Business Practice
    » CRC = Culture
See Next 5 Slides for Results…
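The trend charts that follow count the number of programs that experienced each root cause, not the number of findings. A minimal sketch of that aggregation, using a toy data set (the real SADB holds ~1500 negative findings from 44 reviews; the program names and findings here are invented for illustration):

```python
from collections import defaultdict

# Hypothetical per-finding records: (program, systemic_root_cause).
findings = [
    ("Program A", "Technical (process)"),
    ("Program A", "Technical (process)"),  # repeated SRC on one program
    ("Program A", "Program realism"),
    ("Program B", "Technical (process)"),
    ("Program C", "Competing priorities"),
]

# A program "experienced" an SRC if at least one of its findings was
# assigned that SRC, so count distinct programs per SRC, not findings.
programs_by_src = defaultdict(set)
for program, src in findings:
    programs_by_src[src].add(program)

for src, programs in sorted(programs_by_src.items(),
                            key=lambda kv: -len(kv[1])):
    print(f"{src}: {len(programs)} program(s)")
```

Counting distinct programs rather than raw findings keeps one verbose review from dominating the trend, which is presumably why the charts are labeled "No. of Programs".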
Categorization by Systemic Root Cause
Number of Programs that Experienced the SRC (Part 1 of 2)
[Chart: number of programs experiencing each SRC]
  Technical (process): 35
  Management (process): 31
  Acquisition Practices: 26
  Requirements (process): 25
  Competing priorities: 23
  Staff (lack of appropriate): 23
  Organization (ineffective): 22
  Communication (ineffective): 21
  Program Realism: 21
  Contract Structure and Execution: 20
  Enterprise Wide Perspective: 15
  Expectations not defined: 15
  Capital Investment: 12
• The leading SRCs occurred on 50% or more of the programs
Note: Chart shows 13 of 26 SRCs
For SRC = Technical Process: Number of Programs that Experience Issues by DAPS
[Chart: programs per DAPS area]
  3.3 Prgm & Proj Mgt: 17
  4.5 Sys Int, Test & Ver: 17
  5.3 System Attributes: 17
  3.2 Project Planning: 12
  5.1 System Description: 11
  4.7 Process Impro…: 9
  4.3 Func Analysis &…: 8
  5.2 System Perfor…: 8
  3.1 Acq Strategy/P…: 6
  4.4 Design Synthesis: 6
  1.1 Op Reqmts: 4
  3.5 Communication: 2
  4.2 Reqmts Develo…: 2
  2.2 Personnel: 2
  2.4 Engineering Tool: 2
  3.4 Contr & Subcontr: 2
  4.1 Tech Assmt & T…: 2
  6.1 Stat & Reg Envm: 1
  7.0 Other: 1
  1.3 Capabilities: 1
  (values for 2.3 Facilities and 4.6 Transition to De… not recoverable from the extraction)
The leading areas occurred on 30 to 40% of programs.
Systemic Root Cause: Technical Process
Program Issues:
• Aggressive, success-oriented, highly concurrent test schedule
• Reliability not progressing as planned or has failed to achieve requirements
• Software reuse was significantly less than planned or expected
• Testing and verification approach are inadequate
• Program has inadequate systems engineering process
Categorization by Core Root Cause
Number of Programs that Experienced the CRC
[Chart: programs per CRC]
  Poor business practices: 31
  Culture: 30
  External Influences: 30
  Enabling Infrastructure: 26
  Inadequate JCIDS process: 21
  Acq reform: Loss of govt capital investment: 20
  Other: 17
  Budget POM process (PPBE): 16
  Pool of clearable skilled people: 13
  Acq reform: Transferred too much authority to…: 10
  Acq reform: Loss of MS A requirement: 6
  Rotations / continuity: 6
The leading CRCs occurred on 50% or more of the programs.
Relationship between CRC and SRC
For CRC = Poor Business Practices: Number of Programs that Experienced SRC
[Chart: programs per SRC, given CRC = Poor Business Practices]
  Management (process): 19
  Technical (process): 19
  Contract Structure and Execution: 11
  Requirements (process): 10
  Acquisition Practices (poor): 9
  Communication (ineffective): 8
  Baseline Management: 7
  Roles/Responsibilities (poorly defined): 7
  Cost Metrics (inadequate): 6
  Organization (ineffective): 6
  Competing priorities: 5
  Enterprise Wide Perspective (lack of): 5
  Expectations not defined: 5
  Program Realism: 5
  Production (process): 4
  Staff (lack of appropriate): 4
  Capital Investment (lack of): 3
  Accountability (lack of): 2
  Conops change: 1
  Definition of enterprise: 1
  Engagement of supply base in SE: 1
  Obfuscating bad news: 1
  Other: 1
  Responsibility w/o authority: 1
  Trust & willingness to share information: 1
SRCA Workshop (II) focused on these
Relationship between CRC and SRC
For CRC = Culture: Number of Programs that Experienced the SRC
[Chart: programs per SRC, given CRC = Culture]
  Communication: 12
  Acquisition Practices: 11
  Technical (process): 11
  Management (process): 9
  Organization: 9
  Competing priorities: 7
  Contract Structure and Execution: 7
  Expectations not defined: 7
  Staff (lack of appropriate): 7
  Enterprise Wide Perspective: 6
  Production (process): 5
  Roles/Responsibilities: 4
  Capital Investment (lack of): 3
  Cost Metrics: 3
  Engagement of supply base: 3
  Program Realism: 3
  Requirements: 3
  Trust & willingness to share: 3
  Tradespace/Constraints: 2
  Baseline Management: 1
  Conops change: 1
  Definition of enterprise: 1
  Obfuscating bad news: 1
  Accountability (lack of): 0
  Other: 0
  Responsibility w/o authority: 0
SRCA Workshop (II) focused on these
SRCA Workshop Participants (Part II), 25-26 Sep 07
• Approximately 33 participants representing government and industry
• Non-OSD participants included…
  – Government
    » Col Horejsi, US Air Force (PEO)
    » Mr. George Mooney, USAF CSE
    » Ms. Kathy Lundeen, DCMA
    » Mr. John Snoderly, DAU
  – Industry
    » Mr. Bob Rassa, NDIA/Raytheon
    » Mr. Brian Wells, Raytheon
    » Mr. Rick Neupert & Mr. Jamie Burgess, Boeing
    » Mr. Stephen Henry, Northrop Grumman
    » Mr. Per Kroll, IBM
    » Mr. Paul Robitaille, Lockheed Martin
    » Dr. Dinesh Verma, Stevens Institute of Technology
    » Mr. Dan Ingold, University of Southern California
SRCA Workshop (Part II) Objective
• Primary SRCA Workshop II objective:
  – Formulate systemic root cause recommendations
• Participants focused on a manageable subset of analysis results: 2 CRC areas and their top 4-5 SRCs
[Diagram: the two CRC areas and their top SRCs]
- Culture: Communications, Organization
- Shared between both: Acquisition Practices, Management, Technical Processes
- Poor Business Practices: Requirements, Contracting
Root Cause Model (e.g., Poor Business Practices)
[Diagram: findings flow through the root cause model to recommendations]
- Findings map to the top 4 Systemic Root Causes: Management Process, Technical Process, Contract Structure & Execution, Requirements
- These roll up to the Core Root Cause: Poor Business Practices
- Source analysis asks: Who? What? When? Where? Why?
- The Solution Set (Policy/Guidance, Education & Training, Best Practices, Governance) yields Recommendations
Recommendations Must Address Root Causes at Their Source
Initial Thoughts on Systemic Improvement…
• Policy/Guidance
• Education & Training
• Best Practices
• Governance
SRCA Workshop Part II - Results
• Over 50 recommendations
  – Varied level of detail
  – Directed at a variety of sources
    » Acquirer & Developer
    » PM, PEO, Comp. Rep., Acq. Exec
    » Senior Management to Systems Engineer
Industry panel will discuss top 5 next!
Next Steps
• Develop Action Plan
  – Prioritize the emerging recommendations
  – Assign stakeholders
  – Establish timelines
• Complete analysis on remaining CRC areas
• Formalize NDIA Working Group to continue recommendation development on CRC analysis
Questions/Discussion
Contact Information:
Dave Castellano, ODUSD(A&T) Systems & Software Engineering, Deputy Director, Assessments and Support, [email protected]
Laura Dwinnell, SSE/AS Support, Systemic Analysis Team Leader, [email protected]
Systemic Root Cause Analysis: Industry Panel Discussion
Panel Moderator: Mr. Bob Rassa
Dave Castellano, Deputy Director, Assessments and Support
Laura M. Dwinnell, Systemic Analysis Team Leader
SYSTEMS & SOFTWARE ENGINEERING
Office of the Deputy Under Secretary of Defense for Acquisition and Technology
23 October 2007
NDIA SE 07
VERSION .9
Industry Panel Members
• Mr. Stephen Henry – Northrop Grumman: Principal Engineer
• Mr. Brian Wells – Raytheon: Chief Systems Engineer
• Mr. Per Kroll – IBM: Manager, Methods, IBM Rational
• Mr. Paul Robitaille – Lockheed Martin: Director of Systems Engineering, Lockheed Martin Corporate Headquarters; President, INCOSE
• Mr. James Burgess – Boeing: Systems Engineering Senior Manager, Leader of the Boeing Systems Engineering Best Practices Initiative, Boeing Integrated Defense Systems
Results – SRCA Workshop Part II
5 “Heavy Hitter” recommendations include:
1. Increase or improve competition – down select at SRR/PDR/CDR
2. Provide mechanisms for better performance & implement consequences for non-performance
   » Increase use of toll gate reviews with off-ramps and specific guidance/requirements
3. Ensure better definition and verification of requirements, e.g. use meta-language, SE-based modeling, etc.
4. Require closer coupling of the IMP/IMS/WBS
5. Increase acquisition workforce and expertise
   » Use “green teams” to augment needed acquisition expertise
When is Extended Competition Cost Effective?

Program Complexity & SW Growth              ATP     SRR     PDR     CDR
Medium High Complexity, Holchin Level 7*    188%    144%    122%    111%
  Down Select Cost Savings (Medium High)            34%     31%     -3%
Medium Low Complexity, Holchin Level 3*     144%    122%    111%    106%
  Down Select Cost Savings (Medium Low)             12%     -2%     -42%
* SW Growth Based on Holchin Growth Curve Average Growth
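The table's logic can be sketched as a simple lookup: extended competition is cost effective when the projected down-select savings (relative to a down-select at ATP) are positive. The percentages are transcribed from the table above; the underlying Holchin growth model is not reproduced here, and the key names are illustrative.

```python
# Down-select cost savings (percent) from the slide's table, indexed by
# (program complexity, down-select milestone). Transcribed values only;
# the Holchin SW-growth model behind them is not modeled here.
SAVINGS = {
    ("medium_high", "SRR"): 34,
    ("medium_high", "PDR"): 31,
    ("medium_high", "CDR"): -3,
    ("medium_low", "SRR"): 12,
    ("medium_low", "PDR"): -2,
    ("medium_low", "CDR"): -42,
}

def extended_competition_cost_effective(complexity: str, milestone: str) -> bool:
    """Extended competition pays off when projected savings are positive."""
    return SAVINGS[(complexity, milestone)] > 0

for (complexity, milestone), pct in SAVINGS.items():
    verdict = "yes" if extended_competition_cost_effective(complexity, milestone) else "no"
    print(f"{complexity:>12} @ {milestone}: {pct:+d}% -> {verdict}")
```

Read this way, the table's point is that carrying two contractors further helps only for higher-complexity programs, and even then only through PDR; by CDR the cost of the second contractor exceeds the growth avoided.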