
Page 1

U.S. ARMY TEST AND EVALUATION COMMAND

In-Stride Operational Assessments

Prepared for the 33rd Annual National Test and Evaluation Conference

Prepared by Christopher M. Wilcox

17 May 2018

Page 2

Observations

• Operational Testing is Lengthy

• Systems Not Ready for Initial Operational Test and Evaluation

• Call to “Integrate” and “Streamline” Testing

"I plan to update existing DOT&E guidance to incorporate an integrated testing philosophy. In my independent assessments, I intend to use all credible information to provide the warfighter and the Congress a complete understanding of how the systems the Department acquires will improve the readiness and lethality of our military forces." (FY2017 Annual Report)

- Robert F. Behler

Director, Operational Test and Evaluation

Page 3

• Continuous Evaluation of Critical Capabilities

• Phased Planning and Execution by PM, FORSCOM, ATEC

• Phased Reporting at Key Programmatic Decision Points

• Benefits:

―More efficient use of all available data.

―Enables planning for and visibility of risk reduction during development.

―Increases probability of successful IOT.

―Reduces the impact on the operational force.

In-Stride Operational Assessments

Create understanding of the maturity of a system continuously in order to increase the system's chance of passing and reduce the scope (schedule) of IOT.

Page 4

Overarching Purpose and Steps

Ends: create understanding of the maturity of a system continuously throughout its developmental life cycle in order to increase the system's chance of passing and to reduce the scope (schedule) of IOT …

Ways: by planning, conducting and reporting the progress of system development against one or more critical capabilities and creating a phased DT/OT approach …

Means: through use of an aggregation evaluation methodology and all available data (contractor testing, developmental testing, operational testing, M&S, etc.), and by integrating Soldiers into early testing.

Steps:
1. Determine Critical Capabilities
2. Determine Key Evaluation Metrics
3. Develop Aggregation Methodology
4. Develop Evaluation Schedule
5. Execution and Reporting

Page 5

1. Determine Critical Capabilities

You know when you are done…when you have a list of capabilities that are critical to Soldiers and are the purpose for the acquisition of this system.

Sources:

• R-/P-Forms

• Critical Operational Issues

• Capability Gaps

Thoughts: best when…

• described as operational capabilities, i.e., Soldier task outcomes,

• can be linked to performance (operational or technical) that will be tested during CT, DT, etc., and

• limited in number.

Advanced Integrated Air and Missile Defense

1. Dynamic Force Tailoring

2. Joint Engagements

3. Extended Battlespace

4. Flexible Interceptor Selection

5. Reliability

6. Enhanced EA Defense


Page 6

2. Determine Key Evaluation Metrics

You know when you are done…when you have a list of key metrics that will be analyzed using data from test, M&S, etc., and are linked to the critical capabilities.

Sources:

• Requirements Documents

• System Evaluation Plan

• Developmental Evaluation Framework

Thoughts: best when…

• limited in number for each critical capability, and

• data will be available throughout development.

AIAMD Example: Extended Battlespace Critical Capability

1. MOE: # of engagements using fused sensor data
2. MOP: % of remote tracks presented
3. MOP: Slant range of engagements

Page 7

3. Develop Aggregation Methodology

You know when you are done…when you have normalized the key metrics and developed a method to aggregate them into progress towards meeting the Critical Capability.

Sources:

• None (you have to figure it out)

Thoughts: best when…

• shows progress towards desired capability at IOT.

AIAMD Example: Extended Battlespace Critical Capability

Capability (%) = ( Σ_{i=1}^{n} MOE/MOP_i / n ) × 100

where each MOE/MOP is normalized, e.g., a slant-range requirement of 5 to 50 km is normalized so that 5 km = 0% and 50 km = 100%.
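The aggregation above can be sketched in a few lines of code. This is a minimal illustration, not ATEC's tooling; the function names and the 27.5 km sample value are invented for the example:

```python
def normalize(value, lo, hi):
    """Linearly map a metric onto its requirement range: lo -> 0%, hi -> 100%."""
    pct = (value - lo) / (hi - lo) * 100.0
    return max(0.0, min(100.0, pct))  # clamp values outside the requirement range

def capability_percent(normalized_metrics):
    """Average the n normalized MOE/MOP values into one capability percentage."""
    return sum(normalized_metrics) / len(normalized_metrics)

# Slant-range requirement of 5 to 50 km: 5 km -> 0%, 50 km -> 100%
slant = normalize(27.5, 5.0, 50.0)  # midpoint of the range -> 50.0
overall = capability_percent([slant, 80.0, 60.0])
```

Each metric is reduced to a unitless 0-100 scale first, so a range in kilometers and a percentage of tracks can be averaged together.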

Page 8

AIAMD Example

4. Develop Evaluation Schedule

You know when you are done…when you have an outline graph for each capability that shows expected performance growth and the data sources scheduled.

Sources:

• Systems Engineering Plan

• PM, Contractor

Thoughts: best when…

• critical capabilities show continuous improvement throughout the schedule.

[Timeline graphic: events from TODAY through GSIL, LUT 1, SCOE, WTI/SCOE, GSIL, LUT 2, and IOT]

LUT: Limited User Test

GSIL: Government System Integration Lab

WTI: Weapons Tactics Instruction

SCOE: Soldier Checkout Event

IOT: Initial Operational Test
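An expected-growth schedule like this can be made checkable after each event. A small sketch, assuming a simple linear baseline from program start to IOT; the function names, the month-18 event, and the 5-point tolerance are illustrative assumptions, not from the briefing:

```python
def expected_percent(month, start_month=0, iot_month=36, start_pct=0.0, goal_pct=100.0):
    """Linear expected-capability baseline from program start to IOT."""
    frac = (month - start_month) / (iot_month - start_month)
    return start_pct + frac * (goal_pct - start_pct)

def on_track(observed_pct, month, tolerance=5.0):
    """Flag a test event whose observed capability lags the baseline by more than tolerance."""
    return observed_pct >= expected_percent(month) - tolerance

# e.g., an event at month 18 against a 36-month schedule expects ~50% capability
print(on_track(47.0, 18))  # a 3-point lag is within the 5-point tolerance
```

Real programs would replace the linear baseline with the negotiated growth curve, but the check after each GSIL, SCOE, or LUT event works the same way.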

Page 9

Example: Shape of the solution…

And then a miracle happens…

[Chart 1: Percent Capability Observed (0 to 100) vs. Months (0 to 36): Critical Operational Capability, baseline and observed performance, spanning Contractor DT, Government DT, and OT]

[Chart 2: Percent Capability Observed (0 to 100) vs. Months (0 to 36): Critical Operational Capability, baseline and observed performance, spanning Government DT and OT]

That was easy…


Page 10

Example: Risk Mitigation vs. Capability

[Chart: Risk (Low/Med/High) vs. event timeline (not to scale): Program Start, RFP, CA, PDR, CTV, CDR, DRR, GTV, MS C; Comp Qual / EDT and Sys Qual / PPT; Phase 1 (12-14 mos) through End of Phase 1, then Phase 2 (36 mos); "Now" marker]

Second Source Seeker Option

• Proposal Deliveries
• Down Select to One Contractor (two Domes)
• Producibility Effort Initiated
• ManTech Results
• Final Design Selected (Most Producible Dome)
• Lab Testing
• Environmental Testing
• Tower Testing, CFT of RRS
• HWIL
• Initial E3 Testing
• Production Prove Out
• Tower Test, CFT, DBF
• EDT Flight Tests
• CFT
• Flight Tests

Prime Contractor Selects Second Source Seeker 90 Days After CA (with Dome and IFS Model)


RFP: Request for Proposal

CA: Contract Award

CTV: Captive Test Vehicle

CDR: Critical Design Review

PDR: Preliminary Design Review

DRR: Design Readiness Review

PPT: Production Prove out Testing

EDT: Engineering Design Test

Page 11

5. Execution and Reporting

You know when you are done…when you have a graph that shows progress towards meeting the critical capabilities and can be summarized in effectiveness, suitability and survivability.

Sources:

• All previous information

• DT data and analysis

[Chart: Percent Capability On-Track (0 to 100) vs. Months (0 to 36) for each Key Capability, baseline and observed performance, grouped by Effectiveness (R1/COI 2, R2/COI 3, R3/COI 1), Suitability (Reliability, COI 4), and Survivability (Cybersecurity, COI 5)]

Thoughts: best when…

• progress is related to the overall desired effectiveness, suitability and survivability.

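The rollup from per-capability scores to the three evaluation areas can be sketched as follows. The area grouping mirrors the slide's chart; the numeric scores are invented for illustration:

```python
# Capability-to-area grouping taken from the slide; scores are made-up examples.
AREAS = {
    "Effectiveness": ["R1/COI 2", "R2/COI 3", "R3/COI 1"],
    "Suitability": ["Reliability", "COI 4"],
    "Survivability": ["Cybersecurity", "COI 5"],
}

def summarize(scores):
    """Average per-capability percentages into one score per evaluation area."""
    return {area: sum(scores[c] for c in caps) / len(caps)
            for area, caps in AREAS.items()}

scores = {"R1/COI 2": 70.0, "R2/COI 3": 55.0, "R3/COI 1": 64.0,
          "Reliability": 80.0, "COI 4": 60.0,
          "Cybersecurity": 45.0, "COI 5": 50.0}
print(summarize(scores))
```

A simple average is one possible rollup; a program could just as well weight capabilities or report the minimum, and the slide does not prescribe which.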

Page 12

Example: AIAMD

• ATEC will assess progress against key AIAMD capabilities and system performance after each test event and summarize results yearly.

Results are % progress towards R-Form promises and Critical Operational Issues, through examining 17 related AIAMD key capabilities and attributes from the CDD & gaps (related KPP, KSA, and Additional Attribute (AA) measures), using quantitative assessment of Measures of Effectiveness and Measures of Performance.

COI mapping:
• R1/COI 2: Common Mission Command
• R2/COI 3: Distributed Operations
• R3/COI 1: Plug and Fight
• COI 4: Suitability
• COI 5: Survivability

Key capabilities and attributes:
• Dynamic Force Tailoring
• Joint Engagements
• Defense Design & Planning
• Extended Battlespace
• 360 Engagement
• Improved Aerial Target ID
• Flexible Interceptor Selection
• Span of Control
• Track Accuracy & Loading
• Reliability (Software Stability)
• Built In Test
• Enhanced Realistic Training / Improved Soldier, Leader, Team
• Safety (currently high risk rating)
• Tech Manual Readiness
• Enhanced Defense against EA
• Cybersecurity

Evaluation areas: Effectiveness (R1/COI 2, R2/COI 3, R3/COI 1), Suitability (Reliability, COI 4), Survivability (Cybersecurity, COI 5).

Page 13

• Continuous Evaluation of Critical Capabilities

• Phased Planning and Execution by PM, FORSCOM, ATEC

• Phased Reporting at Key Programmatic Decision Points

• Benefits:

―More efficient use of all available data.

―Enables planning for and visibility of risk reduction during development.

―Increases probability of successful IOT.

―Reduces the impact on the operational force.

In-Stride Operational Assessments

Create understanding of the maturity of a system continuously in order to increase the system's chance of passing and reduce the scope (schedule) of IOT.

Page 14


Page 15

backup

Page 16

[Backup chart: Percent Capability Observed (0 to 100) vs. Months (0 to 36): Critical Operational Capability, baseline and observed performance, spanning Contractor DT, Government DT, and OT]

Page 17

[Backup chart: Percent Capability Observed (0 to 100) vs. Months (0 to 36): Critical Operational Capability, baseline and observed performance, spanning Government DT and OT]