
DT&E: Evaluation Framework to Statistical Integrated Test Design

First the “E” then the “T”

Dr. Suzanne M. Beers, MITRE Corp support to DASD(DT&E)

14 November 2013

ITEA Annual Symposium, Tech Track 2

TEMP should articulate a logical evaluation & test plan that informs the program’s decisions:

– How acquisition, technical, and programmatic decisions will be informed by evaluation

– How the system will be evaluated

– How test and M&S events will provide data for evaluation

– What resources are required to execute test, conduct evaluation, and inform decisions

2

Purpose and Overview

[Diagram: Decisions, Evaluation, Test/M&S, and Resources/Schedule in a loop — Decisions define the Evaluation; the Evaluation informs Decisions; the Evaluation defines Test/M&S, which feed Data back to the Evaluation; Test/M&S define the Resources and Schedule, which Execute the tests.]

DT&E: First focus on the “E”; then plan the “T”

Testers naturally want to test… BUT, T&E’s purpose is to inform – decisions, product development, etc.

SO, the first step should be to define an evaluation strategy – how to evaluate system performance against relevant requirements to inform decisions?

THEN, define the test events that will generate data to feed the system evaluation…

…to inform the decisions that need to be made

3

Why Focus on Evaluation?

Evaluate system capability to support decisions

PM, DT&E and SE Evaluation Framework

Critical Developmental Issues (CDIs): the questions to be answered to inform decisions

Developmental Evaluation Objectives (DEOs): the system capabilities evaluated to answer CDIs

Technical Measures (TMs): measured during test to quantify performance
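The CDI → DEO → TM chain is essentially a traceability tree from decision questions down to test data. As a minimal sketch of that structure (the class names, example question, and measure names here are hypothetical illustrations, not from the briefing), one could check that every question traces to at least one capability and measure:

```python
from dataclasses import dataclass, field

@dataclass
class TechnicalMeasure:            # TM: quantified during test
    name: str
    key: bool = False              # True for *TM (KPP/KSA-related key measures)

@dataclass
class DevEvalObjective:            # DEO: a system capability to evaluate
    name: str
    measures: list[TechnicalMeasure] = field(default_factory=list)

@dataclass
class CriticalDevIssue:            # CDI: a decision-focusing question
    question: str
    objectives: list[DevEvalObjective] = field(default_factory=list)

def is_traceable(cdi: CriticalDevIssue) -> bool:
    """A CDI is answerable only if every DEO beneath it has measures."""
    return bool(cdi.objectives) and all(deo.measures for deo in cdi.objectives)

# Hypothetical fragment of a framework
cdi = CriticalDevIssue(
    question="Does the radar provide coverage sufficient to track LEO objects?",
    objectives=[DevEvalObjective(
        name="Radar coverage",
        measures=[TechnicalMeasure("LEO uncued search coverage", key=True)])])
assert is_traceable(cdi)
```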

USD(AT&L)

Service Acquisition Executive

Program Executive Officer

Program Manager

Acquisition Decisions

Programmatic Decisions

Technical Decisions

Evaluation Strategy Informs Decisions

Inform Decisions – Early & Often

Early: Use Evaluation Framework support to decisions (in the initial TEMP) to inform the RFP for the initial contract in advance of the first major milestone. Industry then knows how they will be evaluated.

Often: Inform decisions based on performance, reliability, interoperability, and cyber (DT&E Assessments) throughout the lifecycle.

GFE acceptance … Down selects … P3I determination … Long-lead item approval

Award fee determination … Run for record readiness review … Shipment approval

Component integration … Trade study decisions … ATO/IATO … M&S VV&A

Technology readiness … EMD / production readiness

DT&E provides the decision-maker with system capability information, with the operational “so what?” and support for the acquisition “now what?”

TRD provides system capability requirement definition. Consider mission capabilities (KPP/KSA; CDD, STAR, CONOPs, Enabling Concept).

Evaluate System Capability in a Mission Context

Developmental Evaluation Framework (level of detail for the TEMP)

[Diagram: a Critical Dev Issue (CDI) poses the capability question; DEO 1, DEO 2, DEO 3 are the system capabilities beneath it; TM 1, TM 2, *TM 3 are the technical measures (* = KPP/KSA-related). A systems-engineering decomposition: evaluating system capability to inform decisions.]

Illustrates how DT&E will evaluate the system’s capabilities and inform acquisition, technical, and program decisions

8

Developmental Evaluation Framework

[Matrix: a notional Developmental Evaluation Framework — CDIs #1–#4 as columns, DEOs #1–#10 as rows, with technical measures TM1–TM19 mapped into the cells; key measures are bolded and asterisked (e.g., *TM5, *TM7, *TM12).]

Critical Developmental Issues (CDI): key questions to focus developmental evaluation to inform decisions

Developmental Evaluation Objectives (DEO): system capabilities

Technical Measures (TM): measures of critical system characteristics

Bold and asterisked TM (*TM): key technical measures, related to the CDD’s KPP/KSA or a technically important system characteristic

9

Example 1 - Space Fence

Ground-based S-band radar to detect, track, and report on space objects to provide space situational awareness

[Diagram: Space Fence Tech & Ops Decomposition — operational requirements feed the OT framework and technical requirements feed the DT framework; both decompose into technical & operational issues, objectives, and measures. Example branches: Radar → Coverage, Accuracy, Sensitivity (Time Accuracy, Angle Accuracy, Range Accuracy); Space Event Processing → Correlation, Detect, Track (# Detect, Size Detect, Accurate Detect).]

COIs – Inform Ops Evaluation Decisions

11

Mission: Space Fence supports the space control mission by providing space surveillance data

Does SF provide data to support space event processing?

Does SF provide data to support resident space object processing?

Does SF provide data to support characterization?

Does SF support net-centric operations?

Can SF be sustained in the operational environment?

Can SF be controlled to meet mission taskings?

CDIs – Inform Capability Decisions

12

Technical Mission Statement: Design and build a ground based radar system to provide LEO and MEO coverage to meet space situational awareness mission requirements

Does the radar provide coverage, sensitivity, and accuracy sufficient to detect and track LEO and MEO objects?

Is the radar data processing, handling, and storage sufficient to characterize, correlate, track, and report space objects?

Are command and control and interfaces sufficient to provide tasking to the radar and surveillance information to the SSA customer?

Are environmental effects sufficiently planned for and executed?

Are Life Cycle Cost factors considered and balanced with other design factors sufficient to provide a reliable, maintainable, available, and economical system?

Are planned and executed system and information protections sufficient to ensure information assurance and physical security?

Space Fence DT Evaluation Framework

13

Critical Developmental Issues → Developmental Test Objectives

CDI #1: Does the radar provide coverage, sensitivity, and accuracy sufficient to detect and track LEO and MEO objects?

CDI #2: Is the radar data processing, handling, and storage sufficient to characterize, correlate, track, and report space objects?

CDI #3: Are command and control and interfaces sufficient to provide tasking to the radar and surveillance information to the SSA customer?

CDI #4: Are environmental effects sufficiently planned for and executed?

CDI #5: Are planned and executed system and information protections sufficient to ensure information assurance and physical security?

CDI #6: Are Life Cycle Cost factors considered and balanced with other design factors sufficient to provide a reliable, maintainable, available, and economical system?

[Table: the framework columns run Mission & CDIs → Developmental Evaluation Objectives → DT Measures; the flattened contents are listed below. * marks key (KPP/KSA-related) measures.]

Technical Mission Statement: Design and build a ground based radar system to provide LEO and MEO coverage to meet space situational awareness mission requirements

Developmental Evaluation Objectives:
Radar coverage
Radar sensitivity
Observation accuracy
System calibration
Surveillance and characterization process
Latency and throughput
Space Operations Center (SOC) capability
Sensor tasking capability
Interface
Electromagnetic Environment Effects (E3)
Radio frequency compliance
Uncued operations capability
Cued operations capability
System capacity
Data reception and retention
Security
Information assurance
Program protection planning
Reliability
Maintainability
Operational availability
Logistics/Maintenance/LCC
Training

DT Measures:
*LEO uncued search coverage
*LEO cued search coverage
*Coverage flexibility
*LEO sensitivity
*MEO sensitivity
LEO/MEO/HEO simultaneous operations
Closely spaced operations resolution
*Angle (az/el) accuracy
*Range accuracy
*Time accuracy
*RCS accuracy
*Obs tagging integrity (includes correlate & tag)
Atmospheric calibration
Systematic error calibration
RCS calibration
Radar calibration
Metric obs formation and dissemination
RCS determination and dissemination
Space object identification
Continuous surveillance fence
Unknown object detection and tracking
Re-entry (decay/de-orbit) processing
On-orbit maneuver support
Break-up detection and processing
Anti-satellite support
Proximity operations support
Cluster processing
Launch processing
On-orbit event processing
Sensor metric obs per day
Sensor space object storage capacity
SOC metric obs storage capacity
SOC space object storage capacity
Radar capacity
System growth
Metric obs processing latency
*Observation throughput
*Tasking latency
Client request latency
Client request throughput
Computational latency
Computational throughput
Catalog synchronization across the SSN
Remote operations
UCT processing
Protection and prevention/negation ops support
Database management
Health and status monitoring
Intra-Space Fence handoffs & self tasking
Intra-Space Surveillance Network handoffs & cueing
Direct site tasking/cueing
Flash element set scheduling
Self tasking
*External interface
*Data and metadata
Data flow
*Detect & Report EMI
Reduce interference from external sources
Negate in-band RFI from external sources
*Restrict RF output (not cause detrimental interference)
Electromagnetically compatible with itself & ops environment
*Operate within S-band
Operator-settable exclusion zones
RF power limits
Element set data management
Metric and characterization data management
Datastore maintenance
*Physical security
Data security
Communication security
*IA controls
*Continuity controls
*Security design and configuration controls
*Enclave boundary defense controls
*Enclave and computing environment controls
*Identification and authentication controls
*Physical and environmental controls
Program protection planning
*Anti-tamper controls
*Mean time between failure (MTBF)
*Mean time between critical failure (MTBCF)
Mission-essential functions list (MEFL)
Service life
Independence of failures
Failure detection
Operational restoration
Fraction of failures isolated
*Mean time to repair (MTTR)
*Maximum maintenance time (MMAX)
*Mean time between downing events (MTBDE)
*Mean down time (MDT)
Safety
Human Factors Engineering
Logistics design consideration
Preventative maintenance
Materials (includes corrosion control)
Documentation
Ops and maintenance TOs
*Sim-over-Live capability
*Training capability
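For context on how the reliability and maintainability measures above combine into the Operational availability objective: the standard availability definitions (general engineering practice, not equations from the briefing; the numeric values are hypothetical) are

```latex
% Inherent availability from reliability/maintainability measures
A_i = \frac{\mathrm{MTBF}}{\mathrm{MTBF} + \mathrm{MTTR}}

% Operational availability from downing events and down time
A_o = \frac{\mathrm{MTBDE}}{\mathrm{MTBDE} + \mathrm{MDT}}

% Illustrative (hypothetical) values: MTBDE = 500 h, MDT = 10 h
A_o = \frac{500}{500 + 10} \approx 0.98
```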

14

Example 2 – GPS Enterprise

Global Positioning System (GPS) Enterprise modernization: satellite (GPS III), control segment (OCX), user equipment (MGUE)

COIs – Inform Operational Decisions

15

Mission: GPS provides precise information to properly equipped users in support of specific mission objectives

Can GPS broadcast PNT data that supports the mission of properly equipped users?

Can the warfighter employ PNT data?

Does GPS command, control, and monitoring support all functions of GPS operations?

Does GPS provide MP in EW environments to properly equipped users?

Does GPS sustainment support mission operations?

CDIs – Inform Operational Readiness & Program Acquisition Decisions

16

Enterprise EF CDI – Guide cross-segment evaluation for operational readiness decisions

Segment EF CDI – Guide evaluation for segment acquisition decisions

Can GPS provide accurate PNT data to users?

Does GPS support NAVWAR operations?

Can GPS support secondary payload missions?

Can the control segment command and control the constellation?

Is GPS secure?

Is GPS sustainable?

Can MGUE support both legacy and modernized signals?

Can MGUE be integrated into lead platforms to support msn ops?

Is MGUE secure?

Can MGUE operate in a NAVWAR environment?

Is MGUE sustainable?

GPS DT Evaluation Framework

17

CDI #1: PNT | CDI #2: NAVWAR | CDI #3: Secondary Payload | CDI #4: C2 | CDI #5: Security | CDI #6: Sustainable

[Table: GPS DT Evaluation Framework — the six CDIs above head the columns; Developmental Evaluation Objectives and their supporting Test & Analysis Events fill the cells. The flattened contents are listed below, with events that repeat across columns consolidated.]

Developmental Evaluation Objectives:
Signal-in-Space (SIS)
Reference Frame
GPS Time
Constellation Management
COMSEC
Net-Centric Services / Net-Centric Enterprise Services (NCES) Compliance
SIS Monitoring
Additional Payloads
Position, Velocity, Time (PVT)
Signal in Space Availability & Continuity
Backwards Compatibility
GPS Key Management

Test & Analysis Events:
SV Integration Test Proc C/O
OOH Procedure C/O
OOT Procedure C/O
GNST Integration Test
SV01 Early Compatibility Test
SV01 Integration Test
SV01 Launch Compatibility Test
SI Demo 3: L-Band Back Compat
A-1: Simulated Operations
A-2: Passive Contact
A-3: NOOP Command
A-5: NAV Upload
G-4: NAV Init, USNDS Command
G-6: NAV Upload Over WSMR
G-7: GPS III SV to OCX OO C/O
NAV Upload
Passive Contact
Text Msg Transmit Time Analysis
User Range Error (URE) Analysis
User Range Rate (URRE) Analysis
User Range Acceleration Error (URAE)
User Range Accuracy (URA) Analysis
Height/Cross/Along-Track Error
Signal Power
SIS RF Power Measurement
OTAR Transmission
OCX Transition Performance
LCC/LCS Readiness Test (LRT) - 1b
Exercise 3; Exercise 5
Rehearsal 2; Rehearsal 3; Rehearsal 5
Mission Dress Rehearsal (MDR)
Space Vehicle Outages
System Failures
Special Method E1 - Integrity Status
Special Method E2 - Single Pt Failures
Special Method E3 - Failure Behavior
Special Method E6: Failure Prop
PSICA Reqmts Analysis

18

Example 3 - SBIRS

Space-based infrared sensors and ground-based control and processing to provide missile warning, missile defense, technical intelligence, and battlespace awareness

COIs – Inform Ops Readiness Decisions

19

Mission: SBIRS supports strategic and theater missile warning (TMW), missile defense operations, technical intelligence, and battlespace awareness

# COI

1 Can SBIRS support North American and rest-of-world MW operations in a non-nuclear environment?

2 Can SBIRS survivable and endurable assets support MW/NA, global attack, and defensive operations through all phases of nuclear conflict?

3 Can SBIRS support homeland and theater MD operations?

4 Can SBIRS support theater combatant commanders through theater MW operations?

5 Can SBIRS collect, process, archive, and disseminate exploitable IR event information to the TI community?

6 Can SBIRS enhance operational/intelligence assessments to support the combatant commander through timely characterization of the battlespace?

7 Does SBIRS enable Ops to manage ground/space elements to support mission tasking/system health?

8 Do SBIRS maintenance and sustainment support continuous operations?

CDIs – Upcoming Acquisition Phases Crossed with Mission Areas

20

Technical Mission Statement: Design and build satellites, infrared sensors, ground command and control, and mission data processing to inform the MW, MD, TI, and BA missions with IR information

Ground Block 10.3

Ground Block 20 (Inc 2)

Space Vehicle Readiness

Mobile Ground System (S2E2)

Missile Warning

Missile Defense

Technical Intelligence

Battlespace Awareness

SBIRS DT Evaluation Framework

21

Mission Areas Supported

Critical Developmental Issues

Developmental Evaluation Objectives

Measure for CDI/DEO eval

Technical Measures

Example 4 – Enhanced Polar System

22

Protected SATCOM (EHF) for polar-region users consisting of 4 segments: EPS Payload Segment, EPS Terminal Segment, EPS Control and Planning Segment (CAPS), EPS Gateway Segment

COIs – Inform Ops Readiness Decisions

23

Does EPS provide protected, extended data rate communication to users in the North Polar Region?

Can EPS be controlled to support warfighting operations?

Does EPS maintenance and sustainment support mission operations?

CDIs – Inform Capability & Integration Decisions

24

Can the terminals communicate with the payloads?

Is CAPS capable of mission planning?

Can CAPS command and control PL using in-band?

Is CAPS capable of utilizing out-of-band T&C through the Host Interface?

Is EPS secure?

Is the Gateway capable of connecting polar users and mid-lat users?

Is EPS sustainable?

EPS DT Evaluation Framework

25

System capabilities (DEOs)

TM to evaluate DEO/CDI

Linked Integrated System Tests

KPP/KSA-associated TM highlighted

Enterprise CDIs

Integrated Test (IT) is intended to…

– Combine test resources (events, assets, ranges)

– Generate data to evaluate using the DT or OT evaluation framework – independent evaluation

– Inform DT or OT decision-makers – different decisions

Integrated Test is NOT intended to be…

– A DT&E graduation exercise

– An OT&E pre-exam

How should I design an analytically rigorous IT?

– At the objective level, define common input factors/conditions and output measures of interest

– Develop an input, process, output (IPO) diagram to illustrate the IT design

– Apply STAT to generate common test cases (a sketch follows below)
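As a minimal sketch of that last step — the factor names and levels below are illustrative placeholders, loosely echoing the coverage factors in the comparison table later in this brief, not a prescribed design:

```python
from itertools import product

# Hypothetical common factors and levels shared by the DT and OT objectives
factors = {
    "altitude":    ["LEO-low", "LEO-high", "MEO"],
    "inclination": ["low", "mid", "high"],
    "cueing":      ["cued", "uncued"],
    "orbit_shape": ["circular", "eccentric"],
}

# Full-factorial design: every combination of factor levels is one test case
test_cases = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(test_cases))  # 3 * 3 * 2 * 2 = 36 runs

# In practice a STAT/DOE tool would trim this to a fractional or optimal
# design when 36 runs is more than the test schedule can afford.
```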

26

THEN Plan the Test -- Integrated DT/OT

[Diagram: the OT EF and DT EF side by side. DT EF: technical performance questions → Critical Dev Issue (CDI) → DEO 1, DEO 2, DEO 3 (technical capabilities, KPP/KSA-related) → TM 1, TM 2, *TM 3 (technical measures). OT EF: operational performance questions → Critical Ops Issue (COI) → OT Obj 1, OT Obj 2, OT Obj 3 (operational capabilities, some are KPPs) → MOE 1, MOE 2, *MOS 3 (operational measures).]

27

EFs Define IT Data Needs

Potential common data for IT
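One way to make “potential common data” concrete is to intersect the two frameworks’ measure lists; a minimal sketch, with hypothetical measure names standing in for real TMs and MOEs:

```python
# Hypothetical measure names drawn from a DT EF and an OT EF
dt_measures = {"track accuracy", "detection range", "observation throughput"}
ot_measures = {"track accuracy", "mission success rate", "observation throughput"}

# Measures both evaluations need are candidates for common IT data collection
common_it_data = dt_measures & ot_measures
print(sorted(common_it_data))  # ['observation throughput', 'track accuracy']
```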

IT Design – Objective Comparison

28

[Table: objective-level comparison of the OT&E and DT&E frameworks. Columns: Objective (Capability) | Measure Description | Quantitative Value | Factors | Notes.]

Coverage (OT&E):
– Number of tracks per object (KPP); value varies by altitude (CDD Table 6-1, pg 25); factors: altitude, inclination, cued/uncued, time (27-hour period), orbit shape
– Number of objects simultaneously tracked; ≥200; note: similar DT measure in the Capacity objective
– Detectable target size (KPP); factors: size, altitude
– Object discrimination; “best available”; note: similar DT measure in the Sensitivity objective

Radar coverage (DT&E):
– Range; min = 100 km, max ≥ 40,000 km; source: TRD Para 3.1.4.5.1, pg 18
– Track angle; ±70 degrees
– Configurable; operator configurable
– LEO uncued surveillance coverage; factors: altitude, inclination, time (27-hour period), orbit shape
– LEO cued surveillance coverage; factors: altitude, inclination, time (27-hour period), orbit shape
– Coverage flexibility; enhanced sensitivity over a settable region in space to meet LEO performance requirements; factors: target size, polarization, altitude; source: TRD Table 3, pg 20; note: measure may be a better fit in the Sensitivity DT objective
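To illustrate how the quantitative values above become pass/fail evaluations during analysis — a minimal sketch; the thresholds are paraphrased from the table rows and the function names are hypothetical:

```python
# Hedged sketch: threshold checks mirroring the table's quantitative values
def meets_range_coverage(min_range_km: float, max_range_km: float) -> bool:
    # Radar coverage row: cover from at most 100 km out to at least 40,000 km
    return min_range_km <= 100 and max_range_km >= 40_000

def meets_track_capacity(objects_tracked: int) -> bool:
    # Coverage row: at least 200 objects simultaneously tracked
    return objects_tracked >= 200

print(meets_range_coverage(95, 41_000), meets_track_capacity(215))  # True True
```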

[Diagram: the IT design as an IPO (input, process, output) — common DT and OT objectives are the process, common factors & levels are the input, and associated measures are the output. A common DT & OT objective with common factors and levels creates one test design, from which DT&E draws the DT measures and OT&E draws the OT measures.]

IT Design Example – IPO Diagram

29

Summary

DT EF focuses system evaluation (in a mission context) to inform programmatic, acquisition, and capability decisions

– CDI (decision) → DEO (capability) → TM (measure)

Design test plans to generate data to feed EF

– Use STAT / DOE to design rigorous and complete test campaigns

Analyze data to assess performance against DT measures and objectives; inform decisions (CDIs)

Document evaluation framework, test phasing, resources in TEMP

30

[Diagram (repeating the overview): Decisions, Evaluation, Test/M&S, and Resources/Schedule — Decisions define the Evaluation; the Evaluation informs Decisions; the Evaluation defines Test/M&S, which feed Data back to the Evaluation; Test/M&S define the Resources and Schedule, which Execute the tests.]

Questions/Comments?

Suzanne M. Beers | [email protected] | 719-572-8257