Meeting the Challenges of Unmanned and Autonomous System Test and Evaluation
Thomas Tenorio, Subject Matter Expert for UAST Executing Agent
10 March 2010, USC
Activities
• Working Group
  – Roadmapping, Surveys, Networking, Tech Eval
• BAA Supply Space Surveys
• UAST Roadmap
  – FY2009 Unmanned Integrated Roadmap Findings
  – FY2010 UAST Roadmap
• Community / Professional Organization Engagement
  – Test Communities
  – Standards Communities
  – ITEA: tutorials and workshops
  – AUVSI
  – INCOSE (International Council on Systems Engineering)
The Test Resource Management Center
• TRMC under AT&L
• UAST part of T&E S&T
[Org chart: T&E S&T (6.3), CTEIP (6.4), JMETC (6.5); T&E S&T focus areas shown include UAST, DET, MST, NII, SET, NST]
TRMC Mission and Vision
• Mission
  – "Plan for and assess the adequacy of the…MRTFB…[and] to provide adequate testing in support of development, acquisition, fielding, and sustainment of defense systems; and, maintain awareness of other T&E facilities and resources, within and outside the Department, and their impacts on DOD requirements."
• Vision
  – The DoD T&E ranges and facilities will be fully capable of supporting the Department with quality products and services in a responsive and affordable manner.

Major Range and Test Facility Base
• 22 ranges
• Tagged as National Asset
UAS Test Bed and Environment
S&T for physical test capabilities associated with the Test Bed and Environment:
– Stimulus selection or generation
– Sensors
– Data Acquisition and Management
– Data compression and characterization (signatures)
– Data reduction/analysis/interpretation
– Power Technologies: providing increased mission time and capability without increasing the logistics footprint
– Test conduct, including situation awareness
– Test operations safety
– FAA and other civil coordination
– C4ISR interoperability
TRMC Research and Interaction
• Test and Evaluation, Science and Technology
• Three groups organized by TRL level
  – T&E S&T: TRL 3–6 (6.3 funding), $95M / year
  – CTEIP: TRL 6–9 (6.4 funding), $140M / year
  – JMETC: 6.5 funding, $10M / year
TRL: Technology Readiness Levels
• Criteria used to assess project eligibility and status
  – Measure system maturity
  – Non-linear in time, money, and effort
  – Not well understood by the community
T&E S&T
• Test and Evaluation, Science and Technology
• 7 Focus Areas: UAST, SET, NST, NII, MST, HSHT, DET
Unmanned Systems Integrated Roadmap
• The UAS Challenge
  – Capabilities (311 JCA named targets)
  – Systems (138 systems)
  – Performance Envelope Aspects (41)
  – Technologies (17)
UAST Augmented Study Methodology
[Process flow: Documented (PoR) UAS Missions and Technologies per USIR → Refined and Extended (beyond PoR) UAS Missions and Technologies with Working Group → Draft Drivers, Use Cases, Test Concepts, Test Plans → Draft Test Requirements → Validation → Test Resource Requirements → Test Resource Survey Baseline → Needs Analysis (Gap) → Draft Test Resource Need Statements → Validation → Refined Test Resource Need Statements → Develop Solutions → Prioritize Solutions → BAA, RFI, White Papers & Proposals → Roadmap]
Supporting artifacts and outputs:
– Refined Drivers, Use Cases, Test Concepts, Test Plans
– Facility-Specific Descriptions & Requirements (hard numbers)
– Tri-Service Baseline Capability
– T&E needed to test specific UAS technologies
– Baseline T&E capabilities (not intended as an exhaustive survey of all T&E capabilities that exist and could support UAS T&E)
– Gaps sensitive to new approaches and new paradigms
– Refined needs statements
– Relevant commentary on test
– Urgent needs for UAS
– Outcome timelines
UAST Tools & Techniques
[Framework diagram spanning cross-domain commonality to domain specificity across Space, Air, Ground, and Maritime (surface/under). Elements: Assessing Effects and Capabilities; Reference Data Sets (Ground Truth, Decision & Behavior); Predicting Behavior; Test and Evaluation of UAS as Highly Complex Systems; UAST Technologies; Protocols & Design; Emulating Mission & Environmental Complexity with Assured Safety; Near/Mid/Far timeframes; Risks; Constraints; UAS Safety, Suitability, Survivability, and Effectiveness; Means, Ways, and Ends to UAST as a Value Proposition; V&V "What if?"; Integrity Assessment; Surrogate & Simulate; Emulate; Live; Contrived Live; in situ; Test Bed and Environment. Investment is use-case driven.]
The Target for UAST

BAA Technology Investment Categories

Integrating Findings of the FY2009 Unmanned Integrated Roadmap
Ensure test capabilities support the fielding of unmanned systems that are effective, suitable, and survivable

UAS for Operational Necessity (rapid tempo)
• Evolutionary acquisition with JUTLs & JUONs
• Booming capability development
• Capability challenge of 311 named systems
• Majority of systems are non-PoRs
• Fielding tech in months: 4–6 months for joint operational necessity

Test of UAS (increasing tempo for Integrated T&E)
• Accelerate incrementally improving T&E
• Augment legacy and improvised capability
• Emerging arguments for UAST value proposition
• AS-IS: OT&E emphasis with emergent Joint T&E
• Fielding UAST in years: 3 years in S&T and 4 years in T&E
The Interacting Communities of UAS Advantage
[Diagram: UAS (Warfighter Capability) ↔ UAST (Knowledge, Risk Reduction) ↔ UAST S&T (Next-Generation Technology for UAST)]
Value Proposition of UAST
• UAST to further ensure Safe, Suitable, Effective, and Survivable UAS
• Pace and tempo to secure advantage for UAS
• Advancing knowledge-generation capabilities for risk reduction in the production of UAS

The Tester Evolution Loop
[Diagram: Warfighter Operations and Effects drive a capability loop of concept → design → test → production → deployment; Capability Development and Capability Test generate Warfighter Knowledge]
1) Must get inside the Capability Evolution Loop
2) Must endure throughout the Capability operational life cycle
UAST Driver Overview
• UAST has primary drivers
  – Autonomous Behavior of unmanned and autonomous systems that sense, understand, and act upon the environment in which they operate
  – Safety of autonomous systems in mission and environment
• Secondary drivers
  – Sensory Capacity & Perception Loops (Observe)
  – Knowledge Models of Ground Truth & Behavior (Orient)
  – Decision Making (Decide)
  – Supervised Autonomy Behavior (Act)
• Context
  – Systems testing of human-independent behavior
  – Emulating UAS in mission and environmental complexity with assured safety
  – Assessing UAS effects and capabilities in Joint Capability Areas
UAST Exemplar Overview
Leveraging S&T for T&E Capability Development

Standard Systems T&E
1. System-Level T&E
2. Mission & Environment T&E
3. Joint Capability Areas T&E

Autonomous System T&E
1. Predicting Autonomous Behavior
2. Emulating Mission and Environmental Complexity with Assured Safety
3. Assessing UAS Effects and Capabilities

UAS Test & Evaluation Focus Areas
[Diagram: Non-Intrusive Instrumentation, Spectrum Efficiencies, Netcentric Systems, Multi-Spectral Sensors, Unmanned & Autonomous Systems Test]
UAST Systems Engineering Capabilities Reference Framework
1. Predicting Unmanned and Autonomous System Behaviors (T&E/E&A)
2. Emulating Mission and Environmental Complexity with Assured Safety (T&E/E&A)
3. Assessing UAS Effects and Capabilities (E&A)
4. Autonomous Test Protocols and Design (T&E)
5. Test Bed and Environments for UAST (T&E)
6. UAST World Models (Ground Truth, Decision, & Behavior) (T&E)
7. Tools and Techniques for Systemic UAST (T&E/E&A)

Autonomous Capabilities Model [OODA]: Observe → Orient → Decide → Act, for an Autonomous System/Systems/SoS in Mission Context, evaluated against Safety, Effectiveness, Agility, Suitability, and Survivability.

T&E-Wide Knowledge Management: Test Design; Test System Preparation; Readiness Assessment; Safety Guard; Protocol Management; Sensor Management; Data Quality Management; Data, presentation & session management; Interoperability infrastructure; Communications and networking.
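The Observe–Orient–Decide–Act cycle underlying the Autonomous Capabilities Model can be sketched as a simple control loop. The sensing threshold, situation labels, and action names below are hypothetical stand-ins for illustration only, not anything specified in the framework:

```python
class OODAAgent:
    """Minimal Observe-Orient-Decide-Act loop for an autonomous
    system under test; sensing and acting are hypothetical stubs."""

    def observe(self, world):
        # Observe: sample raw sensor data from the environment.
        return world["sensor_reading"]

    def orient(self, observation):
        # Orient: interpret the observation against a world model.
        return "obstacle" if observation > 0.5 else "clear"

    def decide(self, situation):
        # Decide: choose an action for the assessed situation.
        return "avoid" if situation == "obstacle" else "proceed"

    def act(self, action):
        # Act: issue the chosen command to the vehicle.
        return action

    def step(self, world):
        # One full pass through the OODA cycle.
        return self.act(self.decide(self.orient(self.observe(world))))

agent = OODAAgent()
print(agent.step({"sensor_reading": 0.9}))  # → avoid
print(agent.step({"sensor_reading": 0.1}))  # → proceed
```

A tester instrumenting such a loop can probe each stage independently, which is one way to read the framework's separation of Observe/Orient/Decide/Act capabilities.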
T&E Capabilities Reference Framework
1. Predicting Unmanned and Autonomous System Behaviors
2. Emulating Mission and Environmental Complexity for Assured Safety
3. Assessing UAS Effects and Capabilities
4. Autonomous Test Protocols and Design
5. Test Bed and Environments for UAST
6. UAST Reference Data Sets (Ground Truth, Decision, & Behavior)
7. UAST Tools and Techniques
Essence of UAST Challenge
• Determine, across UAS(s) OODA
  – False Positives and False Negatives
  – Dynamic limits of behavior
  – Integrity limits of intended functions
• Regarding mission
  – Effectiveness
  – Suitability
  – Survivability
  – Safety
• At a >10-fold reduction in
  – Cycle time
  – Cost
• And with exemplary ROI of S&T
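The false-positive/false-negative determination above is, in effect, a confusion-matrix question over test trials. A minimal sketch of computing those rates from scored trial outcomes; the trial log here is invented for illustration:

```python
def error_rates(trials):
    """Compute (false-positive rate, false-negative rate) from a list
    of (predicted, actual) trial outcomes, e.g. 'system declared a
    target' vs. 'target actually present'."""
    fp = sum(1 for pred, actual in trials if pred and not actual)
    fn = sum(1 for pred, actual in trials if not pred and actual)
    negatives = sum(1 for _, actual in trials if not actual)
    positives = sum(1 for _, actual in trials if actual)
    return fp / negatives, fn / positives

# Invented trial log: (system declared target, target actually present)
trials = [(True, True), (True, False), (False, True),
          (True, True), (False, False), (True, True)]
fp_rate, fn_rate = error_rates(trials)
print(fp_rate, fn_rate)  # → 0.5 0.25
```

The "user-controllable degree" of false positives and negatives mentioned later in the deck corresponds to moving a decision threshold to trade one rate against the other.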
S&T Opportunities for T&E of Autonomy with Assured Safety
1) Tools for Design of Experiments in system, multisystem, and system of systems scenarios considering also implications of mission scenarios, opposition capability, and physical context.
2) Ability to incorporate UAS design models into warfighter-scope models/simulations in order to anticipate mission suitability, safety, effectiveness and survivability (including countermeasures).
3) Determining how to manipulate live physical scenarios including Red Forces. Acquiring ground truth data during actual test operations.
4) Bayesian Belief Networks and similar tools for conflating test data to mission-level expectations.
5) Ensuring that Net Centric Systems and relevant test assets are sufficiently agile to enable span and dynamics of UAS test scenarios.
6) Systems architecting and engineering of a family of composable UASTs.
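Opportunity 4's conflation of test data to mission-level expectations can be sketched with a tiny Bayesian network evaluated by enumeration. The network structure, component names, and all probabilities below are illustrative assumptions, not program data:

```python
from itertools import product

# Illustrative priors from hypothetical component-level test results.
p_sensor_ok = 0.95   # assumed pass rate from sensor bench tests
p_nav_ok = 0.90      # assumed pass rate from navigation trials

# Conditional probability of mission success given component states
# (values are invented for illustration).
p_success = {
    (True, True): 0.98,
    (True, False): 0.40,
    (False, True): 0.55,
    (False, False): 0.05,
}

def mission_success_probability():
    """Conflate component-level test data into a mission-level
    expectation by enumerating the joint distribution."""
    total = 0.0
    for sensor_ok, nav_ok in product([True, False], repeat=2):
        p_state = ((p_sensor_ok if sensor_ok else 1 - p_sensor_ok)
                   * (p_nav_ok if nav_ok else 1 - p_nav_ok))
        total += p_state * p_success[(sensor_ok, nav_ok)]
    return total

print(round(mission_success_probability(), 4))  # → 0.9009
```

A real belief network for this purpose would have many more nodes and learned conditional tables, but the mechanism, propagating component-level evidence up to a mission-level estimate, is the same.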
Readiness Assessment
Readiness Assessment Capability
What: Discover internal bugs and vulnerabilities and external incompatibilities in UASs, across UASs, and in T&E systems.
Where: In executable code, source code, databases, system models, mission simulations, and SoS configurations; at development, integration, warfighter, and depot locations.
Why: Generate warfighter-confident knowledge. Cut test cycle time and cost in half. User-controllable degree of false positives and false negatives.
How: Code inspection; test beds not required. Mathematically rigorous assessment method and tools, enabled by next-generation pattern-recognition semiconductor chips with throughput ≈ 1 Gb/sec.
When: TRL 3 @ 2010, TRL 5 @ 2011, TRL 6 @ 2012, TRL 9 @ 2014
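The code-inspection approach described here, discovering defects without a test bed, resembles static pattern scanning. A toy sketch of that idea; the defect patterns and sample source are invented, and a production capability would use far richer analysis than regular expressions:

```python
import re

# Invented patterns for defect classes a readiness scan might flag.
RISK_PATTERNS = {
    "unchecked division": re.compile(r"/\s*0\b"),
    "hardcoded credential": re.compile(r"password\s*="),
}

def scan_source(source):
    """Flag source lines matching known defect patterns -- a static
    check that needs no test bed, per the capability described above."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for label, pattern in RISK_PATTERNS.items():
            if pattern.search(line):
                findings.append((lineno, label))
    return findings

sample = "x = y / 0\npassword = 'hunter2'\nz = 1\n"
print(scan_source(sample))  # → [(1, 'unchecked division'), (2, 'hardcoded credential')]
```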
Safety Assurance
What: Discern and referee the contest between autonomy and safety, both (a) test safety, including, e.g., FAA requirements, and (b) operational safety, e.g., fratricide and harm to civilians. Enable both static and evolving limits. Assess the efficacy of UAS self-test capability, resilience to cyber threats, and probable error in M&S evaluations of UAS(s).
Where: Throughout 5000.02 phases and warfighter stages; across UAS, UASs, and SoS; spanning both on-board and administrator functions.
Why: Avoid unintended consequences of UAS operations. Generate warfighter-confident knowledge. Cut test cycle time and cost in half. User-controllable degree of false positives and false negatives.
How: A "Do No Harm" OODA loop inside the autonomy loop of both the UAS and the UAST. A method for preempting UAS behaviors, separate from the Planner capability, preferably non-destructive.
When: for autonomy Level 1 @ 2010, 2 @ 2011, 3 @ 2012, 4 @ 2013, 5 @ 2014
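The "Do No Harm" inner loop described above might be sketched as a guard that can preempt the autonomy loop's chosen action non-destructively. The action names, state fields, and safety predicate here are illustrative assumptions:

```python
def safety_guard(proposed_action, state, unsafe):
    """Inner 'Do No Harm' check: veto an action the outer autonomy
    loop proposed and substitute a safe behavior, rather than
    destroying the platform. `unsafe(action, state)` is a
    hypothetical safety predicate supplied by the test system."""
    if unsafe(proposed_action, state):
        # Preempt: fall back to a safe holding behavior, kept
        # separate from the mission planner.
        return "loiter"
    return proposed_action

# Illustrative predicate: engagement is unsafe near civilians.
def unsafe(action, state):
    return action == "engage" and state.get("civilians_nearby", False)

print(safety_guard("engage", {"civilians_nearby": True}, unsafe))   # → loiter
print(safety_guard("engage", {"civilians_nearby": False}, unsafe))  # → engage
```

Running the guard as its own loop, with authority to override but not to plan, matches the slide's separation of the preemption method from the Planner capability.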