TRANSCRIPT
©2007 The MathWorks, Inc.
Model-Based Design for Safety Critical Applications
Bill Potter, The MathWorks
Attributes of Safety Critical Systems
• Reliably perform intended function
• Contain no unintended function
• Implemented with redundancy
• Contain fault detection
• Robust design
• Robust code
Attributes of Safety Critical Process
• Complete and correct requirements
• Design standards are applied
• Coding standards are applied
• Bi-directional traceability
• Requirements-based testing
• Robustness verification
• Coverage analysis
• Safety analysis: Failure Modes and Effects Analysis (FMEA)
Safety-Critical Model-Based Design Workflow and Activities

Workflow: requirements → controller design (simulate prototype to validate requirements; simulate to verify design) → generated source code → executable object code → hardware.

Requirements Process
• Goals: requirements document; plant model (complete, correct); test cases; safety analysis
• Tools: DOORS, TCE, MS Word, etc.; MATLAB®; Simulink®; Stateflow®; Simulink Report Generator

Design Process
• Controller design: correct, robust, traceable, conforms to standards, FMEA
• Tools: MATLAB; Simulink/Stateflow; Simulink Verification and Validation; Simulink Design Verifier; SystemTest; Simulink Report Generator

Coding Process
• Generated code: correct, robust, traceable, conforms to standards
• Tools: Real-Time Workshop® Embedded Coder; PolySpace Verifier; Simulink Report Generator

Integration Process
• Compiled code: correct, robust, coverage analysis
• Activities: software-software integration; hardware-software integration; processor-in-the-loop
• Tools: SystemTest; Simulink Report Generator
Requirements Process for Model-Based Design
• Functional, operational, and safety requirements
  • Exist one level above the model
  • Models trace to requirements
• Requirements validation
  • Prove requirements are complete and correct
  • Simulation is a validation technique
  • Traceability can identify incomplete requirements
  • Model coverage can identify incomplete requirements
• Requirements-based test cases
  • Traceability of tests to requirements
Simulation example – controller and plant
Requirements trace example – view from DOORS to Simulink
Requirements trace example – view from Simulink to DOORS
Requirements-based test trace example – view from Simulink Signal Builder block to DOORS
Model coverage report example
Requirements Process take-aways
• Early requirements validation
  • Eliminates rework typically seen at integration on projects with poor requirements
• Early test case development
  • Validated requirements are complete and verifiable, which results in well-defined test cases
• Requirements management and traceability
  • Requirements management interfaces provide traceability for design and test cases
Design Process for Model-Based Design
• Create the design – Simulink and Stateflow
• Modular design for teams – Model Reference
• Model architecture/regression analysis – Model Dependency Viewer
• Documented design – Simulink Report Generator
• Conformance to standards
  • Design conforms to standards – Model Advisor
• Traceability
  • Design to requirements – Requirements Management Interface
Example detailed design including model reference and subsystems: a top model containing a subsystem and a reference model
Model dependency viewer
Example Model Advisor report
Design Verification for Model-Based Design
• Requirements-based test cases
  • Automated testing using SystemTest/Simulink Verification and Validation
  • Traceability using Requirements Management Interface
  • Capability to inject faults for FMEA
• Robustness testing and analysis
  • Built-in Simulink run-time diagnostics
  • Formal proofs using Simulink Design Verifier
• Coverage analysis
  • Verify structural coverage of model
  • Verify data coverage of model
SystemTest for requirements based testing
SystemTest – example report: data plotting and expected-results comparisons, with a summary of results
Signal Builder and Assertion Blocks
Model coverage report example – signal ranges
Simulink® Design Verifier – Coverage Test: the model yields generated test cases and a test report
Simulink Design Verifier – Objective Test: a model with constraints and objectives yields generated test cases and a test report
Simulink Design Verifier – Property Proving: a model with an assumption and an objective, plus the property to be proven, yields a proof report
Design Process take-aways
• Modular, reusable implementations
  • Platform-independent design and code
  • Scalable to large teams
• Consistent and compliant implementations
  • Common design language
  • Automated verification of standards compliance
• Efficient verification process
  • Develop verification procedures in parallel with design
  • Automated analysis techniques
  • Coverage analysis early in the process
Coding Process for Model-Based Design
• Incremental code generation
  • Model Reference
• Traceability
  • HTML code report
• Source code verification
  • Complies with standards using the PolySpace MISRA-C Checker
  • Accurate, consistent, and robust using PolySpace Verifier
Incrementally Generate Code
• Incremental code generation is supported via Model Reference
• When a model is changed, only that model and the models that depend on it have their code regenerated (diagram: model changed and rebuilt; dependent models rebuilt)
• Reduces application build times and ensures the stability of a project's code
• The degree of dependency checking is configurable
Add Links to Requirements
Requirements appear in the code
Compliance history of generated code
• Our MISRA-C test suite consists of several example models
• Results shown for most frequently violated rules
• MISRA-C compliance improves with each release, for example:
  • Stateflow goto statements eliminated (R2007a)
  • Compliant parentheses option available (R2006b)
  • Default case generated for switch-case statements (R2006b)
• MathWorks MISRA-C Compliance Package available upon request: http://www.mathworks.com/support/solutions/data/1-1IFP0W.html
Coding Process take-aways
• Reusable and efficient source code
• Traceability
• MISRA-C compliance
• Static verification and analysis
Integration Process for Model-Based Design
• Executable object code generation
  • ANSI/ISO C or C++ compatible compiler
  • Makefile generation capability
  • Run-time libraries provided
• Executable object code verification
  • Capability to build an interface for processor-in-the-loop (PIL) testing
  • Analyze code coverage during PIL
  • Analyze execution time during PIL
Processor-in-the-Loop (PIL) Verification – Execute Generated Code on Target Hardware
• The plant model and the algorithm (software component) live in Simulink; code generation places the algorithm on the embedded target
• Execution on host and target, non-real-time
• Communication via one of:
  • Data link, e.g. serial, CAN, TCP/IP
  • Debugger integration with MATLAB
Integration Process take-aways
Integration with multiple development environmentsEfficient processor in-the-loop test capability
Wrap-up
• Tools to support the entire safety critical development process: requirements, design, code, executable, verification
• MathWorks is participating on the SC-205/WG-71 committee, which is working on Revision C of DO-178
• See the various demos in the exhibit area