TRANSCRIPT
THE ETSI TEST DESCRIPTION LANGUAGE: Introducing MBT to Standardization
Presented by Andreas Ulrich, Siemens AG for ICTSS 2017, St. Petersburg, Russia
© ETSI 2017. All rights reserved
About ETSI, MTS, CTI
European Telecommunications Standards Institute
• Develops standards and test specifications for ICT
• Incl. fixed, mobile, radio, aeronautical, broadcast and internet technologies
• Ensures and assesses interoperability
ETSI’s activities in testing
• Technical Committee on Methods for Testing and Specification (MTS)
• Responsible for standardizing test and specification methods and languages, guidelines, frameworks
• Testing and Test Control Notation version 3 (TTCN‐3), Test Description Language (TDL)
• Centre for Testing and Interoperability (CTI)
• Evaluates test specification technologies
• Provides hands‐on support and assistance to TC and projects
• Interoperability events (Plugtests Service)
Challenges in Validating Complex Systems
Characterization
• Complex design → system of systems
• Complex behavior → real-time, concurrency
• Complex data → big data
[Image: Intelligent Transport Systems © ETSI]
Adequate validation & testing methods needed
• Proper modeling techniques
• Proper test automation
• Proper fault analysis
Trend in Software Development Practices
Ever shorter product release cycles lead to a merge of system development and system operation → DevOps
Continuous feedback on quality: Shift Left / Shift Right
Design → Develop → Integrate → Test → Release → Deploy → Operate
Adaptive Test Architecture
Release-in-production frequency: 1 yr, 1 mo, 1 wk, 1 dy, 1 hr, 1 min, 1 s
Support of agile approaches and scenario-based testing
Black‐Box Testing Using Scenarios
Cem Kaner on Scenario Testing, STQE Magazine, Sep./Oct. 2003
• The scenario is a story about someone trying to accomplish something with the product under test.
• Scenarios are useful to connect to documented software requirements, especially requirements modeled with use cases.
• A scenario test provides an end‐to‐end check on a benefit that the program is supposed to deliver.
Scenarios are used to systematically test for the correct implementation of requirements in the system.
Model‐Based Testing
Can MBT be used for the generation of conformance test suites?
Evaluated tools
• Microsoft Spec Explorer
• Conformiq Designer
• Sepp.med MBTsuite
• Fraunhofer FOKUS MDTester
YES, but…
• Model and test generation tool become parameters of the generated test suite
• Models and generated tests differ between tools and are not comparable
→ Which model and tool are best?
UML and Testing
OMG UML Testing Profile
• Extension of UML to support (model‐based) testing
• Wide-scope modelling notations inherited from UML
• Might still not capture all needs; further profiles needed, e.g. MARTE
• Semantic fuzz of UML leads to different interpretations
Business Value
Support to test design before rushing to test code
• Better quality of tests
• Higher productivity in testing
Common test concepts
• Enable development of specialized tools for
• Test documentation and visualization
• Generation of executable test code using different strategies
• Consistency verification and further analysis of test descriptions
• Harmonize test descriptions coming from different tool sources
Support for customized syntax
• Graphical,
• Textual,
• Tabular representation of tests
TDL Usage
[Diagram: TDL usage in an MBT workflow and a manual workflow]
• Requirements level: specification
• Modeling / TP level: MBT model (MBT workflow) or TDL-TO (manual workflow), fed into a test generator
• The test generator produces TDL high-level test cases
• Adapters for TTCN-3 or another execution language produce low-level test cases
• Execution level: executable tests
Ingredients of a Software Language
Abstract Syntax → meta-model
Concrete Syntax → BNF grammar
Semantics → static and dynamic semantics
Standardization in TDL
Abstract Syntax → standardized, incl. extension capabilities
Concrete Syntax → standardized graphical and exchange syntax
Semantics → standardized (formal static, informal dynamic semantics)
The TDL Standard Series ETSI ES 203 119
TDL‐MM, part 1: Abstract Syntax and Associated Semantics
TDL‐GR, part 2: Graphical Syntax
TDL‐XF, part 3: Exchange Format
TDL‐TO, part 4: Structured Test Objective Language
TDL‐UML, part 5: UML Profile for TDL (upcoming)
Published documents: https://tdl.etsi.org/ and https://www.etsi.org/
TDL is Adjustable by User
Concrete syntax may cover only parts of the meta‐model
Meta‐model can be extended by a user if need arises
User extensions of the meta‐model can be subjected to further TDL standardization and maintenance
[Diagram: a user-defined TDL concrete syntax partially implements the TDL meta-model with user extensions, which extends the standard meta-model]
© ETSI 2013. 1st User Conference on Advanced Automated Testing, Paris, France, 22–24 Oct 2013
Key Concepts of a TDL Specification
Test configuration
• Set of interacting components in the roles Tester or SUT
Test description
• Represents the expected foreseeable (passing) behavior, i.e. any deviation is a fail
• Expresses a test in terms of interactions of data exchanged between tester and SUT components
• Interactions are totally ordered, i.e. they are implicitly synchronized among components
Test data
• Represented as abstract name tuples
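The three key concepts above can be sketched as plain data structures. This is a minimal, illustrative Python sketch; all class and field names are hypothetical and do not mirror the standardized TDL meta-model.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of TDL key concepts, not the normative meta-model.

@dataclass(frozen=True)
class Component:
    name: str
    role: str  # "Tester" or "SUT"

@dataclass(frozen=True)
class DataInstance:
    name: str         # test data as an abstract name ...
    args: tuple = ()  # ... with an optional tuple of arguments

@dataclass
class Interaction:
    source: Component
    target: Component
    data: DataInstance

@dataclass
class TestDescription:
    # The behavior list is a total order of interactions, i.e. they are
    # implicitly synchronized among components.
    name: str
    behavior: list = field(default_factory=list)

    def verdict(self, observed) -> str:
        # Any deviation from the expected (passing) behavior is a fail.
        return "pass" if observed == self.behavior else "fail"

# A test configuration wires components in the roles Tester and SUT.
tester = Component("ts", "Tester")
sut = Component("op", "SUT")
td = TestDescription("TD_STATUS")
td.behavior.append(Interaction(tester, sut, DataInstance("ATPStatus", (516,))))
```

Note how the pass/fail semantics falls out of the total order: an observed trace passes only if it matches the expected interaction sequence exactly.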
TDL Meta‐Model Elements
• Base elements, concept of time, test configuration
• Simple data, structured data, data use
• Test description, atomic behavior, combined behavior
• Test objective language extension (Part 4)
Example – Test Configuration
Gate type – restricts communication to data of listed types;
Component types;
Test configuration – wiring of components of role Tester / SUT
Example – Test Data
Data type definitions;
Data instance definitions (values);
Function declarations
Example – Test Description, GR
[Diagram annotations: test configuration being used, time, interaction, combined behavior, condition, time constraint]
Example – Test Description, TDLan
Same test description expressed in textual format
Interactions for Data Exchange
An interaction is atomic and instantaneous
A common concept for three types of communication
• Message‐based communication
• The data represents a message being sent and received via gates
• Procedure‐based communication
• The data represents a (remote) function call being invoked or its return values
• Shared variable access
• The data represents a shared variable being read or written
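The unifying idea is that one atomic interaction shape covers all three communication styles, differing only in how the exchanged data is interpreted. A minimal Python sketch of this common concept (all names hypothetical, not TDL syntax):

```python
from dataclasses import dataclass

# One interaction shape for all three communication styles; the "kind"
# field only records how the data tuple should be interpreted.

@dataclass(frozen=True)
class Interaction:
    kind: str    # "message" | "call" | "shared_var"
    source: str  # sending / calling / writing gate
    target: str  # receiving / called / read gate
    data: tuple  # the exchanged data

def message(src, tgt, msg):
    # Data represents a message sent and received via gates.
    return Interaction("message", src, tgt, (msg,))

def remote_call(src, tgt, func, *args):
    # Data represents a (remote) function call being invoked.
    return Interaction("call", src, tgt, (func, args))

def shared_access(src, tgt, var, value):
    # Data represents a shared variable being read or written.
    return Interaction("shared_var", src, tgt, (var, value))
```

Because every variant is the same atomic, instantaneous record, later tooling (ordering, rendering, code generation) does not need to distinguish the three cases.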
Example – TDL Test Objective Language
Test objective specification (requirement to be tested)
Support for Behavior‐Driven Development
Prose syntax
Defined in TDL‐TO, part 4
GIVEN
WHEN
THEN
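To make the Given/When/Then pattern concrete, here is an illustrative structured test objective in that style. The wording is invented for illustration (it borrows the gate and message names from the syntax example later in the deck and is not normative TDL-TO syntax):

```text
GIVEN the tester and the SUT are connected via ts_gate and op_gate
WHEN  the tester sends ATPStatus(516) via ts_gate to the SUT
THEN  the SUT accepts the status message on op_gate
```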
“Typical” Efforts in Test Automation
Specification (20%) (textual, MSC-like)
Scripting (40%) (TTCN-3, Java)
Execution (40%)
Figures are from an ICT project at Siemens with a test team comprising about 24 members. Execution consists of host tests, tests on target HW and regression tests. Additional efforts for test env. / test case build and tool support exist.
Abstract vs Concrete Tests
[Diagram: TDL test descriptions / test model → test derivation / test specification → abstract test cases → test code generation / test scripting (the hard part!) → concrete test cases → executed against the SUT]
TDL Tool Example – Generated Textual Editor (EMFText)
[Screenshot of a (partial) TDL concrete syntax in the generated editor; annotations: interrupting interactions, sequence of interactions, sequence of blocks]
TDL Tool Example – Concrete Syntax Variants
Domain-specific test description languages can be designed by editing the syntax rules to specific needs
Interaction {ATPStatus(516) source ts_gate target op_gate}
Message: ts_gate -> op_gate ATPStatus(516) ;
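The idea behind the two variants above is that one abstract model element can be rendered by interchangeable concrete-syntax rules. A small Python sketch of that separation; the rendering functions are hypothetical, only the two output forms come from the slide:

```python
# Two interchangeable concrete-syntax renderings of one abstract interaction.

def render_keyword_style(data, args, source, target):
    # Keyword-based concrete syntax, as in the first variant above.
    return f"Interaction {{{data}({args}) source {source} target {target}}}"

def render_arrow_style(data, args, source, target):
    # Arrow-based concrete syntax, as in the second variant above.
    return f"Message: {source} -> {target} {data}({args}) ;"

# The abstract interaction stays the same; only the rendering rule changes.
abstract = dict(data="ATPStatus", args=516, source="ts_gate", target="op_gate")
print(render_keyword_style(**abstract))
# Interaction {ATPStatus(516) source ts_gate target op_gate}
print(render_arrow_style(**abstract))
# Message: ts_gate -> op_gate ATPStatus(516) ;
```

Editing the syntax rules, not the model, is what lets a domain-specific notation be layered over the same TDL meta-model.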
TDL Tool Example – Model-to-Model Transformation
Mapping rules between a TDL spec and a state chart
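One simple way such a mapping can work, sketched in Python: each interaction of a totally ordered TDL behavior becomes one transition of a state chart, with fresh intermediate states between consecutive interactions. This is a hypothetical mapping rule for illustration, not the tool's actual rule set:

```python
# Hypothetical model-to-model mapping: interaction sequence -> state chart.

def tdl_to_statechart(interactions):
    """Map a totally ordered interaction sequence to (state, event, state) transitions."""
    transitions = []
    for i, (source, target, data) in enumerate(interactions):
        # Each interaction fires exactly one transition between fresh states.
        event = f"{source}->{target}: {data}"
        transitions.append((f"s{i}", event, f"s{i + 1}"))
    return transitions

behavior = [
    ("ts_gate", "op_gate", "ATPStatus(516)"),
    ("op_gate", "ts_gate", "StatusReport"),
]
chart = tdl_to_statechart(behavior)
```

Because TDL interactions are totally ordered, the resulting chart is a simple chain; branching combined behavior would need additional rules.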
A Scalable TDL‐Based Tool Architecture
[Diagram: front-end tools (UML-based textual editor incl. ES 203 119-4, UML-based graphical editor, graphical viewer & editor per ES 203 119-2) connected through the TDL exchange format (ES 203 119-3) to back-end tools (analyzer, TDL test generator, documentation generator, test code generator); artefacts (outputs) include TDL test reports, C code, TTCN-3 and test plans]
TDL Open Source Project (TOP)
Lower the barrier of entry for both users and tool vendors in getting started with using TDL
Graphical and textual editors, validation facilities
Visit https://top.etsi.org/ or https://tdl.etsi.org/top
What Was Achieved?
A new, standardized approach for the design of test descriptions and test purposes in TDL
• Graphical, textual and user‐defined syntaxes supported
• Common exchange format
TDL harmonizes and eases development of tools for scenario‐based testing
• Editors generated from TDL meta‐model
• Re‐usable back‐end tools, e.g. model analyzers, code & documentation generators
• Eclipse ecosystem enables quick and low‐cost tool development
Timeline
2011 • Very first discussions on TDL
2013 • Work on TDL design and standardization started
2015 • Launch of TDL standards
2017 • Launch of TDL Open Source Project • First applications
TDL @ UCAAT
ETSI User Conference on Advanced Automated Testing (UCAAT)
• https://ucaat.etsi.org/
• TDL tutorials and applications
What Comes Next?
More applications!
Making TDL executable
• Work on (partial) language mapping from TDL to TTCN‐3 has started at ETSI MTS
• No general solution, but fault‐model dependent
→ Can the different test generation strategies be classified?
• Better understanding of abstraction, underspecification, fault detection, i.e. what features can be omitted from a TDL spec such that the outcome is still predictable?
• Test models in TDL
Acknowledgements
The author is grateful for contributions from
• Philip Makedonski (U Göttingen)
• Martti Käärik (Elvior)
Thanks to the TDL Team @ ETSI MTS
• Michele Carignani, Anthony Wiles (ETSI)
• Gusztav Adamis, György Rethy (Ericsson)
• Finn Kristoffersen (Cinderella ApS)
• Marc‐Florian Wendland (FhG FOKUS)
• Xavier Zeitoun (CEA List)
Contact Details:
Andreas Ulrich
Siemens AG, Corporate Technology
Munich, Germany
Thank you for your attention!
Resources
TDL website, https://tdl.etsi.org/
• Latest standards, documents
• Bug reporting
• Mailing list, [email protected]
• TDL Open Source Project (TOP)
TOP is officially launched at UCAAT 2017
• The 5th ETSI User Conference on Advanced Automated Testing, Berlin, 11‐13 Oct. 2017
• https://top.etsi.org/