DESCRIPTION
Presentation of Model-in-the-Loop Testing for Embedded Systems.

TRANSCRIPT
Model-based Testing of Embedded Real-Time Systems in the Automotive Domain
Eng. Sc. D. candidate: Justyna Zander-Nowicka
Supervisor: Prof. Dr.-Ing. Ina Schieferdecker (TU Berlin)
Supervisor: Prof. Dr. rer. nat. Ingolf Krüger (UC San Diego)
Committee Chairman: Prof. Dr.-Ing. Clemens Gühmann (TU Berlin)
Doctoral Thesis Defense: December 19th, 2008
December 19th, 2008 – MBT of Embedded Systems – Justyna Zander-Nowicka
Motivation
Embedded systems in the automotive domain:
- The complexity of car software dramatically increases
- The functions are distributed
- Demand to shorten time-to-market
- Demand for quality assurance (safety standards IEC 61508 and ISO 26262)

Addressed characteristics:
- Hybrid: continuous signal flows and discrete events
- Real-time
- Reactive

Testing:
- Demands 40-60% (EU, 2005) of development resources
- A systematic and automatic test approach starting at the earliest model phase of software development is still missing
- Different companies use different technologies, methods, and tools
[Diagram: embedded software within an embedded system, coupled to the environment through sensors and actuators]
Outline
I. Model-based Testing (MBT)
II. Signal-feature Test Paradigm
– Test Development Process
– Signal-feature Detection for Test Evaluation
– Signal-feature Application for Test Data Generation
III. The Test System
IV. Case Study
V. Summary and Outlook
Model-Based Testing
- Model-based testing is testing in which the entire test specification is derived in whole or in part from both the system requirements and a model that describes selected functional aspects of the system under test (SUT).
- In this context, the term entire test specification covers the abstract test scenarios made concrete by the sets of test data and the expected SUT outputs. The test specification is organized in a set of test cases.
- Model-in-the-Loop for Embedded System Test (MiLEST) is proposed in this thesis.
I. Model-based Testing II. Signal-Feature Test Paradigm III. The Test System IV. Case Study V. Summary & Outlook
TEST MODEL
Test Objectives
SYSTEM MODEL Interfaces and Test Objectives
REQUIREMENTS
MBT Taxonomy
Acknowledgement: M. Utting et al. (2006)
MiLEST
Related Work: Challenges and Thesis Contributions

Open Issues:
- Test data specification: automatic, but only for structural test or state-based models; systematic for functional test, but still only manual
- Automatic test evaluation, but mainly based on the reference signal flows
- Test process established, but not efficient enough in terms of cost, effort, and reusability

Solutions:
- Automatic and systematic test data generation for functional test
- Automatic and online test evaluation based on reference signal partition
- Automation of the MBT process

Main goal: an automatically created test design executable at the model level.
- Signal-feature-based generation of test data
- Signal-feature-based test assessment by application of validation functions
- Application of automated hierarchical test patterns and transformations
Signal Feature – Definition

- A signal feature, also called a signal property, is a formal description of certain predefined attributes of a signal. It is an identifiable, descriptive property of a signal.
- A feature can be composed of or predicated by other features, logical connectives, or timing expressions.
Acknowledgement: E. Lehmann (2003)
[Figure: sampled signal f(kT) over kT, annotated with the features decrease, constant, increase, local max, step response characteristics, and time partitioning]
Proposed Test Development Process
SUT as a Model
automatic generation – step I
Test Harness Generation
manual refinement – step II
Test Specification
automatic generation – step III
Test Data & Test Control Generation
automatic execution – step IV
Verdicts Analysis
Test Harness
Signal-Feature Detection and Generation
transformation
- temporal expressions (e.g., after(5ms))
- logical connectives (e.g., and)
transformation
Detect Signal Feature
Generate Signal Feature
Increase Detection and Generation

- Detect Increase: The Increase feature can be detected by analyzing the signal's derivative. In a simplified version of the algorithm, the derivative is approximated by the backward difference between the actual signal value and the past one:

feature(kT) = sign[signal(kT) − signal((k − 1) * T)]
feature(kT) is positive if the signal increases.
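As an illustrative sketch (not the MiLEST Simulink implementation), the backward-difference rule above can be written as:

```python
def detect_increase(samples):
    """Backward-difference increase detection: for each step k,
    feature(kT) = sign(signal(kT) - signal((k-1)T)).
    Returns +1 (increase), 0 (constant) or -1 (decrease) per step."""
    feats = []
    for k in range(1, len(samples)):
        diff = samples[k] - samples[k - 1]
        feats.append((diff > 0) - (diff < 0))  # sign of the backward difference
    return feats

# A ramp up, a plateau, then a drop:
print(detect_increase([0.0, 1.0, 2.0, 2.0, 1.5]))  # [1, 1, 0, -1]
```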
- Generate Increase:
- Example:
Classification of Signal-Feature Detection Mechanisms (1)
Time-independent
Acknowledgement: A. Marrero-Pérez (2007)
Classification of Signal-Feature Detection Mechanisms (2)
Triggered
Acknowledgement: A. Marrero-Pérez (2007)
Levels of Test Design in MiLEST

Test Specification (designed manually):
- Test Harness Level
- Test Requirement Level
- Validation Function Level: IF preconditions set THEN assertions set
- Feature Detection Level

Test Data Generation (obtained automatically, by transformation from the corresponding specification level):
- Test Harness Level
- Test Requirement Level
- Test Case Level: IF preconditions set THEN generations set
- Feature Generation Level

Moving down the levels is refinement; moving up is abstraction.
Test Patterns Classification (1)

- Test Specification Patterns
  - Test Requirement Level
  - Validation Function Level
  - Signal-Feature Detection Level

Example pattern:
- Test activity: Test specification
- Test pattern name: Detect step response features
- Context: Test of an electronic control unit (ECU)
- Problem: Assessment of an ECU behavior in terms of a selected signal feature
- Solution instance: [model pattern shown on the slide]
Pedal Interpretation Example

Interpretation of accelerator pedal position:
The normalized accelerator pedal position (phi_Acc) should be interpreted as the desired driving torque (T_des_Drive). The desired driving torque is scaled in the non-negative range in such a way that the higher the given velocity (v), the lower the driving torque obtained (Conrad, 2004).

IF v=const AND phi_Acc increases THEN T_des_Drive increases
IF v=const AND phi_Acc decreases THEN T_des_Drive decreases
IF v=const AND phi_Acc=const THEN T_des_Drive=const
IF v increases AND phi_Acc=const AND T_des_Drive>=0 THEN T_des_Drive does not increase
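MiLEST expresses such requirements as validation functions in Simulink. Purely as an illustration, the four rules could be checked over sampled signals like this (the function and rule names are hypothetical, not from the thesis):

```python
def sign(x, eps=1e-9):
    """Sign with a small dead band, so near-constant counts as constant."""
    return 0 if abs(x) <= eps else (1 if x > 0 else -1)

def check_pedal_rules(v, phi_acc, t_des):
    """Check the four pedal-interpretation requirements on equally long
    sampled signals; returns (rule, step) pairs for every violation.
    A hypothetical sketch of what a validation function asserts."""
    violations = []
    for k in range(1, len(v)):
        dv = sign(v[k] - v[k - 1])
        dp = sign(phi_acc[k] - phi_acc[k - 1])
        dt = sign(t_des[k] - t_des[k - 1])
        if dv == 0 and dp == 1 and dt != 1:
            violations.append(("rule1", k))  # phi_Acc up -> torque must rise
        if dv == 0 and dp == -1 and dt != -1:
            violations.append(("rule2", k))  # phi_Acc down -> torque must fall
        if dv == 0 and dp == 0 and dt != 0:
            violations.append(("rule3", k))  # both constant -> torque constant
        if dv == 1 and dp == 0 and t_des[k] >= 0 and dt == 1:
            violations.append(("rule4", k))  # v up -> torque must not rise
    return violations

# Consistent trace: v constant, pedal ramps up, torque ramps up -> no violations
assert check_pedal_rules([50, 50, 50], [0.1, 0.2, 0.3], [10, 20, 30]) == []
```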
MiLEST Quality Metrics – an Example
Variants coverage for a SigF =
  (# of variants for a selected SigF applied in a test design) /
  (# of all possible variants for a selected SigF)

[Bar charts: variants coverage (%) for the signals v, phi_Acc, and phi_Brake – Pedal Interpretation with manual generation vs. automatic generation]
Consider the MiLEST quality metrics in combination (weight-based approach), not in isolation!
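The variants-coverage metric is a simple ratio; a minimal sketch (names are hypothetical):

```python
def variants_coverage(applied, possible):
    """Variants coverage for a signal feature (SigF), in percent:
    |variants applied in the test design| / |all possible variants|."""
    if not possible:
        raise ValueError("no variants defined for this signal feature")
    applied_known = set(applied) & set(possible)
    return 100.0 * len(applied_known) / len(possible)

# Two of four increase variants applied -> 50% coverage
print(variants_coverage({"<-10,-9>", "<-1,0>"},
                        {"<-10,-9>", "<-1,0>", "(0,7>", "<63,70>"}))
```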
MiLEST Summary

Contributions:
- Automatic and systematic test data generation for functional test based on the signal-feature concept
- Automatic and online test evaluation based mainly on the signal-feature taxonomy
- Reusable test patterns constituting a test framework
- MBT methodology and automated test process

Achieved goals:
- Systematic and consistent functional test specification
- Automation of the test specification
- Novel manner of signal assessment
- Continuous observation of the SUT
- Automation of the test process
- Abstract and concrete views (i.e., abstract libraries concretized in the test system)
- Requirements- and model-based testing
- Test execution starting at MiL level
- System model and test model in a common execution environment
Future Work

Specific to the current version of MiLEST:
- Test stimuli generation algorithms and their optimization
- Transposition of conditional rules (i.e., automation of the test specification)
- Further test patterns (e.g., incl. different numerical integration methods)
- Combination of the methods for verification or failure analysis purposes

General:
- Automation for negative testing
- More case studies (i.e., test approach scalability)
- Further domains (e.g., aerospace, railway, space, earth, or military systems)
- Further execution platforms
- Mapping to TTCN-3 es
- Further test quality metrics, such as:
  - Cost/effort needed for constructing a test data set
  - Relative number of found errors in relation to the number of test cases needed to find them
Thank you so much!
[email protected]
Backup slides.
Automotive Embedded System Test Dimensions
Related Work – Tools

Criteria (columns of the original table): Test Specification (Test Patterns Support); Transformation and Automation Facilities; Manual Test Case / Test Data Specification; Automatic Test Case / Test Data Generation; Test Evaluation; Scenarios as Driving Force; Formal Verification

Selected test methodologies, technologies, and tools (marks as on the slide):
- EmbeddedValidator: +, + (15 patterns)
- MTest with CTE/ES: +
- Reactis Tester: +, +
- Reactis Validator: +, +, –/+ (2 patterns)
- Simulink® Verification and Validation™: +, +, + (12 patterns)
- Simulink® Design Verifier™: +, +, –/+ (4 patterns)
- SystemTest™: +
- TPT: +, +
- T-VEC: +, +
- Transformations Approach (Dai, '06): +, +
- Watchdogs (Conrad, '98): +
- MiLEST: +, +, + (~50 patterns), +
MiLEST with respect to the MBT Taxonomy

Test Approach: MiLEST

Test Generation (selection criteria and technology):
- data coverage
- requirements coverage
- test case specifications
- automatic generation
- offline generation

Test Execution Options:
- MiL
- reactive

Test Evaluation (specification and technology):
- reference signal-feature-based
- requirements coverage
- test evaluation specifications
- automatic
- online evaluation
Signal Features – a Descriptive Approach

[Figure: a unit step input u1(time) and the corresponding step response q1(time)]

Step response characteristics: rise time (tr), maximum overshoot (max), settling time (ts), steady-state error (ess)
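Purely as an illustration of the four characteristics named in the caption, a rough sampled-signal estimator might look like this (thresholds and names are assumptions, not MiLEST's detection blocks):

```python
def step_response_characteristics(t, y, target, tol=0.05):
    """Estimate the characteristics from a sampled step response y(t) toward
    `target`: rise time tr (10% -> 90% of target), maximum overshoot,
    settling time ts (last time the +/-tol band around the target is
    violated), and steady-state error ess. Illustrative only."""
    t_lo = next(ti for ti, yi in zip(t, y) if yi >= 0.1 * target)
    t_hi = next(ti for ti, yi in zip(t, y) if yi >= 0.9 * target)
    tr = t_hi - t_lo
    overshoot = max(0.0, max(y) - target)
    ts = 0.0
    for ti, yi in zip(t, y):
        if abs(yi - target) > tol * abs(target):
            ts = ti  # remember the last violation of the settling band
    ess = abs(target - y[-1])
    return tr, overshoot, ts, ess
```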
Signal-Features Generation
Signal-Feature Generation for Test Data
transformation
Signal-Features Generation and Evaluation
From Signal Feature Detection to Signal Feature Generation

Signal feature detection (trigger-independent, immediately identifiable):
- Detect signal value
- Detect increase / decrease
- ...

Signal feature generation (trigger-independent, immediately identifiable):
- Any curve coming through a given value within the permitted range of values, where duration time is default
- Any increasing/decreasing function with a default/given slope or other characteristics in the permitted range of values, where duration time is default
- ...
Classification of Signal Features based on their Detection Type
(the original slide arranges these along two axes: time-independent vs. triggered, crossed with the identifiability classes below)

Immediately identifiable:
- Detect signal value
- Detect increase / decrease / constant
- Detect continuous signal / derivative
- Detect linearity (w.r.t. 1st value)
- Detect functional relation y = f(x)
- Detect causal filter
- Detect max-to-date / min-to-date
- Detect signal value @ time1
- Detect time stamp
- Detect any time-independent features over a time interval (e.g., value @ time1, value @ [time1, time2])

Identifiable with determinate delay:
- Detect max / min / inflection
- Detect peak
- Detect impulse
- Detect step
- Detect any time-independent features over a time interval (e.g., value @ time of max)

Identifiable with indeterminate delay:
- Detect duration of every single delay
- Detect step response characteristics (rise time, settling time, overshoot)
- Detect response delay
- Detect complete step
[Slide 30 note – Justyna Zander-Nowicka, 12/12/2006]
Complete Step Detection: what the preconditions of the step response do – detect a step, then trigger once the signals 'Step' and 'Step response' have stabilized.
Step detection: detects only a step, i.e., triggers directly at the step.
max-to-date: always stores the maximum value so far. For example, for the value sequence 0 1 2 3 5 6 10 9 54 6 7 3, max-to-date yields 0 1 2 3 5 6 10 10 54 54 54 54.
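The max-to-date behaviour described in the note is a running maximum; a minimal sketch:

```python
def max_to_date(samples):
    """Running maximum: each output sample is the largest input seen so far."""
    out, best = [], float("-inf")
    for x in samples:
        best = max(best, x)
        out.append(best)
    return out

# The sequence from the note:
print(max_to_date([0, 1, 2, 3, 5, 6, 10, 9, 54, 6, 7, 3]))
# [0, 1, 2, 3, 5, 6, 10, 10, 54, 54, 54, 54]
```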
Signal-Features Classification (excerpt)

Time-independent, immediately identifiable signal features (SigF):

Evaluation view: Signal value detection
Generation view: Any curve crossing the value of interest in the permitted range of values, where duration time = default. Generation information: value of interest.

Evaluation view: Basic mathematical operations (e.g., zero detection)
Generation view: Any curve described by basic mathematical operations (e.g., crossing the zero value in the permitted range of values), where duration time = default. Generation information: time of zero crossing.

Evaluation view: Increase detection
Generation view: Any ramp increasing with a default/given slope in the permitted range of values, where duration time = default. Generation information: slope, initial output, final output.

Evaluation view: Decrease detection
Generation view: Any ramp decreasing with a default/given slope in the permitted range of values, where duration time = default. Generation information: slope, initial output, final output.

Evaluation view: Constant detection
Generation view: Any constant in the permitted range of values, where duration time = default. Generation information: constant value.

Evaluation view: Signal continuity detection
Generation view: Any continuous curve in the permitted range of values, where duration time = default.
IF – THEN Rules

Logical connectives, e.g.:
IF constrained_inputs_n AND constrained_outputs_m THEN constrained_inputs_n AND constrained_outputs_m
IF constrained_inputs_n AND constrained_outputs_m THEN constrained_outputs_m
IF constrained_inputs_n THEN constrained_inputs_n AND constrained_outputs_m
IF constrained_inputs_n THEN constrained_outputs_m
IF true ^ any constraints THEN constrained_outputs_m

Alternative, i.e.:
IF A THEN B OR C OR D

Temporal expressions, e.g.:
IF A THEN during(x) B AND after(y) C
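One possible reading of the during(x)/after(y) operators over sampled signals, purely illustrative (MiLEST realizes them as model-level blocks; names and signatures here are assumptions):

```python
def after(delay, predicate, t_trigger, t, signal):
    """Reading of `after(delay) B`: once the precondition fired at t_trigger,
    B must hold from t_trigger + delay onward."""
    return all(predicate(y) for ti, y in zip(t, signal)
               if ti >= t_trigger + delay)

def during(window, predicate, t_trigger, t, signal):
    """Reading of `during(window) B`: B must hold on the whole interval
    [t_trigger, t_trigger + window]."""
    return all(predicate(y) for ti, y in zip(t, signal)
               if t_trigger <= ti <= t_trigger + window)

t = [0, 1, 2, 3, 4]
sig = [0, 0, 1, 1, 1]
print(after(2, lambda y: y == 1, 0, t, sig))   # signal is 1 from t=2 on
print(during(1, lambda y: y == 0, 0, t, sig))  # signal is 0 on [0, 1]
```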
Test Patterns Classification (2)

- Test Data Structure Pattern
  - Test Requirement Level
  - Test Case Level
  - Signal-Feature Generators
- Test Control Patterns (e.g., for reactive testing)
- Test Harness Pattern

Example patterns:

Test activity: Test data generation
Test pattern name: Generate signal feature
Context: Evaluation of a step response function is intended
Problem: Generation of an appropriate signal to stimulate an SUT
Solution instance: [model pattern shown on the slide]

Test activity: Test control specification
Test pattern name: Automatic sequencing of test cases
Context: Test of an electronic control unit
Problem: Establishing the starting point of the next test case
Solution instance: IF verdict=pass or verdict=fail or verdict=error of a test case THEN leave this test case at that time point & execute the next test case starting at that established time point
Combination of Variants

Combination techniques:
- Minimal combination
- One factor at a time
- N-wise combination
- Others: complete combination, random combination, etc.
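As a sketch of one of the listed techniques, here is a common reading of one-factor-at-a-time combination (not necessarily MiLEST's exact algorithm):

```python
def one_factor_at_a_time(factors):
    """One-factor-at-a-time combination: start from each factor's first
    (default) level, then vary one factor at a time through its remaining
    levels while the others stay at their defaults."""
    base = [levels[0] for levels in factors]
    combos = [tuple(base)]
    for i, levels in enumerate(factors):
        for level in levels[1:]:
            combo = list(base)
            combo[i] = level  # only factor i deviates from the base combination
            combos.append(tuple(combo))
    return combos

# v with 3 levels and phi_Acc with 2 levels -> 1 base + 2 + 1 = 4 combinations
print(one_factor_at_a_time([[-10, 0, 70], ["low", "high"]]))
```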
Pedal Interpretation Component

[Diagram: the System under Test with inputs velocity (v), acceleration pedal (phi_Acc), ... and output driving torque (T_des_Drive)]
Test Data Patterns Derivation

Interpretation of accelerator pedal position:
The normalized accelerator pedal position should be interpreted as the desired driving torque. The desired driving torque is scaled in the non-negative range in such a way that the higher the given velocity, the lower the driving torque obtained.

Derived test data patterns (each sketched on the slide as v and phi_Acc over time):
- Generate v=const, Generate phi_Acc increases
- Generate v=const, Generate phi_Acc decreases
- Generate v=const, Generate phi_Acc=const
- IF T_des_Drive>=0: Generate v increases, Generate phi_Acc=const
Concrete Test Data

[Sketches: phi_Acc, velocity, and phi_Brake over time, annotated with range constraints and temporal constraints]
Variants for the Increase Generation – Concrete View

- Consider the velocity of a car in <-10, 70> with the partition point 0.
- Then, using the classification tree method (Grochtmann & Grimm, 1993) and the formulas
  <pn, pn + 10% * (pn+1 – pn)> and <pn – 10% * (pn – pn-1), pn>,
  the increase variants are: <-10, -9>, <-1, 0>, (0, 7>, <63, 70>.
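The boundary formulas can be applied mechanically; a small sketch reproducing the slide's variants:

```python
def increase_variants(lower, upper, partitions, frac=0.1):
    """Boundary variants per the slide's formulas over the ordered points
    lower, partitions..., upper: for each adjacent pair (p_n, p_{n+1}), take
    <p_n, p_n + frac*(p_{n+1} - p_n)> and <p_{n+1} - frac*(p_{n+1} - p_n), p_{n+1}>."""
    pts = [lower] + list(partitions) + [upper]
    variants = []
    for i in range(len(pts) - 1):
        a, b = pts[i], pts[i + 1]
        variants.append((a, a + frac * (b - a)))  # lower edge of the interval
        variants.append((b - frac * (b - a), b))  # upper edge of the interval
    return variants

# Velocity range <-10, 70> with partition point 0:
print(increase_variants(-10, 70, [0]))
# approximately [(-10, -9), (-1, 0), (0, 7), (63, 70)], as on the slide
```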
Concrete Test Data Variants

Variant values for the SUT inputs:
- v: v1 = {-10}, v2 = {-5}, v3 = {0}, v4 = {35}, v5 = {70}
- phi_Acc: phi_Acc1 = [0, 10], phi_Acc2 = [90, 100]

One-factor-at-a-time combination over the time steps t0..t5 yields six iterations.

[Plots: phi_Acc and v over time (0-12 s), stepping through phi_Acc1, phi_Acc2 and v1-v5 across iterations 1-6]
Set of Test Cases Sequenced in Test Suites
Test Cases Sequenced in Test Suite
MiLEST Test Quality Metrics

Test data related:
- Signal range consistency
- Constraint correctness
- Variants coverage for a SigF
- Variants coverage during test execution
- Variants-related preconditions coverage
- Variants-related assertions coverage
- SUT output variants coverage
- Minimal combination coverage

Test specification related:
- Test requirements coverage
- VFs activation coverage
- VF specification quality
- Preconditions coverage
- Effective assertions coverage

Test control related:
- Test cases coverage

Others:
- Service activation coverage
- System model coverage
- Cost/effort needed for constructing a test data set
- Relative number of found errors in relation to the number of test cases needed to find them
- Coverage of signal variants combinations – CTCmax, CTCmin
Summary and Future Work

Three types of case studies:
- component level test
- component-in-the-loop level test
- integration level test

[Diagram: the signal-feature approach relating abstraction, test specification, test quality, automation, and test evaluation / test oracle / arbitration]
Example: Ariane 5

- Ariane 5 Flight 501 on 4 June 1996 failed
- Weight: 740 t; payload: cluster satellites
- Rocket self-destructed 37 seconds after launch because of a malfunction in the control software
- Most expensive computer bug in history: $370 million

Causes:
- Reused software from Ariane 4
- Data conversion from 64-bit float to 16-bit signed integer: overflow not caught
- ADA software with 2 channels (redundancy), but identical implementation!
- 1st channel had the same problem 72 ms before
- Software handler got exceptions from both channels; no plan B for such situations
- Main computer misinterpreted the horizontal velocity and sent a strange control command
- Self-destruction due to safety measures

ADA code of the 2nd channel (horizontal velocity > 32786.0 internal units; unclassified exception caught, control transferred to the 1st channel):

    ...
    declare
      vertical_veloc_sensor: float;
      horizontal_veloc_sensor: float;
      vertical_veloc_bias: integer;
      horizontal_veloc_bias: integer;
      ...
    begin
      declare
        pragma suppress(numeric_error, horizontal_veloc_bias);
      begin
        sensor_get(vertical_veloc_sensor);
        sensor_get(horizontal_veloc_sensor);
        vertical_veloc_bias := integer(vertical_veloc_sensor);
        horizontal_veloc_bias := integer(horizontal_veloc_sensor);
        ...
      exception
        when numeric_error => calculate_vertical_veloc();
        when others => use_irs1();
      end;
    end irs2;

* source: http://www-aix.gsi.de/~giese/swr/ariane5.html (retrieved 2008)
MiLEST Realization
MiLEST – Model-in-the-Loop for Embedded System Test: a Simulink® add-on built on top of the MATLAB® engine.
Integration of MiLEST in the Automotive-specific V-Modell®
Acknowledgement: J. Großmann et al. (2008)
Model- and Requirement-based Testing

[Diagram: REQUIREMENTS with Test Objectives; the SYSTEM MODEL (interfaces and test objectives) transformed into the SYSTEM IMPLEMENTATION; the TEST MODEL transformed into the TEST IMPLEMENTATION; both share a common Execution Environment]

- What is the role of a system model?
- What is the role of a test model?
- Is it possible to use a common language for both system and test specifications?
- How can discrete and continuous signals be handled at the same time?
- How should a test framework be realized?
- How to automate the test process?
- How to assure the quality of tests?
MiLEST Marketing

Features:
- Systematic functional test specification
- Signal-feature-oriented paradigm
- Graphical test design
- Test process automation
  - systematic and automatic test data generation
  - online automatic test evaluation
- Model-in-the-Loop test execution
- Reusable test patterns
- Abstract and concrete views

Benefits:
- Testing in early design stages
- Test of hybrid systems including temporal and logical dependencies
- Traceability of test cases to the requirements
- Traceability of verdicts to the root faults
- Increased test coverage and test completeness
- Assured quality of the tests
Discrete and Continuous Signal Interpretation in Simulink

Consider a second-order Runge-Kutta numerical integration:

a1 = h * f(x(t_k), t_k)
a2 = h * f(x(t_k) + a1/2, t_k + h/2)
x(t_k+1) = x(t_k) + a2
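The second-order Runge-Kutta (midpoint) scheme referenced above can be sketched in a few lines; here it integrates dx/dt = x as a check (an illustration, not Simulink's solver implementation):

```python
def rk2_step(f, x, t, h):
    """One step of the second-order Runge-Kutta (midpoint) scheme:
    a1 = h*f(x(tk), tk);  a2 = h*f(x(tk) + a1/2, tk + h/2);
    x(tk+1) = x(tk) + a2."""
    a1 = h * f(x, t)
    a2 = h * f(x + a1 / 2.0, t + h / 2.0)
    return x + a2

# Integrate dx/dt = x with x(0) = 1 over [0, 1] in 1000 fixed steps;
# the result approximates e = 2.71828...
h, x = 0.001, 1.0
for k in range(1000):
    x = rk2_step(lambda xv, tv: xv, x, k * h, h)
print(x)
```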