TRANSCRIPT
International Center of Excellence in Software Engineering
28 February, 2011
Transformation by Example
Marouane Kessentini
Translation Metaphor
• Different languages
– English to French?
– Vulcan to Smurf?
• Same language
– Detection → Translation
– Long sentence → split into two sentences
• Translation errors
Transformation
[Diagram: a Source Model conforms to a Source Meta-model, a Target Model conforms to a Target Meta-model, and transformation Rules map the source to the target.]
Context
[Diagram: a transformation maps a source model to a target model.]
• Exogenous transformations: SMM ≠ TMM
• Endogenous transformations: SMM = TMM
Model transformation testing
Statement
• Problem: transformation and testing activities require specific and contextual knowledge
– Not always fully available
– Difficult to express, structure, and implement
• Solution: use examples to compensate for the lack of knowledge
Challenges…
• Transforming models without specifying rules
• Detecting design defects without specifying them
• Testing transformations without
– defining expected target models
– specifying constraints
Outline
• Defining Exogenous Transformations
• Detecting Elements to Transform for Endogenous Transformations
• Testing Transformations
• Conclusion and Future work
Existing Work
• Several transformation approaches (Czarnecki and Helsen ’05)
– Graph transformation, direct manipulation, structure-driven, relational, hybrid…
• Available work based on rules (Egyed ’02)
– VIATRA (Varró et al. ’04)
– AGG (Taentzer et al. ’03)
– ATL (Jouault et al. ’05)
– …
Defining Trans. Detecting Elements Testing Trans. Conclusion
Problem
• Difficult to define/express transformation rules
– Usually 1-to-1 mappings
– Dynamic model mappings
– State-explosion problem for behavioral models
• Difficult to derive consensual rules
– Divergent expert opinions
– Need to understand both the source and target formalisms
– Easier to describe examples than consistent and complete rule sets
• Idea: Model Transformation by Example
Transformation by Example
• Limitations
– Formalism- and language-dependent
– Mostly 1-to-1 mappings
– Difficult to fully automate
– Strong hypotheses (e.g., representative samples)
– Applied only to static models
By-example approaches (columns: Exogenous transformation | Endogenous transformation | Traceability | Rules generation)
Varró et al. ’06: X X X
Wimmer et al. ’07: X X X
Sun et al. ’09: X X
Dolques et al. ’10: X X X
Overview
[Diagram: a base of examples feeds the transformation (heuristic search), which maps a source model to a target model.]
Illustration
• Transformation example [SM, TM, MB]

TM = relational schema
Table(Position).
Column(Title, Position, _).
…

SM = class diagram
Class(Position).
Attribute(Title, Position, _).
…
Association(0,1,_,n,_, Assigned, Position, Employee).
…
Generalization(Employee, Operative).

Block B32
Class(Position) : Table(Position).
Attribute(Title, Position, _) : Column(idPosition, Position, pk), Column(Title, Position, _).
…
Class(Employee) : Table(Employee).
…
Association(0,1,_,n,_, Position, Employee) : Column(idPosition, Employee, fk).
Proposal
• Transformation problem = search in an n-dimensional space
• Construct = dimension
• Solution = {<construct_i, T_j(construct_i)>}
• 1 construct = m possible transformations
• Complexity = m^n possible combinations
– e.g., 60^12 possibilities!
Heuristic search
Search-based Model Transformation
• Heuristic algorithms used
– Particle Swarm Optimization (Kennedy et al. ’95)
– Simulated Annealing (Kirkpatrick et al. ’83)
[Diagram: the PSO algorithm produces an initial solution, which the Simulated Annealing algorithm refines into the final solution by generating new solutions.]
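The combination above can be sketched with a minimal simulated-annealing loop. This is an illustrative assumption, not the thesis implementation: the solution encoding, the toy fitness function, and all parameter values (temperature, cooling rate, step count) are hypothetical.

```python
import math
import random

def simulated_annealing(n_constructs, n_blocks, fitness,
                        t0=10.0, cooling=0.95, steps=200):
    """Minimal SA sketch: a solution assigns one example block to each
    construct; neighbors reassign a single construct at random."""
    random.seed(42)
    current = [random.randrange(n_blocks) for _ in range(n_constructs)]
    cur_f = fitness(current)
    best, best_f = current[:], cur_f
    temp = t0
    for _ in range(steps):
        # Neighbor: reassign one randomly chosen construct to another block.
        neighbor = current[:]
        neighbor[random.randrange(n_constructs)] = random.randrange(n_blocks)
        nf = fitness(neighbor)
        # Accept improvements always; worse moves with Boltzmann probability.
        if nf > cur_f or random.random() < math.exp((nf - cur_f) / temp):
            current, cur_f = neighbor, nf
            if cur_f > best_f:
                best, best_f = current[:], cur_f
        temp *= cooling
    return best, best_f

# Toy stand-in for the real fitness: reward block i%5 for construct i.
toy = lambda sol: sum(1.0 for i, b in enumerate(sol) if b == i % 5)
solution, score = simulated_annealing(n_constructs=12, n_blocks=5, fitness=toy)
```

In the thesis, PSO would supply the initial solution instead of a random one; here a random start keeps the sketch self-contained.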
Solution Representation (PSO-SA)
• Transformation encoding

Construct:     1  2  3  4  5  6  7  8  9 10 11 12
Block number: 20 18  1 17 12  7 15  2  9  3  5 29

[Diagram: each construct of the source model is mapped to one of the transformation blocks in the base of examples.]
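The encoding reads off directly: position i of the vector is construct i, and the stored value is the example block applied to it. A small sketch (the `decode` helper is hypothetical):

```python
# The candidate solution shown on the slide: position i encodes construct
# i+1, and the value is the number of the example block applied to it.
solution = [20, 18, 1, 17, 12, 7, 15, 2, 9, 3, 5, 29]

def decode(solution):
    """Turn the vector into explicit <construct, block> pairs."""
    return [(i + 1, block) for i, block in enumerate(solution)]
```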
Change Operator (SA)
[Diagram: a new solution (Solution i+1) is derived from Solution i by replacing the block assigned to one construct with a block drawn from another example in the base of examples.]
Fitness Function (PSO-SA)
• A good model transformation requires
– good transformations for individual constructs
– consistency between construct transformations
– temporal-constraint preservation (for behavioral models)
• Fitness function (to maximize):

f = ∑_{j=1}^{n} (e_j + p_j + d_j + t_j)
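A minimal sketch of evaluating this fitness, assuming the per-construct terms are simply summed. The slides do not define e_j, p_j, d_j, t_j precisely, so the comments give one plausible reading; the example values come from the construct evaluated on the next slide.

```python
def fitness(scores):
    """Sum of per-construct terms: e_j (adequacy of the chosen block),
    p_j (proportion of matched predicates), d_j (consistency with the
    other construct transformations), t_j (temporal-constraint
    preservation, relevant only for behavioral models)."""
    return sum(e + p + d + t for (e, p, d, t) in scores)

# Construct 3 from the slide contributes e=1, p=5/7, d=2/2; t is set to 0
# here since the toy model is static. The second tuple is a perfect construct.
scores = [(1.0, 5 / 7, 1.0, 0.0), (1.0, 1.0, 1.0, 0.0)]
```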
Fitness Function

Solution: 20 18 1 17 12 7 15 2 9 3 5 …

Begin b1
Class(Client) : Table(Client).
Attribute(N_Client, Client, unique) : Column(n_Client, Client, pk).
…
Class(Reservation) : Table(Reservation).
Attribute(N_reservation, Reservation, unique) : Column(n_reservation, Reservation, pk).
…
Association(1,1,0,n,_, Client, Reservation) : Column(n_Client, Reservation, fk).
End b1

Construct evaluated: Association(0,1,_,n,_, Position, Employee)
e3 = 1, p3 = 5/7 = 0.71, d3 = 2/2 = 1
Evaluation
• CLD-to-RS transformation
– 12 examples (industrial projects)
• SD-to-CPN transformation
– 10 examples
• n-fold cross-validation
– Transform each example using the n−1 other examples
– Average precision
• Model transformation precision: automatic correctness (AC) and manual correctness (MC)

AC = (# constructs with a correct transformation) / (# constructs)
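The AC measure can be computed as below; the construct names and the `automatic_correctness` helper are hypothetical illustrations, not thesis code.

```python
def automatic_correctness(generated, expected):
    """AC = #constructs whose generated transformation matches the expected
    one, divided by the total #constructs. Both arguments map a construct
    to its transformation."""
    correct = sum(1 for c, t in generated.items() if expected.get(c) == t)
    return correct / len(generated)

# Hypothetical mappings: 2 of the 3 constructs are transformed correctly.
generated = {"Class(Position)": "Table(Position)",
             "Attribute(Title)": "Column(Title)",
             "Association(Assigned)": "Column(fk)"}
expected = dict(generated, **{"Association(Assigned)": "Table(Assigned)"})
```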
CLD-to-RS Results
• Precision of the 12 generated transformations

Source Model   Number of constructs   Fitness   AC   MC
SM 1 72 0.696 0.618 0.882
SM 2 83 0.714 0.682 0.928
SM 3 49 0.762 0.721 0.943
SM 4 53 0.796 0.719 0.931
SM 5 38 0.773 0.789 0.952
SM 6 47 0.746 0.652 0.918
SM 7 78 0.715 0.772 0.957
SM 8 34 0.896 0.822 0.981
SM 9 92 0.61 0.634 0.87
SM 10 28 0.892 0.908 0.969
SM 11 59 0.773 0.717 0.915
SM 12 63 0.805 0.762 0.938
Average 58 0.764 0.733 0.932
Example-size variation (CLD-to-RS)
SD-to-CPN Results
CPN Size Comparison
Size(WebSPN) Size(dMOTOE) Variation
22   13   41%
36   22   39%
39   24   38%
43   31   28%
51   36   30%
50   39   22%
56   39   30%
53   44   16%
58   52   10%
76   54   29%

Average reduction: 28.3%

Variation = (# constructs in the CPN generated by WebSPN − # constructs in the CPN generated by dMOTOE) / (# constructs in the CPN generated by WebSPN)
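The variation column follows directly from the two sizes; a quick check against the first rows of the table:

```python
def variation(size_webspn, size_dmotoe):
    """Relative reduction in CPN size obtained by dMOTOE over WebSPN."""
    return (size_webspn - size_dmotoe) / size_webspn

# First two rows of the comparison table: (Size(WebSPN), Size(dMOTOE)).
rows = [(22, 13), (36, 22)]
```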
Outline
• Defining Exogenous Transformations
• Detecting Elements to Transform for Endogenous Transformations
• Testing Transformations
• Conclusion and Future work
Endogenous Transformation
• Endogenous transformation to improve software quality
– Detecting design defects: situations that adversely affect the development of a software system
– Applying refactoring operations (transformations)
• The Blob example
– Detect “God” classes (number of methods, relations, …)
– Transformation (move methods, …)
Existing Work
• Usual approach (Moha et al. ’10, Marinescu et al. ’04, ...)
Defect definition → symptoms → detection algorithm
Problem
• Detection issues
– Need an exhaustive list of design defects
– No consensual definition of symptoms
– Difficulty automating the evaluation of symptoms
– Difficulty evaluating the risk, to guide the manual inspection of defect candidates
Endogenous Transformation by Example
• Two perspectives:
– Detection rules generation
– Deviance from perfection
Overview
[Diagram: a base of examples and quality metrics feed rules generation (harmony search), which outputs the generated rules.]
Harmony Search
• Intuition
– Music composition (improvisation)
• Algorithm
1. Generate some rules randomly
• a rule = a composition of metrics
2. Evaluate the quality of these rules
• compare the detected defects with the expected ones
3. Repeat steps 1 and 2 until a stopping criterion is met
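The three steps can be sketched as a toy harmony search over metric thresholds. This is not the thesis implementation: the metric names, example classes, quality function, and all parameters (memory size, HMCR, iteration count) are illustrative assumptions.

```python
import random

# Hypothetical class metrics and an expected defect, just to drive the loop.
classes = {"GodClass": {"NOM": 80, "LOC": 900},
           "Helper":   {"NOM": 5,  "LOC": 40}}
expected = {"GodClass"}

def detect(rule):
    """A rule is a vector of metric thresholds; a class is flagged as a
    defect when all of its metrics exceed the rule's thresholds."""
    return {c for c, m in classes.items()
            if all(m[k] > t for k, t in rule.items())}

def quality(rule):
    """Step 2: reward true positives, penalize false positives."""
    found = detect(rule)
    return len(found & expected) - len(found - expected)

def harmony_search(ranges, memory_size=10, iterations=100, hmcr=0.9, seed=1):
    rng = random.Random(seed)
    new_rule = lambda: {k: rng.uniform(lo, hi) for k, (lo, hi) in ranges.items()}
    # Step 1: random initial harmony memory.
    memory = [new_rule() for _ in range(memory_size)]
    for _ in range(iterations):
        # Improvise: take each threshold from memory (rate hmcr) or at random.
        rule = {k: (rng.choice(memory)[k] if rng.random() < hmcr
                    else rng.uniform(lo, hi))
                for k, (lo, hi) in ranges.items()}
        worst = min(memory, key=quality)
        if quality(rule) > quality(worst):
            memory[memory.index(worst)] = rule
    return max(memory, key=quality)

best = harmony_search({"NOM": (0, 100), "LOC": (0, 1000)})
```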
Validation
• Defect detection in three open-source projects: Xerces, Quick UML, and Gantt
• Validation data
– 3-fold cross-validation
– Occurrences of Blob, Spaghetti Code (SC), and Functional Decomposition (FD)
– Found manually
– Used in the rule-based detection approach DECOR (Moha et al. ’10)
• Comparison with DECOR
Systems Number of classes KLOC
Quick UML 142 19
Xerces 991 240
Gantt 471 91
Gantt Results
Comparison with DECOR
                      HS     DECOR
Precision-Gantt       87%    59%
Precision-Quick UML   86%    42%
Precision-Xerces      81%    67%
Endogenous Transformation by Example
• Two perspectives:
– Detection rules generation
– Deviance from perfection
Artificial Immune System
• Intuition:
– the biological immune system
• Negative-selection principle (Forrest et al. ’95)
– Each deviation from normal cell behaviour is considered a risk
• Deviance from perfection is a better criterion than closeness to evil when identifying risky code
Negative Selection
[Diagram: detectors cover the non-self region around the self samples; a foreign element is recognized by its affinity to a detector.]
Overview
[Diagram: the base of examples feeds negative selection, which outputs risky candidates.]
Detection With Negative Selection
[Diagram: detectors are generated from the reference code (self); risk estimation then compares the code to evaluate against these detectors to flag risky code (non-self).]
Detector Generation and Refinement
• Heuristic search using a genetic algorithm
– Initial population of detectors (artificial code)
– Evaluate the quality of the detectors
• Maximise the generality of detectors to cover non-self: LG(d_i)
• Minimise the overlap between detectors: O(d_i)
[Diagram: detectors d1–d3 cover the non-self space around self samples s1–s5.]
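A minimal negative-selection sketch, using plain generate-and-filter over random candidates rather than the genetic algorithm (with LG(d_i) and O(d_i)) described above; the 2-D metric vectors and the distance threshold are illustrative assumptions.

```python
import random

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def generate_detectors(self_set, n_detectors, threshold, seed=0):
    """Generate-and-filter negative selection: random candidate detectors
    (metric vectors standing in for artificial code) are kept only when
    they lie farther than `threshold` from every reference (self) sample."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        candidate = (rng.random(), rng.random())
        if all(euclidean(candidate, s) > threshold for s in self_set):
            detectors.append(candidate)
    return detectors

# Hypothetical 2-D metric vectors for the reference (self) code.
self_set = [(0.1, 0.1), (0.2, 0.15)]
detectors = generate_detectors(self_set, n_detectors=5, threshold=0.3)
```

The genetic algorithm of the thesis would then refine such a detector set for generality and low overlap; the filter step alone already guarantees no detector matches self.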
Risk Estimation
[Diagram: the risk of a code fragment to evaluate is estimated from its similarity distance (Risk_ei) to the generated detectors d1–d10.]
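One way to turn the similarity distance into a risk score, as a sketch; the inverse-distance formula and the detector coordinates are illustrative assumptions, not the thesis definition.

```python
def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def risk(fragment, detectors):
    """Risk of a code fragment grows as its similarity distance to the
    nearest detector shrinks (1.0 = the fragment sits on a detector)."""
    nearest = min(euclidean(fragment, d) for d in detectors)
    return 1.0 / (1.0 + nearest)

# Hypothetical detectors in the same 2-D metric space.
detectors = [(0.8, 0.9), (0.2, 0.7)]
```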
Validation
• Defect detection in two open-source projects: Xerces and Gantt
– Reference system: JHotDraw
• Validation data
– Occurrences of Blob, Spaghetti Code (SC), and Functional Decomposition (FD)
– Found manually
– Used in the rule-based detection approach DECOR (Moha et al. ’10)
• Comparison with DECOR
Systems Number of classes KLOC
Gantt 245 31
Xerces 991 240
JHotDraw 471 91
Xerces Results
Comparison with DECOR
                   AIS    DECOR
Precision-Gantt    95%    59%
Precision-Xerces   90%    67%
Outline
• Defining Exogenous Transformations
• Detecting Elements to Transform for Endogenous Transformations
• Testing Transformations
• Conclusion and Future work
Transformation Testing
• Model transformation testing
– Test-case generation (source models)
– Oracle-function definition
[Diagram: source models pass through the transformation mechanism to produce target models; the oracle function verifies the results and reports detected errors.]
Transformation Testing
• Existing oracle-function definitions
– Model comparison (Lin et al. ’05, Baudry et al. ’08, …)
• target models vs. expected models
– Specification conformance: pre- and post-conditions (Baudry et al. ’07, Giner et al. ’09, …)
• design by contract
• pattern matching
Transformation Testing
• Defining the oracle function is difficult
– Model comparison
• requires an expected target model for each test case
• graph-isomorphism problem
– Specification conformance
• large number of constraints to define
• difficult to write in practice
Approach Overview
Evaluation
• CLD-to-RS transformation
– 12 examples (industrial projects)
• SD-to-CPN transformation
– 10 examples
• n-fold cross-validation
– Average precision and recall
Precision = (# true-positive transformation errors) / (total # detected transformation errors)

Recall = (# true-positive transformation errors) / (total # transformation errors)
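The two measures in code form, for a set of reported errors against a set of seeded ones; the error identifiers are hypothetical.

```python
def precision_recall(detected, seeded):
    """Precision = true positives / all detected errors;
    Recall = true positives / all seeded (introduced) errors."""
    true_positives = len(detected & seeded)
    return true_positives / len(detected), true_positives / len(seeded)

# Hypothetical error identifiers: 3 reported, 2 of which were really seeded.
detected = {"e1", "e2", "e5"}
seeded = {"e1", "e2", "e3", "e4"}
```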
CLD-to-RS Results
Source Model   Number of elements   Number of transformation errors introduced manually   Precision   Recall
SM1 72 13 82% 93%
SM2 83 14 93% 94%
SM3 49 11 92% 100%
SM4 53 16 88% 100%
SM5 38 9 90% 100%
SM6 47 12 100% 100%
SM7 78 16 84% 95%
SM8 34 8 100% 100%
SM9 92 14 82% 93%
SM10 28 9 100% 100%
SM11 59 13 93% 100%
SM12 63 15 94% 100%
Average 58 12 91% 98%
SD-to-CPN Results

Source Model   Number of elements   Number of transformation errors introduced manually   Precision   Recall
SM1 16 14 93% 93%
SM2 18 12 94% 95%
SM3 27 11 85% 95%
SM4 28 11 88% 100%
SM5 36 8 75% 100%
SM6 36 9 100% 100%
SM7 42 17 88% 100%
SM8 49 10 91% 100%
SM9 53 14 100% 100%
SM10 57 9 100% 96%
Average 36 11 91% 97%
Errors detected
Test unit   Risk   Meta-model error   Transformation logic (rules) error
UC26   0.98   X   X
UC24   0.95   X   X
UC22   0.94   X
UC23   0.90   X
UC21   0.90   X
UC27   0.85   X
UC25   0.78
UC28   0.76
Tool
Outline
• Defining Exogenous Transformations
• Detecting Elements to Transform for Endogenous Transformations
• Testing Transformations
• Conclusion and Future work
Conclusion
• Novel “by example” solutions for
– Model transformation
• MODELS08, SoSyM Journal, ECMFA10, BMFA10, MPM10, LMO09
– Design defect detection
• ASE10, CSMR11, FASE11
– Transformation testing
• ASE Journal, CASCON10
• Validation
– Very encouraging results
– Comparison with existing approaches
Future Work
• Application to other transformation problems
– Transformation rule generation from examples
– Code generation
– Model refinement
– Model evolution
• Completing the three-step process for design defects and testing
– identification and correction steps
• Validation with larger systems
Publications
• Book Chapters and Journal Papers:
– Kessentini, M., Sahraoui, H., and Boukadoum, M. 2010. Search-Based Model Transformation by Example. Software and Systems Modeling Journal, Special Issue of MODELS08 (accepted).
– Kessentini, M., Sahraoui, H., and Boukadoum, M. 2010. Example-based Model Transformation Testing. Automated Software Engineering Journal (accepted).
– Asztalos, M., Kessentini, M., Syriani, E., and Wimmer, M. 2010. Towards Rule Composition. Electronic Communications of the EASST, Multi-Paradigm Modeling (accepted).
– Kessentini, M., Sahraoui, H., and Boukadoum, M. 2010. Maintenance, Evolution and Reengineering of Software Models by Example. In “Emerging Technologies for the Evolution and Maintenance of Software Models”, edited by Jörg Rech and Christian Bunse (under review).
Publications
• Refereed Conference Papers:
– Kessentini, M., Vaucher, S., and Sahraoui, H. 2010. Deviance from Perfection is a Better Criterion than Closeness to Evil when Identifying Risky Code. In Proceedings of the IEEE/ACM International Conference on Automated Software Engineering, ASE 2010.
– Kessentini, M., Sahraoui, H., and Boukadoum, M. 2008. Model Transformation as an Optimization Problem. In Proceedings of the 11th International Conference on Model Driven Engineering Languages and Systems, MODELS 2008.
– Kessentini, M., Sahraoui, H., Boukadoum, M., and Wimmer, M. 2011. A Music-Inspired Approach for Design Defects Detection. In Proceedings of the 15th European Conference on Software Maintenance and Reengineering, CSMR 2011.
Publications
– Kessentini, M., Bouchoucha, A., Sahraoui, H., and Boukadoum, M. 2010. Example-Based Sequence Diagrams to Colored Petri Nets Transformation Using Heuristic Search. In Proceedings of the 6th European Conference on Modelling Foundations and Applications, ECMFA 2010.
– Kessentini, M., Sahraoui, H., Boukadoum, M., and Wimmer, M. 2011. Search-based Design Defects Detection by Example. In Proceedings of the 14th International Conference on Fundamental Approaches to Software Engineering, FASE 2011.
– Kessentini, M., Sahraoui, H., and Boukadoum, M. 2009. Transformation de modèle par l’exemple : approche par méta-heuristique [Model transformation by example: a metaheuristic approach]. Actes de la 15e conférence francophone sur les Langages et Modèles à Objets, LMO 2009.
Publications
– Kessentini, M., Sahraoui, H., and Boukadoum, M. 2010. Testing Sequence Diagram to Colored Petri Nets Transformation: An Immune System Metaphor. In Proceedings of the 20th Annual International Conference on Computer Science and Software Engineering, CASCON 2010 (Best Paper Award).
– Kessentini, M., Wimmer, M., Sahraoui, H., and Boukadoum, M. 2010. Generating Transformation Rules from Examples for Behavioral Models. In Proceedings of Behavioural Modelling: Foundations and Applications, BMFA 2010 (Best Paper Award).
– Asztalos, M., Syriani, E., Kessentini, M., and Wimmer, M. 2010. Towards Rule Composition. MODELS 2010 MPM Workshop. Springer (Best Paper Award).
My long-term project
• Source : rejected paper
• Target : accepted paper
• Base of examples : good/bad quality of papers