Model Driven Performance Analysis

James Skene – [email protected]
University College London
Outline
• Requirements for the analysis method, as I see them
• Overview of the chosen model driven performance approach
  – Rationale related to the requirements
• Future work
Performance Analysis Functional Requirements
• Assuming the existence of the TAPAS platform…
• Reason about compositionality of service level agreements
• Predict application capacity
  – Over-provisioning or under-provisioning w.r.t. SLAs is a cost.
  – Targeted at ASP technologies
• Enable design-time performance prediction
  – Select architecture
Non-functional requirements
• Be usable:
  – Performance analysis is outside the usual software engineering competence.
  – Must be integrated with standard software engineering practice.
  – Minimise the cost of performance analysis.
• Be used:
  – Performance analysis is currently not performed despite its benefits.
Approach
• Mappings from analysis to design models within the Model Driven Architecture (MDA)
• Qualitatively:
  – Includes UML, so is integrated with standard software engineering practice.
  – Is tool supported, so:
    • Can integrate the technique
    • Can provide assistance with the technique
    • Can automate the technique
• Also meets the functional requirements!
The Model Driven Architecture (MDA)
• Family of specifications:
  – UML – The Unified Modelling Language
  – MOF – The Meta-Object Facility
  – CWM – The Common Warehouse Meta-model
  – Also: CORBA – The Common Object Request Broker Architecture
• Not really an architecture
• Software designs captured as UML models
PIMs and PSMs
• Problem: Technical infrastructure changes independently of business rules, but these are strongly coupled in designs.
• Solution: Decouple them.
Diagram: a Platform Specific Model (PSM) «realizes» a Platform Independent Model (PIM).
Semantic domains
• PIMs and PSMs relate to different types of thing.
• It is convenient to describe these designs using different languages.
  – E.g. EJB implementation details
• UML can describe object-oriented designs.
• UML contains extension mechanisms to provide additional interpretations for model elements.
Metamodels
Diagram: at the meta-model level, a Profile extends UML to form a Virtual Metamodel; at the model level, the PIM and PSM are expressed against it.
Profiles
• The lightweight extension mechanisms:
  – Stereotypes extend the meaning of UML model elements.
  – Tagged values associate qualities with model elements.
  – Constraints govern the form of models, enforcing domain semantics. They act at the meta-model level.
  – Profiles group stereotypes, tagged values and constraints.
• Freedom through constraint
• Opportunity for standardisation
Mappings
Diagram: a PIM is mapped to PSMs, which are mapped to source code and to analysis models; analysis results are mapped back to the PIM.
How are mappings described?
• Imperative mappings specify an algorithm.
• Declarative mappings specify pair-wise constraints.
• Declarative mappings can be captured in a profile using constraints.
Diagram: a «profile» Mapping relates the «profile» Design to the «profile» QN.
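The idea of a declarative mapping as a set of checkable pair-wise constraints can be sketched in a few lines. The following is a minimal illustration, not the NEPTUNE/LUI tooling the deck refers to: constraint names, the dictionary shapes, and the two example predicates are all hypothetical.

```python
def check_declarative_mapping(constraints, design, qn):
    """Evaluate pair-wise constraints between a design model and a QN model.

    constraints: list of (name, predicate) pairs, where each predicate
    takes (design, qn) and returns True when the constraint holds.
    Returns the names of violated constraints.
    """
    return [name for name, pred in constraints if not pred(design, qn)]


# Hypothetical pair-wise constraints between the two models.
constraints = [
    # every design-level «resource» must correspond to a QN «queue»
    ("resource-has-queue",
     lambda d, q: set(d["resources"]) <= set(q["queues"])),
    # every queue the design relies on must carry a serviceRate tagged value
    ("queue-has-serviceRate",
     lambda d, q: all(r in q["serviceRate"] for r in d["resources"])),
]

design = {"resources": ["CPU", "Disk1"]}
qn = {"queues": ["CPU", "Disk1", "Disk2"],
      "serviceRate": {"CPU": 1000, "Disk1": 0.1, "Disk2": 0.1}}

violations = check_declarative_mapping(constraints, design, qn)
```

Because the constraints are data rather than an algorithm, a partially specified mapping is simply a shorter constraint list, which is the flexibility the next slide claims for the declarative style.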
Benefits of mappings
• They can be checked, providing assistance to modellers.
• Declarative mappings only need to be partially specified.
  – The flexibility addresses the difficulty of producing feasible analysis models.
• The mappings define a semantics for the design domain, in terms of the analysis domain concepts.
• The declarative mappings provide guidance for subsequent automated mappings.
• Can capture expert modelling techniques.
Design domain: A soft-real-time profile
• Based on the ‘UML Profile for Schedulability, Performance, and Time Specification’
• Stereotypes to:
  – Identify workload classes
  – Identify resources accessed under mutual exclusion
  – Identify actions having resource demands
A soft-real-time profile 2
• Tagged values to:
  – Specify workload parameters (e.g. population, think-time, or arrival rate)
  – Specify resource demands for actions/procedures
  – Specify probabilities for choices and average numbers of iterations
• Constraints:
  – An object containing an action with a resource demand must be deployed in a context where the resource is available.
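The deployment constraint above is exactly the kind of check a profile-aware tool can automate. A minimal sketch, assuming a design model flattened into three dictionaries (all names and shapes are hypothetical, not the profile's actual OCL form):

```python
def check_deployment(demands, deployment, resources_at):
    """Check that every resource demanded by an object is available at
    the node the object is deployed on.

    demands:      object name -> set of demanded resource names
    deployment:   object name -> node it is deployed on (or absent)
    resources_at: node name   -> set of resources the node offers
    Returns a list of (object, missing resource) violations.
    """
    errors = []
    for obj, needed in demands.items():
        node = deployment.get(obj)
        for res in sorted(needed):
            if node is None or res not in resources_at.get(node, set()):
                errors.append((obj, res))
    return errors


# Example modelled loosely on the deployment slide that follows.
ok = check_deployment(
    {"EmployeeBean": {"cpu", "disk1"}},
    {"EmployeeBean": "Platform"},
    {"Platform": {"cpu", "disk1", "disk2"}},
)
```

An OCL constraint attached to the profile would express the same predicate over the meta-model; the point is that violations can be reported to the modeller before any analysis is attempted.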
Example design model – sequence
Sequence diagram over :UpdateBean, :ManagerBean and :EmployeeBean, with messages 1:update(), 2:ejbCreate() and 3:ejbCreate(), annotated {repetitions = 100, demand = {cpu:10000}}, {p = 0.5, demand = {cpu:100, disk1:5}} and {demand = {cpu:100, disk1:5}} respectively.
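These tagged values are enough to derive a demand vector for the scenario. A minimal sketch of the aggregation, assuming (this is an interpretation, not stated on the slide) that each action's expected demand is its demand scaled by its repetition count and branch probability, with defaults of 1:

```python
def expected_demand(actions):
    """Aggregate per-action tagged values into an expected demand vector.

    actions: list of dicts with a 'demand' map (resource -> units) and
    optional 'repetitions' (default 1) and 'p' (default 1.0) entries.
    """
    total = {}
    for a in actions:
        weight = a.get("repetitions", 1) * a.get("p", 1.0)
        for res, d in a["demand"].items():
            total[res] = total.get(res, 0.0) + weight * d
    return total


# The three annotations from the sequence diagram above.
scenario = [
    {"repetitions": 100, "demand": {"cpu": 10000}},
    {"p": 0.5, "demand": {"cpu": 100, "disk1": 5}},
    {"demand": {"cpu": 100, "disk1": 5}},
]
demand_vector = expected_demand(scenario)
```

Whether the nested calls should themselves be multiplied by the enclosing repetition count is a modelling decision the profile's semantics would have to pin down; the sketch treats each annotation independently.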
Example design model – deployment
Deployment diagram: a Platform node «deploys» :UpdateBean, :ManagerBean and :EmployeeBean, and hosts «resource» CPU {serviceRate = 0.001s}, «resource» Disk1 {serviceRate = 0.1s} and «resource» Disk2 {serviceRate = 0.1s}.
A performance analysis domain profile
• Queuing networks
• Stereotypes:
  – Identify instances as queues, delays or populations.
• Tagged values:
  – Specify service intervals and probabilities on links.
• Constraints:
  – Ensure that the network is connected.
Example QN collaboration
Collaboration diagram: a «client» Workload {thinkTime = 5sec, population = 15} connected to «queue» CPU {serviceRate = 1000}, «queue» Disk1 {serviceRate = 0.1} and «queue» Disk2 {serviceRate = 0.1}, with link probabilities {p = 0.05}, {p = 0.05} and {p = 0.02}.
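A closed queueing network of this shape (a fixed population with think time plus a set of queues) can be solved with exact mean value analysis. The sketch below is a standard MVA implementation, not the deck's tooling; the service times and visit counts in the example call are illustrative assumptions loosely based on the diagram (CPU service time 1 ms, one CPU visit per interaction, disk visits equal to the branch probabilities), not values taken verbatim from it.

```python
def mva(N, Z, visits, service):
    """Exact mean value analysis for a closed, single-class network.

    N:       population (number of clients)
    Z:       think time in seconds
    visits:  per-queue visit counts for one interaction
    service: per-queue mean service time in seconds
    Returns (throughput, total residence time, mean queue lengths).
    """
    K = len(visits)
    Q = [0.0] * K
    X, Rtot = 0.0, 0.0
    for n in range(1, N + 1):
        # residence time per visit: service time inflated by queue length
        R = [service[k] * (1 + Q[k]) for k in range(K)]
        Rtot = sum(visits[k] * R[k] for k in range(K))
        X = n / (Z + Rtot)                           # interactive response law
        Q = [X * visits[k] * R[k] for k in range(K)] # Little's law per queue
    return X, Rtot, Q


# Illustrative parameters: CPU, Disk1, Disk2.
throughput, resp, qlens = mva(N=15, Z=5.0,
                              visits=[1.0, 0.05, 0.05],
                              service=[0.001, 0.1, 0.1])
```

With these lightly loaded parameters the think time dominates, so throughput approaches N/(Z + R) with R near the total service demand.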
Mapping from design to analysis domain
• Resources correspond to queues.
• Resource demands translate to probabilities or demand vectors.
• Much more complicated mappings will be required to capture infrastructure details (e.g. performance of containers).
Diagram: a «DesignToQN» mapping relates the «model» Design to the «model» QN.
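The first two bullets can be made concrete with a small sketch of the design-to-QN translation: per-job operation counts on each design-domain resource become a service-demand vector, and the ratio of device visits to CPU visits becomes the branch probability out of the CPU queue. The function, its dictionary shapes, and the example rates are all hypothetical illustrations of the mapping, not the actual «DesignToQN» definition.

```python
def design_to_qn(ops, rates):
    """Translate design-domain resource demands into QN parameters.

    ops:   resource -> operations issued per job (from the design model)
    rates: resource -> operations the device serves per second
    Returns (service demand in seconds per resource,
             branch probability from the CPU to each other resource).
    """
    demands = {r: ops[r] / rates[r] for r in ops}            # demand vector
    probs = {r: ops[r] / ops["cpu"] for r in ops if r != "cpu"}
    return demands, probs


# Hypothetical per-job counts and device rates.
demands, probs = design_to_qn(
    {"cpu": 100, "disk1": 5, "disk2": 2},
    {"cpu": 1000, "disk1": 50, "disk2": 50},
)
```

With these numbers the branch probabilities come out as 0.05 and 0.02, the same shape as the annotations on the example QN collaboration; container overheads would enter as extra terms in the demand vector, which is why the richer mappings are harder.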
Requirements?
Diagram: a tool-supported lifecycle relating the PIM, PSMs (EJB, MQ server, Oracle) and analysis models (QN, SPA, SPN), linked to SLAng.
Progress
• SLAng identifies relevant scenarios and technologies.
• Assembling a toolset:
  – Poseidon UML
  – MDR plug-in for NetBeans
  – LUI OCL checker, NEPTUNE project
  – Libraries and tools for performance analysis
Future work
• Define profiles
• Associate with SLAng constructs
• Create tool to automate analysis
• Integrate into single IDE
• Automate mappings