2010 01 lecture SIG UM MFES 2 - Patterns, metrics, quality
TRANSCRIPT
Software Analysis and Testing, MFES Universidade do Minho by Joost Visser, Software Improvement Group © 2010.
Structure of the lecture
Analysis
Static Analysis
Dynamic Analysis
(patterns, metrics, models, testing)
PATTERNS
Patterns
Coding style and coding standards
• E.g. layout, identifiers, method length, …
Secure coding guidelines
• E.g. SQL injection, stack trace visibility
Bug patterns
• E.g. null pointer dereferencing, bounds checking
Code smells
• E.g. “god class”, “greedy class”, …
Patterns: Style and standards
Checking coding style and coding standards
• Layout rules (boring)
• Identifier conventions
• Length of methods
• Depth of conditionals
Aim
• Consistency across different developers
• Ensure maintainability
Tools
• E.g. CheckStyle, PMD, …
• Integrated into IDE, into nightly build
• Can be customized
Patterns: Secure coding
Checking secure coding guidelines
• SQL injection attacks
• Storing and sending passwords
• Stack-trace leaking
• Cross-site scripting
Aim
• Ensure security
• Security = Confidentiality + Integrity + Availability
Tools
• E.g. Fortify, Coverity
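The SQL injection pattern named above can be made concrete in a few lines. The following is a minimal Python/sqlite3 sketch (table, data, and function names are invented for the example); the tools listed detect the analogous defect in real codebases:

```python
import sqlite3

def find_user_unsafe(conn, name):
    # Vulnerable: user input is spliced into the SQL text, so an input like
    # "x' OR '1'='1" changes the meaning of the query (SQL injection).
    return conn.execute(
        "SELECT id FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(conn, name):
    # Safe: a parameterized query keeps the input as data, never as SQL.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (name,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, "alice"), (2, "bob")])

malicious = "x' OR '1'='1"
print(len(find_user_unsafe(conn, malicious)))  # 2: returns every user
print(len(find_user_safe(conn, malicious)))    # 0: no user has that name
```

The parameterized form is what secure-coding checkers push developers towards: the input can no longer change the structure of the query.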
Patterns: Bugs
Detecting bug patterns
• Null-dereferencing
• Lack of array bounds checking
• Buffer overflow
Aim
• Correctness
• Compensate for weak type checks
Tools
• E.g. FindBugs
• Esp. for C, C++
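The null-dereference pattern can be illustrated briefly. This is a hedged sketch in Python, where None plays the role of a null pointer; the tools above flag the analogous defect in Java or C, and the functions below are invented for the example:

```python
def first_line_unsafe(text):
    # Bug pattern: `text` may be None, and calling a method on None raises
    # AttributeError, the Python analogue of a null-pointer dereference.
    return text.splitlines()[0]

def first_line_safe(text):
    # Guarded version: handle the None and empty cases explicitly,
    # which also avoids the missing bounds check on [0].
    if not text:
        return ""
    return text.splitlines()[0]

print(first_line_safe(None))    # "" instead of a crash
print(first_line_safe("a\nb"))  # "a"
```

Static checkers find such defects by tracing which values can be null at each dereference site, without running the program.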
Patterns: Exercises
Run PMD / Checkstyle / FindBugs
• E.g. on a project of your own
• E.g. on some (easy-to-compile) open source project
Inspect results
• False or true positives?
Structure of the lecture
Analysis
Static Analysis
Dynamic Analysis
(patterns, metrics, models, testing)
METRICS & QUALITY
Software analysis: What?
(Word cloud around “Quality”: performance, complexity, defects, reliability, security, correctness, size, adaptability, usability)
The Bermuda Triangle of software quality
(Diagram: the corners are Process (organizational), Project (individual), and People (individual), with the Product in the middle. Standards and certifications cover the corners: CMMI (SCAMPI), Prince2, ITIL, SAS 70, Six Sigma, ISO 20000, DSDM, TickIT, ISO 9001:2000, TMap, ISTQB, RUP (IBM), PMI, COBIT; security: ISO 17799, ISO 27001, BS 7799; vendor certifications: Siebel (Oracle), J2EE (IBM), MCP (Microsoft).)
Software Quality: Process
Capability Maturity Model® Integration (CMMI®)
• “… is a process improvement approach that provides organizations with the essential elements of effective processes.” (SEI)
• CMMI for Development (CMMI-DEV), Version 1.2, August 2006
• Consists of 22 process areas with capability or maturity levels
• Created and maintained by a team consisting of members from industry, government, and the Software Engineering Institute (SEI)
• http://www.sei.cmu.edu/cmmi

The Standard CMMI Appraisal Method for Process Improvement (SCAMPI)
• “… is the official SEI method to provide benchmark-quality ratings relative to CMMI models.”
Software Quality: Process
http://sas.sei.cmu.edu/pars/
Software Quality: Process
Levels
• L1: Initial
• L2: Managed
• L3: Defined
• L4: Quantitatively Managed
• L5: Optimizing

http://www.cmmi.de (browser)

Process Areas
• Causal Analysis and Resolution
• Configuration Management
• Decision Analysis and Resolution
• Integrated Project Management
• Measurement and Analysis
• Organizational Innovation and Deployment
• Organizational Process Definition
• Organizational Process Focus
• Organizational Process Performance
• Organizational Training
• Product Integration
• Project Monitoring and Control
• Project Planning
• Process and Product Quality Assurance
• Quantitative Project Management
• Requirements Development
• Requirements Management
• Risk Management
• Supplier Agreement Management
• Technical Solution
• Validation
• Verification
The Bermuda Triangle of software quality
(Same diagram as before: Process, Project, People, and Product with the surrounding standards; the Product corner is now covered by ISO 9126 and ISO 14598.)
But …
What is software quality?
What are the technical and functional aspects of quality?
How can technical and functional quality be measured?
Software product quality standards
ISO/IEC 9126 (Software engineering -- Product quality)
1. Quality model
2. External metrics
3. Internal metrics
4. Quality in use metrics

ISO/IEC 14598 (Information technology -- Software product evaluation)
1. General overview
2. Planning and management
3. Process for developers
4. Process for acquirers
5. Process for evaluators
6. Documentation of evaluation modules
ISO/IEC 9126, Part 1: Quality perspectives
(Diagram: the phases build → test → deploy correspond to three quality perspectives: internal quality of the software product as built, measured by the metrics of 9126 Part 3; external quality of the product under test, Part 2; and quality in use, the effect of the software product after deployment, Part 4.)
ISO/IEC 9126, Part 1: Product quality model (internal and external)
ISO/IEC 9126 Internal/External Quality
• functionality: suitability, accuracy, interoperability, security
• reliability: maturity, fault tolerance, recoverability
• usability: understandability, learnability, operability, attractiveness
• efficiency: time behavior, resource utilisation
• maintainability: analysability, changeability, stability, testability
• portability: adaptability, installability, co-existence, replaceability
ISO 9126, Part 1: Maintainability (= evolvability)
(Maintenance cycle: Analyze → Change → Stabilize → Test)
Maintainability =
• Analyzability: easy to understand where and how to modify?
• Changeability: easy to perform a modification?
• Stability: easy to keep coherent when modifying?
• Testability: easy to test after modification?
ISO 9126, Part 1: Reliability
(Degrees of failure: Prevent → Tolerate → Recover)
Reliability =
• Maturity: how much has been done to prevent failures?
• Fault tolerance: when a failure occurs, is it fatal?
• Recoverability: when a fatal failure occurs, how much effort to restart?
ISO/IEC 9126, Part 1: Product quality model (quality in use)
ISO/IEC 9126 Quality in Use: effectiveness, productivity, satisfaction, safety
ISO 9126, Parts 2 and 3: Metrics
External metrics, e.g.:
• Changeability: “change implementation elapse time”, the time between diagnosis and correction
• Testability: “re-test efficiency”, the time between correction and conclusion of testing

Internal metrics, e.g.:
• Analysability: “activity recording”, the ratio between the actual and required number of logged data items
• Changeability: “change impact”, the number of modifications and the problems introduced by them
Critique
• Not pure product measures, rather product in its environment
• Measure after the fact
• No clear distinction between functional and technical quality
The issue
• Companies innovate and change
• Software systems need to adapt at the same pace as the business changes
• Software systems that do not adapt lose their value
• The technical quality of software systems is a key element
(Diagram: Clients, Business, IT)
Functional vs technical quality
(Diagram: functional quality vs. technical quality; low technical quality brings high cost & risk of change, high technical quality brings low cost & risk.)
Software with high technical quality can evolve with low cost and risk to keep meeting functional and non-functional requirements.
ISO/IEC 9126, Part 1: Product quality model (technical quality)
ISO/IEC 9126 Software Product Quality
• functionality: suitability, accuracy, interoperability, security
• reliability: maturity, fault tolerance, recoverability
• usability: understandability, learnability, operability, attractiveness
• efficiency: time behavior, resource utilisation
• maintainability: analysability, changeability, stability, testability
• portability: adaptability, installability, co-existence, replaceability
So …
What is software quality?
What are the functional and technical aspects of quality?
How can technical quality be measured?
A Challenge
Use source code metrics to measure technical quality?
Plenty of metrics defined in the literature
• LOC, cyclomatic complexity, fan in/out, coupling, cohesion, …
• Halstead, Chidamber-Kemerer, Shepperd, …
Plenty of tools available
• Variations on Lint, PMD, FindBugs, …
• Coverity, FxCop, Fortify, QA-C, Understand, …
• Integrated into IDEs
But:
• Do they measure the technical quality of a system?
Source code metrics: Lines of code (LOC)
• Easy! Or …
• SLOC = Source Lines of Code
• Physical (≈ newlines)
• Logical (≈ statements)
• Blank lines, comment lines, lines with only “}”• Generated versus manually written
• Measure effort / productivity: specific to programming language
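The physical/logical distinction can be made concrete with a small counter. A rough sketch for Python-style source; the logical count naively splits on “;”, a deliberate simplification:

```python
def sloc(source):
    """Count (physical, logical) source lines. Blank and comment-only lines
    are excluded; the 'logical' count approximates statements by splitting
    on ';', a simplification of this sketch."""
    physical = 0
    logical = 0
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # blank line or comment-only line
        physical += 1
        logical += len([s for s in stripped.split(";") if s.strip()])
    return physical, logical

code = """# a comment
x = 1

y = 2; z = 3
"""
print(sloc(code))  # (2, 3): two physical code lines, three statements
```

Real SLOC counters additionally handle continuation lines, block comments, generated code, and per-language statement rules, which is exactly where the “easy” metric stops being easy.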
Source code metrics: Function Point Analysis (FPA)
• A.J. Albrecht, IBM, 1979
• Objective measure of functional size
• Counted manually
• IFPUG, Nesma, Cocomo
• Large error margins
• Backfiring
• Per language correlated with LOC
• SPR, QSM
• Problematic, but popular for estimation
Source code metrics: Cyclomatic complexity
• T. McCabe, IEEE Trans. on Software Engineering, 1976
• Accepted in the software community
• Number of independent, non-circular paths per method
• Intuitive: the number of decisions made in a method
• 1 + the number of if statements (and while, for, …)
(Diagram: control-flow graph containing two if nodes and one while node)
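The “1 + number of decisions” definition is easy to compute mechanically. A minimal sketch using Python's ast module, counting only the if/while/for decisions named on the slide (boolean operators, try/except, etc. are ignored here):

```python
import ast

def cyclomatic_complexity(source):
    """McCabe complexity of a piece of code: 1 + the number of decision
    points. This sketch counts only if/while/for statements."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, (ast.If, ast.While, ast.For))
                    for node in ast.walk(tree))
    return 1 + decisions

example = """
def f(xs):
    for x in xs:        # decision 1
        if x > 0:       # decision 2
            if x % 2:   # decision 3
                print(x)
"""
print(cyclomatic_complexity(example))  # 4
```

Production tools count per method and also include branch points such as boolean operators and case labels, but the idea is the same.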
Code duplication: Definition
Code duplication measurement
Fragment 1 (lines 0–8): abc, def, ghi, jkl, mno, pqr, stu, vwx, yz
Fragment 2 (lines 34–42): xxxxx, def, ghi, jkl, mno, pqr, stu, vwx, xxxxxx
The 7-line block def … vwx occurs in both fragments.
Number of duplicated lines: 14
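The example count can be reproduced by marking every line covered by a repeated block of consecutive lines. A sketch; the minimum block length of 6 is an assumption of this sketch:

```python
def duplicated_lines(fragments, block=6):
    """Count lines that fall inside a block of at least `block` consecutive
    lines occurring more than once. `fragments` is a list of lists of
    source lines (already normalized for layout differences)."""
    # Index every window of `block` consecutive lines by its contents.
    seen = {}
    for f, lines in enumerate(fragments):
        for i in range(len(lines) - block + 1):
            key = tuple(lines[i:i + block])
            seen.setdefault(key, []).append((f, i))
    # Mark every line covered by a window that occurs more than once.
    marked = set()
    for positions in seen.values():
        if len(positions) > 1:
            for f, i in positions:
                marked.update((f, j) for j in range(i, i + block))
    return len(marked)

# The slide's example: a 7-line block (def ... vwx) shared by both fragments.
frag1 = ["abc", "def", "ghi", "jkl", "mno", "pqr", "stu", "vwx", "yz"]
frag2 = ["xxxxx", "def", "ghi", "jkl", "mno", "pqr", "stu", "vwx", "xxxxxx"]
print(duplicated_lines([frag1, frag2]))  # 14
```

Counting marked lines rather than matching pairs keeps the metric well-defined when duplicated blocks overlap.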
Code duplication
(Diagram: duplicated blocks shared among files A, B, C, D)
Source code metrics: Coupling
• Efferent Coupling (Ce)• How many classes do I depend on?
• Afferent Coupling (Ca)• How many classes depend on me?
• Instability = Ce / (Ca + Ce) ∈ [0,1]
• Ratio of efferent versus total coupling
• 0 = very stable = hard to change
• 1 = very unstable = easy to change
Software metrics crisis: How does measurement data lead to information?
Plethora of software metrics
• Ample definitions in the literature
• Ample tools that calculate them

Measurement yields data, not information
• How to aggregate individual measurement values?
• How to map aggregated values onto quality attributes?
• How to set thresholds?
• How to act on results?
The SIG quality model handles these issues in a pragmatic way
The statistical nature of software metrics: Averaging is fundamentally flawed
Average
• Is a measure of central tendency
• Works for “symmetric” distributions, such as the normal distribution. But:
The statistical nature of software metrics: Emphasize the area of risk
Exploit asymmetry
• High-risk code is on the right
• Weigh with LOC
The statistical nature of software metrics: Go where the variation is
Observed for all:
• Systems are similar in the low percentiles; systems differ in the higher percentiles
• Interesting differences occur mostly above the 70th percentile
The statistical nature of software metrics: Go where the variation is
Similar for most source code metrics
SIG Quality Model: Quality profiles
1. Measure source code metrics per method / file / module
2. Summarize the distribution of measurement values in “Quality Profiles”
Risk category (cyclomatic complexity), with lines of code summed per category:
• Low (1–10): 70 %
• Moderate (11–20): 12 %
• High (21–50): 13 %
• Very high (> 50): 5 %
Quality profiles: Comparing systems
Aggregation by averaging is fundamentally flawed
Quality profiles, in general
Input
• type Input metric = Map item (metric, LOC)

Risk groups
• type Risk = Low | Moderate | High | VeryHigh
• risk :: metric → Risk

Output
• type ProfileAbs = Map Risk LOC
• type Profile = Map Risk Percentage

Aggregation
• profile :: Input metric → Profile
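A possible Python rendering of these signatures, using cyclomatic complexity as the metric and the risk thresholds from the quality-profile slide; the method names and numbers are invented so the output reproduces the 70/12/13/5 example profile:

```python
def risk(cc):
    """risk :: metric -> Risk, with the cyclomatic-complexity
    thresholds from the quality-profile slide."""
    if cc <= 10:
        return "Low"
    if cc <= 20:
        return "Moderate"
    if cc <= 50:
        return "High"
    return "Very high"

def profile(measurements):
    """profile :: Input metric -> Profile. `measurements` maps an item
    (here: a method) to (metric value, LOC); the result maps each risk
    category to its percentage of the total LOC."""
    absolute = {"Low": 0, "Moderate": 0, "High": 0, "Very high": 0}
    for item, (cc, loc) in measurements.items():
        absolute[risk(cc)] += loc
    total = sum(absolute.values())
    return {cat: 100.0 * loc / total for cat, loc in absolute.items()}

# Invented measurements chosen to reproduce the 70/12/13/5 example profile:
methods = {"render": (4, 700), "eval": (15, 120),
           "parse": (35, 130), "dispatch": (60, 50)}
print(profile(methods))
# {'Low': 70.0, 'Moderate': 12.0, 'High': 13.0, 'Very high': 5.0}
```

Weighting by LOC rather than counting methods is what makes the profile emphasize where the risky code actually lives.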
SIG Quality Model: How do measurements lead to ratings?
A practical model for measuring maintainabilityHeitlager, Kuipers, Visser in QUATIC 2007, IEEE Press
a. Aggregate measurements into “Quality Profiles”
b. Map measurements and quality profiles to ratings for system properties
c. Map ratings for system properties to ratings for ISO/IEC 9126 quality characteristics
d. Map to an overall rating of technical quality
(Diagram: Measurements → a. Quality Profiles → b. Property Ratings → c. Quality Ratings → d. Overall Rating; each rating is shown as a star score, e.g. ★★★☆☆)
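Step b, from profiles to property ratings, can be sketched as a threshold check. The ceilings below follow the spirit of the Heitlager/Kuipers/Visser paper but are an assumption of this sketch, not the calibrated SIG values:

```python
# Illustrative ceilings: a property rates at a level only when the relative
# LOC in each risk category stays within that level's maxima.
# (Assumed numbers, not the calibrated SIG thresholds.)
THRESHOLDS = [  # (stars, max % Moderate, max % High, max % Very high)
    (4, 25, 0, 0),
    (3, 30, 5, 0),
    (2, 40, 10, 0),
    (1, 50, 15, 5),
]

def property_rating(profile):
    """Map a quality profile (percent of LOC per risk category)
    to a star rating; 0 if even the loosest ceilings are exceeded."""
    for stars, mod, high, very_high in THRESHOLDS:
        if (profile["Moderate"] <= mod and profile["High"] <= high
                and profile["Very high"] <= very_high):
            return stars
    return 0

p = {"Low": 70.0, "Moderate": 12.0, "High": 13.0, "Very high": 5.0}
print(property_rating(p))  # 1: the 13 % of high-risk code caps the rating
```

Because every ceiling must hold at once, a single heavy tail (here the high-risk 13 %) dominates the rating, which is exactly the risk-emphasizing behavior averaging cannot provide.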
Software product certification by SIG and TÜViT
1. The system producer submits the system source code and a high-level description.
2. The evaluation body performs the evaluation and delivers an evaluation report.
3. The certification body confirms the evaluation report and issues a certificate.
The certification client receives the certificate and obtains the right to use the quality mark. (System producer and certification client can be the same organization.)
Evaluation results
Evaluation report
• Defines the scope of the evaluation
• Summarizes measurement results
• Provides ratings (properties, quality, and overall)
• May provide hints for the producer to improve ratings
Certificate
• States conformance to the SIG/TÜViT Evaluation Criteria
• Confers the right to use the quality mark “TÜViT Trusted Product Maintainability”
Further reading
• A Practical Model for Measuring Maintainability. I. Heitlager, T. Kuipers, J. Visser. QUATIC 2007.
• Certification of Technical Quality of Software. J.P. Correia, J. Visser. OpenCert 2008.
• Mapping System Properties to ISO/IEC 9126 Maintainability Characteristics. J.P. Correia, Y. Kanellopoulos, J. Visser. SQM 2009.
Software Risk Assessment service
Assignment
• “Can we scale from 100 to 100,000 customers?”
• “Should we accept delay and cost overrun, or cancel the project?”

Analysis
• Source code: understanding (reverse engineering) + evaluation (quality)
• Interviews: technical + strategic
Reporting
• Quality judgment using star ratings
• Risk analysis putting quality findings in business perspective
• Recommendations to mitigate risks
Software Risk Assessment
(Diagram: source code is fed into automated analysis and compared against a benchmark; documentation and interviews supply further facts; all facts are interpreted, reconciled, and evaluated; the results are presented and written up in a report.)
Software Risk Assessment example: stagnation before go-live
(Diagram: internal architecture with GUI, core, rule engine, templates, and several databases, annotated with rebuild value per component in man-years: 170 MY, 23 MY, 21 MY, 15 MY, 13 MY, 7 MY, 7 MY, 5 MY, 5 MY)
Internal architecture
• Technology risks
• Rebuild value
• Quality
Results
• Insurmountable stability issues, untestable, excessive maintenance burden
• Now: reduce technical complexity, partially automate deployment
• Start planning replacement
Software Monitoring service
Quality roadmap
• “Complexity from 2 to 4 stars by 3rd month” in a maintenance project
• “Final product shall be 4 stars” in a development project

Dashboard
• Regular analysis of source code, typically once per week
• Shown on a dashboard with overviews and drill-down possibilities

Consultancy
• Regular reports (presentation and/or written)
• Guard quality agreements, meet quality targets
• Identify risks and opportunities
Software Monitor: Dashboard
Software Monitor example: vendor management and roadmap
(Charts: duplication and complexity trends over the monitoring period)
From a client testimonial:
• “Technical quality: as it improves adding functionality is made easier”
• “As quality was increasing, productivity was going up”
What should you remember (so far) from this lecture?
Testing• Automated unit testing!
Patterns• Run tools!
Quality and metrics
• Technical quality matters in the long run
• A few simple metrics are sufficient
• If aggregated in well-chosen, meaningful ways
• The simultaneous use of distinct metrics allows zooming in on root causes