Remediation metrics
• For each problem reported
– Severity level (Crit, Error, Warn, Info)
– Review state (via workflow)
• Unknown (not reviewed yet)
• Known (reviewed, but nothing else done)
• Accepted (reviewed and formally accepted, but not yet fixed)
• Mitigated (problem fixed)
Report Display
• Table of raw counts (row = severity, column = review state)
• Horizontal bar for each severity
– Shows composition % in each review state
• List of problem types for each review state
• 4×4 sparklines (severity × review state)
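The raw-count table and per-severity composition percentages above could be sketched as follows (the problem list and helper names are hypothetical, not from the original tool):

```python
from collections import Counter

SEVERITIES = ["Crit", "Error", "Warn", "Info"]
STATES = ["Unknown", "Known", "Accepted", "Mitigated"]

# Hypothetical reported problems as (severity, review_state) pairs.
problems = [
    ("Crit", "Unknown"), ("Crit", "Mitigated"),
    ("Error", "Known"), ("Error", "Known"), ("Error", "Accepted"),
    ("Warn", "Unknown"), ("Info", "Mitigated"),
]

counts = Counter(problems)

# Raw-count table: row = severity, column = review state.
table = {sev: [counts[(sev, st)] for st in STATES] for sev in SEVERITIES}

def composition(sev):
    """Composition % of each review state within one severity row
    (the data behind one horizontal bar)."""
    row = table[sev]
    total = sum(row) or 1  # avoid division by zero for empty rows
    return [100.0 * c / total for c in row]
```

The same `table` dict also feeds the 4×4 sparkline grid: each cell's count over time becomes one sparkline series.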
Things to add/change?
• Use capture-recapture (or capture for removal) for each review state
• Show a 'delta market share' chart for each review state
– Normalized to show the % of each severity in that review state
• Lots of other things, I'm sure… it's still young
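The capture-recapture idea above can be sketched with the classic two-sample Lincoln-Petersen estimator; the sample sizes below are hypothetical, and the original may intend a different estimator variant:

```python
def lincoln_petersen(first_pass, second_pass, overlap):
    """Estimate the total problem population from two independent
    review passes.

    first_pass  -- number of problems found by reviewer/tool A
    second_pass -- number of problems found by reviewer/tool B
    overlap     -- problems found by both (the 'recaptures')
    """
    if overlap == 0:
        raise ValueError("no overlap between passes; estimator is undefined")
    return first_pass * second_pass / overlap

# Hypothetical: pass A finds 30 problems, pass B finds 20, 12 in common.
estimated_total = lincoln_petersen(30, 20, 12)
found_so_far = 30 + 20 - 12
remaining = estimated_total - found_so_far  # estimated undiscovered problems
```

Applied per review state, the estimated `remaining` count suggests how many problems of that state are still lurking undiscovered.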
Code complexity metrics
• McCabe cyclomatic complexity
– MCC = Br + 1
– Branch complexity: the number of branch points plus one
• System complexity
– SYSC = Fo² + P / (Fo + 1)
– Design-time metric to show 'difficulty to implement'
– Card, D. N. and W. W. Agresti. "Measuring Software Design Complexity." The Journal of Systems and Software 8, 3 (June 1988), 185-197.
• Information flow complexity
– IFC = ((Fi + Vr) × (Fo + Vw))²
– Indicates 'stress points' or multi-purpose functions
– IEEE 982.2-1988. IEEE Guide for the Use of IEEE Standard Dictionary of Measures to Produce Reliable Software. A25, Data or Information Flow Complexity, p. 112.
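Taken together, the three formulas above might be coded as below; the parameter names (branch count, fan-in/out, parameters, variables read/written) follow the slide's abbreviations and are assumptions about how the inputs would be gathered:

```python
def mcc(branches):
    """McCabe cyclomatic complexity: branch count plus one."""
    return branches + 1

def sysc(fan_out, params):
    """Card/Agresti system complexity: Fo^2 + P / (Fo + 1)."""
    return fan_out ** 2 + params / (fan_out + 1)

def ifc(fan_in, vars_read, fan_out, vars_written):
    """IEEE 982.2 information flow complexity:
    ((Fi + Vr) * (Fo + Vw))^2."""
    return ((fan_in + vars_read) * (fan_out + vars_written)) ** 2

# Hypothetical example function:
complexity = mcc(4)  # 4 branch points
system = sysc(fan_out=3, params=8)
flow = ifc(fan_in=2, vars_read=1, fan_out=3, vars_written=2)
```

A large `flow` value relative to peers flags the multi-purpose 'stress point' functions the slide mentions.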