Security Management Metrics
Vicente Aceituno, 2008
FIST Conference, March 2008, Madrid
Sponsored by:
About me
Vice president of the ISSA Spain chapter. www.issa-spain.org
Vice president of the FIST Conferences association. www.fistconference.org
Author of a number of articles. Google: “vaceituno wikipedia”
Director of the ISM3 Consortium. The consortium promotes ISM3, an ISMS standard; ISM3 is the main source for this presentation. www.ism3.com
The world without Metrics
Management vs Engineering
Security Engineering: Design and build systems that can be used securely.
Security Management: Employ people and systems (that can be well or badly engineered) safely.
Targets vs Outcomes
Activity and targets are weakly linked.
Targets: +security / −risk; trust.
Activity: keep systems updated; assign user accounts; inform users of their rights.
Definition
Metrics are quantitative measurements that can be interpreted in the context of a series of previous or equivalent measurements.
Metrics make management possible:
1. Measurement – some call this “metrics” too.
2. Interpretation – some call this “indicator”.
3. Investigation – common cause vs special cause (when appropriate, logs are key here).
4. Rationalization
5. Informed Decision
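The common-cause vs special-cause distinction in step 3 can be sketched with a simple control-chart rule. This is an illustrative assumption, not something from the presentation: a Shewhart-style 3-sigma threshold learned from a baseline period, applied to made-up weekly figures.

```python
# Hypothetical sketch: flagging "special cause" variation in a weekly
# security metric with a simple Shewhart-style 3-sigma control rule.
# The data and the 3-sigma choice are illustrative, not from the slides.

def control_limits(baseline):
    """Mean ± 3 sigma limits learned from a baseline period."""
    n = len(baseline)
    mean = sum(baseline) / n
    sigma = (sum((x - mean) ** 2 for x in baseline) / n) ** 0.5
    return mean - 3 * sigma, mean + 3 * sigma

def classify(value, limits):
    """Points inside the limits are treated as common-cause (normal)
    variation; points outside trigger investigation (special cause)."""
    lo, hi = limits
    return "common cause" if lo <= value <= hi else "special cause"

baseline = [120, 131, 115, 124, 128, 119, 126, 122]  # e.g. events/week
limits = control_limits(baseline)
print(classify(125, limits))   # within limits  -> common cause
print(classify(210, limits))   # far outside    -> special cause
```

The rule matches the slide's point about thresholds: the limits are learned from what is normal, not set beforehand.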
Qualitative vs Quantitative Measurement
William Thomson (Lord Kelvin): “I often say that when you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science, whatever the matter may be.”
Meaning: “What can’t be measured, can’t be managed”
Interpretation
It doesn’t make sense to set thresholds beforehand. You have to learn what is normal to find out what is abnormal.
Thresholds can be fuzzy: false positives and false negatives. Example: a test with 90% accuracy, 1000 students tested for HIV, 10 of whom have it.

                        Have HIV   Don’t have HIV
Test positive for HIV       9            99
Test negative for HIV       1           891
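The table above can be reproduced numerically. A minimal sketch, reading the slide's “90% accuracy” as 90% sensitivity and 90% specificity, which yields the same cell counts:

```python
# Reproducing the slide's worked example: a test with 90% sensitivity
# and 90% specificity applied to 1000 students, 10 of whom have HIV.

population, prevalence = 1000, 10
sensitivity = specificity = 0.90   # the slide's "90% accuracy"

tp = prevalence * sensitivity                  # true positives
fn = prevalence - tp                           # false negatives
tn = (population - prevalence) * specificity   # true negatives
fp = (population - prevalence) - tn            # false positives

# Probability that a positive result is a true positive (precision / PPV)
ppv = tp / (tp + fp)
print(tp, fn, fp, tn)   # 9.0 1.0 99.0 891.0
print(round(ppv, 3))    # 0.083
```

With low prevalence, only about 8% of the positive results are real, which is why a raw “accuracy” figure is a fuzzy threshold.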
Interpretation
Is it successful? Is it normal? How does it compare against peers?
Interpretation
Are outcomes better fit to their purpose? Are outcomes getting closer to or further from the target? Are we getting fewer false positives and false negatives? Are we using resources more efficiently?
Rationalization
Is the correction/change working? Is it cost effective? Can we meet our targets with the resources we have? Are we getting the same outputs with fewer resources?
Decisions
Good Metrics are SMARTIED
S.M.A.R.T.:
Specific: the metric is relevant to the process being measured.
Measurable: metric measurement is feasible at reasonable cost.
Actionable: it is possible to act on the process to improve the metric.
Relevant: improvements in the metric meaningfully enhance the contribution of the process towards the goals of the management system.
Timely: the metric measurement is fast enough to be used effectively.
Plus:
Interpretable: interpretation is feasible (there is comparable data) at reasonable cost (false positive and false negative rates are low enough).
Enquirable: investigation is feasible at reasonable cost.
Dynamic: the metric values change over time.
Fashion vs Results
Real Time vs Continuous Improvement: management is far more than incident response.
Risk Assessment as a metric: only as useful as the investigation results.
Certification / Audit: compliant / not compliant is NOT a metric.
What are good Metrics?
Activity: the number of outputs produced in a time period.
Scope: the proportion of the environment or system that is protected by the process.
Update: the time since the last update or refresh of process outputs.
Availability: the time since a process has performed as expected upon demand (uptime), the frequency and duration of interruptions, and the time interval between interruptions.
Efficiency / ROSI: ratio of losses averted to the cost of the investment in the process.
Efficacy / Benchmark: ratio of outputs produced in comparison to the theoretical maximum. Measuring the efficacy of a process implies comparison against a baseline.
Load: ratio of available resources in actual use, like CPU load, repository capacity, bandwidth, licenses and overtime hours per employee.
Accuracy: rate of false positives and false negatives.
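Several of these metric families are simple ratios. A short sketch with invented figures (none of the numbers come from the presentation) shows how they might be computed:

```python
# Illustrative calculations for some of the metric families above.
# All figures are made up for the example.

# Availability: uptime ratio over a reporting period
period_hours, downtime_hours = 720.0, 3.6
availability = (period_hours - downtime_hours) / period_hours   # 0.995

# Efficiency / ROSI: losses averted per unit cost of the process
losses_averted, process_cost = 250_000.0, 100_000.0
rosi = losses_averted / process_cost                            # 2.5

# Accuracy: false positive / false negative rates of a detector
fp, tn, fn, tp = 40, 960, 5, 95
false_positive_rate = fp / (fp + tn)                            # 0.04
false_negative_rate = fn / (fn + tp)                            # 0.05

print(availability, rosi, false_positive_rate, false_negative_rate)
```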
Examples
Activity: number of successful access attempts.
Scope: % of resources protected with access control.
Update: time elapsed since the last successful access attempt.
Availability: % of time access control is available.
Efficiency / ROSI: successful access attempts per euro.
Efficacy / Benchmark: malicious access attempts failed vs malicious access attempts successful; legitimate access attempts failed vs legitimate access attempts successful.
Load: mean and peak % of Gb, Mb/s, CPU and licenses in use.
Metrics and Capability
Undefined: the process might be used, but it is not defined.
Defined: the process is documented and used.
Managed: the process is Defined and the results of the process are used to fix and improve the process.
Controlled: the process is Managed and milestones and the need for resources are accurately predicted.
Optimized: the process is Controlled and improvement leads to a saving in resources.
Capability: Undefined
No metrics necessary
Capability: Defined
Measurement: none.
Interpretation: none.
Investigation (when appropriate, logs are key here):
- Common cause (changes in the environment, results of management decisions).
- Special cause (incidents).
Rationalization for use of time, budget, people and other resources: not possible.
Informed Decision: not possible.
Capability: Managed
Measurement: Scope, Activity, Availability.
Interpretation:
- Accuracy (rate of false negatives and false positives). Is it normal? Find faults before they produce incidents. Is it successful? Is the correction/change working?
- See trends and understand where things are going. Compare to self (over time, to show progress): are outputs getting closer to or further from the target?
- Benchmarking: compare against industry/peers to show relative position.
- Efficacy (comparison with the ideal outcome): are outputs better fit to their purpose?
- Update: are outcomes recent enough to be valid?
Investigation (common cause, special cause).
Rationalization for use of time, budget, people and other resources: possible.
Informed Decision: possible.
Capability: Controlled
Measurement: Load (what resources are used to produce the outcomes; finding bottlenecks).
Interpretation:
- Can we meet our targets in time with the resources we have?
- What resources and time are necessary to meet our targets?
Investigation (common cause, special cause).
Decision, Rationalization/justification, Planning: possible.
Capability: Optimized
Measurement: Efficiency (comparison of use of resources with the goal).
Interpretation:
- How efficient is it?
- Are we getting the same outcomes with fewer resources?
- Are we getting more/better outcomes with the same resources?
Investigation (common cause, special cause).
Decision, Rationalization/justification, Planning, Tradeoffs (point of diminishing returns, ROSI): possible.
Metric Specification
Name of the metric;
Description of what is measured;
How the metric is measured;
How often the measurement is taken;
How the thresholds are calculated;
Range of values considered normal for the metric;
Best possible value of the metric;
Units of measurement.
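The specification fields above map naturally onto a small record type. A sketch, where the field names follow the slide but the example values are invented:

```python
# Sketch of the metric specification as a record type; field names
# follow the slide, the example values are hypothetical.
from dataclasses import dataclass

@dataclass
class MetricSpec:
    name: str
    description: str      # what is measured
    method: str           # how the metric is measured
    frequency: str        # how often the measurement is taken
    threshold_rule: str   # how the thresholds are calculated
    normal_range: tuple   # range of values considered normal
    best_value: float     # best possible value of the metric
    units: str

spec = MetricSpec(
    name="Availability",
    description="Time access control has performed as expected on demand",
    method="Uptime monitoring of the access-control service",
    frequency="weekly",
    threshold_rule="Limits learned from a rolling 12-week baseline",
    normal_range=(0.99, 1.0),
    best_value=1.0,
    units="ratio",
)
print(spec.name, spec.units)
```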
Metrics Representation

[Figure: “Access Rights Granted” per week (weeks 10–49), shown as several line charts at different vertical scales.]
Using Metrics
[Figure: “Cumulative recommendations by owner (sum of days)” – monthly totals, January to December, one series each for Mr Blue, Mr Pink, Mr Yellow, Mr Purple, Mr Soft Blue, Mr Red, Mr Green and Mr Orange.]
Using Metrics
[Figure: “Cumulative security recommendations by owner – sum in days” – totals per owner (Mr Pink, Mr Red, Mr Blue, Mr Purple, Mr Brown, Mr Black, Mr Grey, Mr Orange, Mr Green), split into high and medium severity.]
Using security management metrics
Key Goal Indicators
Key Performance Indicators
Service Level Agreements / Underpinning Contracts
Balanced Scorecard (Customer, Internal, Stakeholder, Innovation – Goals and Measures)
Creative Commons Attribution-NoDerivs 2.0
This work is licensed under the Creative Commons Attribution-NoDerivs License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/2.0/ or send a letter to Creative Commons, 559 Nathan Abbott Way, Stanford, California 94305, USA.
You are free:
• to copy, distribute, display, and perform this work
Under the following conditions:
Attribution. You must give the original author credit.
No Derivative Works. You may not alter, transform, or build upon this work.
For any reuse or distribution, you must make clear to others the license terms of this work.
Any of these conditions can be waived if you get permission from the author.
Your fair use and other rights are in no way affected by the above.
THANK YOU
With the sponsorship of:
www.fistconference.org