Copyright © The OWASP Foundation. Permission is granted to copy, distribute and/or modify this document under the terms of the OWASP License.
The OWASP Foundation
AppSec DC
http://www.owasp.org
Application security metrics from the organization on down to the vulnerabilities
Chris Wysopal, [email protected]
November 13, 2009, 11:30am-12:30pm
Agenda
1. Why use metrics?
2. Challenges & Goals for Application Security Metrics
3. Enumerations
4. Organizational Metrics
5. Testing Metrics
6. Application Metrics
7. WASC Web Application Security Statistics Project 2008
8. Future Plans
"To measure is to know." - James Clerk Maxwell, 1831-1879
"Measurement motivates." - John Kenneth Galbraith, 1908-2006
Metrics do matter
1. Metrics quantify the otherwise unquantifiable
2. Metrics can show trends and trends matter more than measurements do
3. Metrics can show if we are doing a good or bad job
4. Metrics can show if you have no idea where you are
5. Metrics establish where “You are here” really is
6. Metrics build bridges to managers
7. Metrics allow cross-sectional comparisons
8. Metrics set targets
9. Metrics let you benchmark yourself against the opposition
10. Metrics create curiosity
Source: Andy Jaquith, Yankee Group, Metricon 2.0
Metrics don’t matter
It is too easy to count things for no purpose other than to count them
You cannot measure security, so stop
The following is all that matters, and you can’t map security metrics to them:
» Maintenance of availability
» Preservation of wealth
» Limitation on corporate liability
» Compliance
» Shepherding the corporate brand
Cost of measurement is not worth the benefit
Source: Mike Rothman, Security Incite, Metricon 2.0
Bad metrics are worse than no metrics
Security metrics can drive executive decision making
How secure am I?
Am I better off than this time last year?
Am I spending the right amount of $$?
How do I compare to my peers?
What risk transfer options do I have?
Source: Measuring Security Tutorial, Dan Geer
Goals of Application Security Metrics
Provide quantifiable information to support enterprise risk management and risk-based decision making
Articulate progress towards goals and objectives
Provide a repeatable, quantifiable way to assess, compare, and track improvements in assurance
Focus activities on risk mitigation in order of priority and exploitability
Facilitate adoption and improvement of secure software design and development processes
Provide an objective means of comparing and benchmarking projects, divisions, organizations, and vendor products
Source: Practical Measurement Framework for Software Assurance and Information Security, DHS SwA Measurement Working Group
Use Enumerations
Common Vulnerabilities and Exposures (CVE)
Common Weakness Enumeration (CWE)
Common Attack Pattern Enumeration and Classification (CAPEC)
Enumerations help identify specific software-related items that can be counted, aggregated, and evaluated over time (see the sketch below).
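As a rough illustration (not from the original slides), findings tagged with these identifiers become straightforward to count and aggregate; the findings and identifiers below are hypothetical.

from collections import Counter

# Hypothetical findings, each tagged with enumeration identifiers.
# CWE-89 is SQL injection and CWE-79 is cross-site scripting; the CAPEC and
# CVE fields hold whatever the testing tools report (None if not applicable).
findings = [
    {"cwe": "CWE-89", "capec": None, "cve": None},
    {"cwe": "CWE-79", "capec": None, "cve": None},
    {"cwe": "CWE-79", "capec": None, "cve": None},
]

# Aggregate by weakness class; the same roll-up works per release or per quarter.
print(Counter(f["cwe"] for f in findings))  # Counter({'CWE-79': 2, 'CWE-89': 1})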
Organizational Metrics
Percentage of application inventory developed with SDLC (which version of SDLC?)
Business criticality of each application in inventory
Percentage of application inventory tested for security (what level of testing?)
Percentage of application inventory remediated and meeting assurance requirements
Roll-up of testing results (see the sketch below)
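A minimal sketch of how these inventory roll-ups might be computed, assuming one record per application; the field names and example applications are hypothetical.

# Hypothetical application inventory; one record per application.
inventory = [
    {"name": "payroll",    "criticality": "high",   "sdlc": True,  "tested": True,  "remediated": False},
    {"name": "intranet",   "criticality": "medium", "sdlc": False, "tested": True,  "remediated": True},
    {"name": "storefront", "criticality": "high",   "sdlc": True,  "tested": False, "remediated": False},
]

def pct(flag):
    # Percentage of the inventory for which the given attribute is true.
    return 100.0 * sum(1 for app in inventory if app[flag]) / len(inventory)

print(f"Developed with SDLC: {pct('sdlc'):.0f}%")                    # 67%
print(f"Tested for security: {pct('tested'):.0f}%")                  # 67%
print(f"Meeting assurance requirements: {pct('remediated'):.0f}%")   # 33%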
Organizational Metrics
Cost to fix defects at different points in the software lifecycle
Cost of data breaches related to software vulnerabilities
Testing Metrics
Number of threats identified in threat model
Size of attack surface identified
Percentage code coverage (static and dynamic)
Coverage of defect categories (CWE)
Coverage of attack pattern categories (CAPEC) (see the sketch below)
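A sketch of one way such category coverage might be computed, assuming the set of CWE categories targeted by the threat model or test plan and the set actually exercised by testing are both known; the identifiers here are examples only.

# Hypothetical: CWE categories the threat model / test plan targets, and the
# categories actually exercised by static and dynamic testing.
targeted = {"CWE-79", "CWE-89", "CWE-22", "CWE-352", "CWE-287"}
exercised = {"CWE-79", "CWE-89", "CWE-287"}

coverage = 100.0 * len(exercised & targeted) / len(targeted)
print(f"CWE category coverage: {coverage:.0f}%")  # 60%

# The same calculation applies to CAPEC attack pattern categories.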
SANS Top 25 Mapped to Application Security Methods
Source: 2009 Microsoft
Weakness Class Prevalence based on 2008 CVE data
4855 total flaws tracked by CVE in 2008
Basic Metrics: Defect counts
Design and implementation defects
CWE identifier
CVSS score
Severity
Likelihood of exploit
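A sketch of what such a defect record might look like; the defects are hypothetical, and severity is derived from the CVSS score using the ranges NVD applies to CVSS v2 base scores.

from collections import Counter

# Hypothetical defect records carrying the attributes listed above.
defects = [
    {"cwe": "CWE-89", "cvss": 7.5, "likelihood_of_exploit": "high"},
    {"cwe": "CWE-79", "cvss": 4.3, "likelihood_of_exploit": "medium"},
]

def severity(score):
    # Bucketing per the ranges NVD uses for CVSS v2 base scores.
    if score >= 7.0:
        return "high"
    if score >= 4.0:
        return "medium"
    return "low"

for defect in defects:
    defect["severity"] = severity(defect["cvss"])

# Defect counts by severity feed the organizational roll-ups shown earlier.
print(Counter(d["severity"] for d in defects))  # Counter({'high': 1, 'medium': 1})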
Automated Code Analysis Techniques
Static Analysis (White Box Testing): similar to a line-by-line code review. The benefit is complete coverage of the entire source or binary; the downside is that it is computationally impossible to have a perfect analysis.
» Static Source – analyze the source code
» Static Binary – analyze the binary executable
» Source vs. Binary – you don’t always have all the source code, and you may not want to part with your source code to get a 3rd party analysis
Dynamic Analysis (Black Box Testing): run-time analysis, more like traditional testing. The benefit is perfect modeling of a particular input, so you can show exploitability; the downside is that you cannot create all inputs in reasonable time.
» Automated dynamic testing (also known as penetration testing) using tools
» Manual Penetration Testing (with or without the use of tools)
Both approaches create lists of defects that can be labeled with CWE, CVSS score, and exploitability.
Manual Analysis
Manual Penetration Testing – can discover some issues that cannot be determined automatically because a human can understand issues related to business logic or design
Manual Code Review – typically focused only on specific high risk areas of code
Manual Design Review – can determine some vulnerabilities early on in the design process before the program is even built.
Threat Modeling
WASC Web Application Security Statistics Project 2008
Purpose
A collaborative, industry-wide effort to pool sanitized website vulnerability data and to gain a better understanding of the web application vulnerability landscape.
Ascertain which classes of attacks are the most prevalent, regardless of the methodology used to identify them. In effect, a MITRE CVE project for custom web applications.
Goals
Identify the prevalence and probability of different vulnerability classes.
Compare testing methodologies against what types of vulnerabilities they are likely to identify.
Project Team
Project Leader: Sergey Gordeychik
Project Contributors:
Sergey Gordeychik, Dmitry Evteev (POSITIVE TECHNOLOGIES)
Chris Wysopal, Chris Eng (VERACODE)
Jeremiah Grossman (WHITEHAT SECURITY)
Mandeep Khera (CENZIC)
Shreeraj Shah (BLUEINFY)
Matt Lantinga (HP APPLICATION SECURITY CENTER)
Lawson Lee (dns – used WebInspect)
Campbell Murray (ENCRIPTION LIMITED)
Summary
12186 web applications with 97554 detected vulnerabilities
More than 13%* of all reviewed sites can be compromised completely automatically
About 49% of web applications contain high-risk vulnerabilities detected by scanning
Manual and automated assessment using white-box methods detects these high-risk vulnerabilities with a probability of up to 80-96%
99% of web applications are not compliant with the PCI DSS standard
* Web applications with Brute Force Attack, Buffer Overflow, OS Commanding, Path Traversal, Remote File Inclusion, SSI Injection, Session Fixation, SQL Injection, Insufficient Authentication, Insufficient Authorization vulnerabilities detected by automatic scanning.
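For scale, the totals above work out to roughly 97554 / 12186 ≈ 8 detected vulnerabilities per reviewed web application on average.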
Compared to 2007 WASS Project
Number of sites with SQL Injection fell by 13%
Number of sites with Cross-site Scripting fell by 20%
Number of sites with different types of Information Leakage rose by 24%
Probability of compromising a host automatically rose from 7% to 13%.
Probability of detecting a vulnerability
% of total vulnerabilities
White box vs. black box
Full Report
http://projects.webappsec.org/Web-Application-Security-Statistics
Future Plans
Veracode processes over 100 applications and 500 million lines of code per month
Collecting data: vulnerabilities found/fixed
Application metadata: industry, time in dev cycle, application type
Vulnerability trends
Industry/Platform/Language differences
Further reading on software security metrics & testing
NIST, Performance Measurement Guide for Information Security, http://csrc.nist.gov/publications/nistpubs/800-55-Rev1/SP800-55-rev1.pdf
Security Metrics: Replacing Fear, Uncertainty, and Doubt, by Andrew Jaquith
The Art of Software Security Testing, by Chris Wysopal, Lucas Nelson, Dino Dai Zovi, Elfriede Dustin
Q&A