Applying Software Quality Models to Software Security
TRANSCRIPT
© 2015 Carnegie Mellon University
Applying Software Quality
Models to Software Security
Software Engineering Institute Carnegie Mellon University
Pittsburgh, PA 15213
Carol Woody, Ph.D. April 21, 2015
2 CISQ March 2015
© 2015 Carnegie Mellon University
Copyright 2015 Carnegie Mellon University This material is based upon work funded and supported by the Department of Defense under Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Department of Defense. NO WARRANTY. THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN “AS-IS” BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT. This material has been approved for public release and unlimited distribution except as restricted below. This material may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other use. Requests for permission should be directed to the Software Engineering Institute at [email protected]. Team Software ProcessSM and TSPSM are service marks of Carnegie Mellon University. DM-0001890
Cyber Security Engineering (CSE) Team
Mission: Build Security In
Address security, software assurance, and survivability throughout the development and acquisition lifecycle by creating methods, solutions, and training that can be integrated into existing practices.
CSE Focus Areas
Education and Competencies
Measurement and Analysis
Lifecycle Management
Engineering
http://www.cert.org/cybersecurity-engineering/
CSE Portfolio
Software Assurance Education and Competencies
Master of Software Assurance Curriculum Model endorsed by IEEE and ACM
Software Assurance Competency Model
Software Assurance Course Delivery and Material Development
Security & Software Assurance Measurement and Analysis
Predictive Analytics Research
Researching the use of Quality Models to Support Software Assurance
Security & Software Assurance Management
Mission Risk Diagnostic (MRD)
Survivability Analysis Framework (SAF)
Security & Software Assurance Engineering
Security Quality Requirements Engineering (SQUARE)
Security Engineering Risk Analysis (SERA)
Risk in the Software Supply Chain
Focus of today's presentation
Cyber Security is a Lifecycle Challenge
Weaknesses can enter at every point along the mission thread (business process):
• Design weaknesses
• Coding weaknesses
• Implementation weaknesses
Can Predictions of Quality Inform Security Risk Predictions?
The SEI has quality data for over 100 Team Software Process (TSP) development projects used to predict operational quality.
Data from five projects with low defect density in system testing showed very low or zero safety-critical and security defects in production use.
Semantic Gaps
Quality tracks defects/faults (engineering and testing)
Defect: non-fulfilment of intended usage requirements (ISO/IEC 9126) [essentially nonconformity to a specified requirement, missing or incorrect requirements]
Software fault: accidental condition that causes a functional unit to fail to perform its required function (IEEE Standard Dictionary of Measures to Produce Reliable Software, IEEE Std 982.1-1988)
Security cares about vulnerabilities (operations)
Information security vulnerability: mistake in software that can be exploited by a hacker to gain access to a system or network (http://cve.mitre.org/about/terminology.html)
Software vulnerability: instance of an error in the specification, development, or configuration of software such that its execution can violate a security policy (Shin and Williams, 2010)
Vulnerabilities are Defects
1-5% of defects are vulnerabilities
Analysis of defects for five versions of Microsoft Windows operating systems and two versions of Red Hat Linux systems (Alhazmi et al., 2007)
Win 95 (14.5 MLOC) and Win 98 (18 MLOC) vulnerabilities are 1.00% and 0.84% respectively of identified defects
Red Hat Linux 6.2 (1.8 MLOC) and 7.1 (6.4 MLOC) vulnerabilities are 5.63% and 4.34% respectively of identified defects.
Tom Longstaff asserted that vulnerabilities might represent 5% of total defects (http://research.microsoft.com/en-us/um/redmond/events/swsecinstitute/slides/longstaff.pdf)
Ross Anderson: “it's reasonable to expect a 35,000,000 line program like Windows 2000 to have 1,000,000 bugs, only 1% of them are security-critical.” (Anderson, 2001)
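The back-of-the-envelope arithmetic in these citations can be sketched in a few lines of Python. The 1-5% figure is the empirical band reported in the studies above, not a fixed constant; the function name and defaults here are illustrative only.

```python
def estimated_vulnerabilities(total_defects, vuln_fraction=0.01):
    """Estimate security vulnerabilities as a fraction of total defects.

    The 1-5% band comes from the figures cited above (Alhazmi et al.,
    Longstaff, Anderson); 1% is the conservative end of that range.
    """
    return total_defects * vuln_fraction

# Anderson's example: a ~35 MLOC program with ~1,000,000 bugs,
# of which ~1% are assumed to be security-critical.
print(estimated_vulnerabilities(1_000_000, 0.01))  # -> 10000.0
```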
Data: Five Projects from Three Organizations
Project types: legacy system replacement, medical devices
Successful security/safety-critical results in operation for at least a year
With one exception, projects with defect densities below 20 defects per MLOC had no reported operational security or safety-critical defects.
The exception utilized specialized defect-removal practices for secure systems.
Org.  Project  Type             Secure/Safety-Critical Defects  Defect Density (per MLOC)  Size
D     D1       Safety-critical  20                              46.07                      2.8 MLOC
D     D2       Safety-critical  0                               4.44                       0.9 MLOC
D     D3       Safety-critical  0                               9.23                       1.3 MLOC
A     A1       Secure           0                               91.70                      0.6 MLOC
T     T1       Secure           0                               20.00                      0.1 MLOC

Quality threshold: 20 defects per MLOC
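The threshold claim can be checked with a short sketch using the figures from the table. The project labels and numbers are copied from the slide; the threshold logic itself is just an illustration.

```python
# Data from the table above:
# (project, defect density per MLOC, operational safety/security-critical defects)
projects = [
    ("D1", 46.07, 20),
    ("D2", 4.44, 0),
    ("D3", 9.23, 0),
    ("A1", 91.70, 0),  # the exception: used specialized secure-development practices
    ("T1", 20.00, 0),
]

THRESHOLD = 20.0  # defects per MLOC, the quality threshold noted above

for name, density, critical in projects:
    status = "at/below" if density <= THRESHOLD else "above"
    print(f"{name}: {density:.2f}/MLOC is {status} threshold, "
          f"{critical} critical operational defects")
```

Every project at or below the threshold reported zero critical operational defects; A1 exceeded it but compensated with specialized defect-removal practices.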
Quality Focuses on Defect Injection and Removal
Poor quality does predict poor security:
• 1-5% of the defects are vulnerabilities
• The cost to fix a defect increases substantially the later it is discovered
[Chart: Early Defect Removal across Life Cycle]
Software Faults: Introduction, Discovery, and Cost
Faults account for 30‒50% of total software project costs.
Most faults are introduced before coding (~70%).
Most faults are discovered at system integration or later (~80%).
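These proportions suggest why late discovery is so expensive. The sketch below uses illustrative cost-to-fix multipliers; the 1x/3x/5x/20x/100x escalation pattern is a commonly quoted industry figure (e.g., Boehm), not data from this presentation.

```python
# Illustrative (assumed) relative cost-to-fix multipliers by phase of
# discovery; the escalation pattern follows commonly cited industry
# figures, not data from this slide deck.
cost_multiplier = {
    "requirements": 1,
    "design": 3,
    "coding": 5,
    "integration": 20,
    "operation": 100,
}

# A fault injected at design (~70% of faults are injected before coding)
# but not found until integration (~80% are found there or later) costs
# far more than one caught by a design inspection.
escalation = cost_multiplier["integration"] / cost_multiplier["design"]
print(f"Escalation factor: {escalation:.1f}x")
```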
Successful Projects Embed Quality and Safety/Security Inspection at Each Lifecycle Step
Successful Projects Use Metrics Extensively
Development Metrics
Incoming/week
Triage rate
% closed
Development work for cycle
Software change request per developer per week
# developers
Software change request per verifier & validator per week
# verification persons
Software Change Metrics
Fixed work per cycle
Deferred planned work per cycle
Measure constantly from many dimensions to identify problems early
How Will Quality Help Security?
Good quality will ensure proper implementation of specified results:
• Effective code checking will identify improper implementations of specifications (11 of the SANS Top 25)
• Effective design reviews will identify missing requirements (12 of the SANS Top 25)
These hold only
• if appropriate security results are considered in the development of requirements, and
• if requirements are effectively translated into detailed designs and code specifications to support the required security results.
SANS Top 25: SysAdmin, Audit, Network, Security Top 25 Most Dangerous Programming Errors
(http://cwe.mitre.org/top25)
Security Requirements Must be Properly Specified
Poor Quality Predicts Poor Security
If you have a quality problem, then you have a security problem.
Quality does not happen by accident and neither does security
Neither quality nor security can be “tested in”
Quality approaches such as TSP focus on personal accountability at each stage of the life cycle
Effective results require
• clearly defining what "right" looks like
• measuring and rewarding the right behaviors
• reinforcement through training, tracking, and independent review
Linking Security and Quality Measures
• If defects are measured, 1-5% of these should be considered to be security vulnerabilities.
• It is also feasible that when security vulnerabilities are measured, code quality can be estimated by considering these to be 1-5% of the expected defects.
Reducing Defects Reduces Vulnerabilities
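The two-way estimate above can be sketched as a pair of rough conversions. The 1-5% band is the empirical range cited earlier; the function names are illustrative, and the outputs are planning ranges, not predictions.

```python
def vulns_from_defects(defects, low=0.01, high=0.05):
    """Expected vulnerability range given a total defect count."""
    return defects * low, defects * high

def defects_from_vulns(vulns, low=0.01, high=0.05):
    """Expected total-defect range given an observed vulnerability count.

    Inverts the same 1-5% assumption: more vulnerabilities found
    implies a proportionally larger underlying defect population.
    """
    return vulns / high, vulns / low

# Example: a release with ~10,000 known defects
lo, hi = vulns_from_defects(10_000)
print(f"Expected vulnerabilities: {lo:.0f}-{hi:.0f}")

# Example: a product with 50 reported vulnerabilities
lo, hi = defects_from_vulns(50)
print(f"Estimated underlying defects: {lo:.0f}-{hi:.0f}")
```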
Challenges for Applicability
• Metrics are not collected about vulnerabilities specific to each product release
  – Open Source products
  – National Vulnerability Database
• Data about vulnerabilities are not collected in a form that can be parsed and analyzed using quality tools and measurements
  – Update history does not report product vulnerability data
  – Difficulty in evaluating the size of products (lines of code or function points)
• Life cycles such as Agile typically do not collect defects until integration
Contact Information
Carol Woody, Ph.D.
Technical Manager
CERT/CSF/CSE
Telephone: +1 412-268-5800
Email: [email protected]
U.S. Mail
Software Engineering Institute
Customer Relations
4500 Fifth Avenue
Pittsburgh, PA 15213-2612
USA
Web
http://www.cert.org/cybersecurity-engineering/
Customer Relations
Email: [email protected]
Telephone: +1 412-268-5800
SEI Phone: +1 412-268-5800
SEI Fax: +1 412-268-6257