
Software Assurance Maturity Model

A guide to building security into software development
Version 1.0


License

This work is licensed under the Creative Commons Attribution-Share Alike 3.0 License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

Acknowledgements

The Software Assurance Maturity Model (SAMM) was originally developed, designed, and written by Pravir Chandra ([email protected]), an independent software security consultant. Creation of the first draft was made possible through funding from Fortify Software, Inc. This document is currently maintained and updated through the OpenSAMM Project led by Pravir Chandra. Since the initial release of SAMM, this project has become part of the Open Web Application Security Project (OWASP). Thanks also go to the many supporting organizations that are listed on the back cover.

Contributors & Reviewers

This work would not be possible without the support of many individual reviewers and experts who offered contributions and critical feedback. They are (in alphabetical order):

Fabio Arciniegas, Matt Bartoldus, Jonathan Carter, Darren Challey, Brian Chess, Dinis Cruz, Sebastien Deleersnyder, Justin Derry, Bart De Win, James McGovern, Matteo Meucci, Jeff Payne, Gunnar Peterson, Jeff Piper, Andy Steingruebl, John Steven, Chad Thunberg, Colin Watson, Jeff Williams

For the latest version and additional info, please see the project web site at

http://www.opensamm.org

OWASP
The Open Web Application Security Project

The Open Web Application Security Project (OWASP) is a worldwide free and open community focused on improving the security of application software. Our mission is to make application security “visible,” so that people and organizations can make informed decisions about application security risks. Everyone is free to participate in OWASP and all of our materials are available under a free and open software license. The OWASP Foundation is a 501(c)3 not-for-profit charitable organization that ensures the ongoing availability and support for our work. Visit OWASP online at http://www.owasp.org.

This is an OWASP Project


Executive Summary

[Figure: SAMM Overview. The four Business Functions of software development and their Security Practices: Governance covers Strategy & Metrics, Policy & Compliance, and Education & Guidance; Construction covers Threat Assessment, Security Requirements, and Secure Architecture; Verification covers Design Review, Code Review, and Security Testing; Deployment covers Vulnerability Management, Environment Hardening, and Operational Enablement.]

The Software Assurance Maturity Model (SAMM) is an open framework to help organizations formulate and implement a strategy for software security that is tailored to the specific risks facing the organization. The resources provided by SAMM will aid in:

✦ Evaluating an organization’s existing software security practices
✦ Building a balanced software security assurance program in well-defined iterations
✦ Demonstrating concrete improvements to a security assurance program
✦ Defining and measuring security-related activities throughout an organization

SAMM was defined with flexibility in mind such that it can be utilized by small, medium, and large organizations using any style of development. Additionally, this model can be applied organization-wide, for a single line-of-business, or even for an individual project. Beyond these traits, SAMM was built on the following principles:

✦ An organization’s behavior changes slowly over time - A successful software security program should be specified in small iterations that deliver tangible assurance gains while incrementally working toward long-term goals.
✦ There is no single recipe that works for all organizations - A software security framework must be flexible and allow organizations to tailor their choices based on their risk tolerance and the way in which they build and use software.
✦ Guidance related to security activities must be prescriptive - All the steps in building and assessing an assurance program should be simple, well-defined, and measurable. This model also provides roadmap templates for common types of organizations.

The foundation of the model is built upon the core business functions of software development with security practices tied to each (see the SAMM Overview diagram above). The building blocks of the model are the three maturity levels defined for each of the twelve security practices. These define a wide variety of activities in which an organization could engage to reduce security risks and increase software assurance. Additional details are included to measure successful activity performance, understand the associated assurance benefits, and estimate personnel and other costs.

As an open project, SAMM content shall always remain vendor-neutral and freely available for all to use.


Contents

Executive Summary . . . 3

Understanding the Model . . . 6
Business Functions . . . 8
Governance . . . 10
Construction . . . 12
Verification . . . 14
Deployment . . . 16

Applying the Model . . . 18
Using the Maturity Levels . . . 20
Conducting Assessments . . . 21
Creating Scorecards . . . 26
Building Assurance Programs . . . 27
Independent Software Vendor . . . 28
Online Service Provider . . . 29
Financial Services Organization . . . 30
Government Organization . . . 31

The Security Practices . . . 32
Strategy & Metrics . . . 34
Policy & Compliance . . . 38
Education & Guidance . . . 42
Threat Assessment . . . 46
Security Requirements . . . 50
Secure Architecture . . . 54
Design Review . . . 58
Code Review . . . 62
Security Testing . . . 66
Vulnerability Management . . . 70
Environment Hardening . . . 74
Operational Enablement . . . 78

Case Studies . . . 82
VirtualWare . . . 84


I would like to...

Assess existing software assurance practices
✦ Read: Executive Summary (3), Business Functions (8-9), Governance (10-11), Construction (12-13), Verification (14-15), Deployment (16-17), Using the Maturity Levels (20), Conducting Assessments (21-25), Creating Scorecards (26)
✧ Skim: Strategy & Metrics (34-37), Policy & Compliance (38-41), Education & Guidance (42-45), Threat Assessment (46-49), Security Requirements (50-53), Secure Architecture (54-57), Design Review (58-61), Code Review (62-65), Security Testing (66-69), Vulnerability Management (70-73), Environment Hardening (74-77), Operational Enablement (78-81), Building Assurance Programs (27-31), VirtualWare (84-95)

Build a strategic roadmap for an organization
✦ Read: Executive Summary (3), Business Functions (8-9), Governance (10-11), Construction (12-13), Verification (14-15), Deployment (16-17), Using the Maturity Levels (20), Building Assurance Programs (27-31)
✧ Skim: Conducting Assessments (21-25), Creating Scorecards (26), VirtualWare (84-95), Strategy & Metrics (34-37), Policy & Compliance (38-41), Education & Guidance (42-45), Threat Assessment (46-49), Security Requirements (50-53), Secure Architecture (54-57), Design Review (58-61), Code Review (62-65), Security Testing (66-69), Vulnerability Management (70-73), Environment Hardening (74-77), Operational Enablement (78-81)

Implement or perform security activities
✦ Read: Executive Summary (3), Business Functions (8-9), Governance (10-11), Construction (12-13), Verification (14-15), Deployment (16-17), Using the Maturity Levels (20)
✧ Skim: Strategy & Metrics (34-37), Policy & Compliance (38-41), Education & Guidance (42-45), Threat Assessment (46-49), Security Requirements (50-53), Secure Architecture (54-57), Design Review (58-61), Code Review (62-65), Security Testing (66-69), Vulnerability Management (70-73), Environment Hardening (74-77), Operational Enablement (78-81), Conducting Assessments (21-25), Creating Scorecards (26), Building Assurance Programs (27-31), VirtualWare (84-95)

(✦ read in full, ✧ skim; numbers are page references)


Understanding the Model

A view of the big picture


SAMM is built upon a collection of Security Practices that are tied back into the core Business Functions involved in software development. This section introduces those Business Functions and the corresponding Security Practices for each. After covering the high-level framework, the Maturity Levels for each Security Practice are also discussed briefly in order to paint a picture of how each can be iteratively improved over time.


Business Functions

At the highest level, SAMM defines four critical Business Functions. Each Business Function (listed below) is a category of activities related to the nuts-and-bolts of software development, or stated another way, any organization involved with software development must fulfill each of these Business Functions to some degree.

For each Business Function, SAMM defines three Security Practices. Each Security Practice (listed on the next page) is an area of security-related activities that build assurance for the related Business Function. So overall, there are twelve Security Practices that are the independent silos for improvement that map underneath the Business Functions of software development.

For each Security Practice, SAMM defines three Maturity Levels as Objectives. Each Level within a Security Practice is characterized by a successively more sophisticated Objective defined by specific activities and more stringent success metrics than the previous level. Additionally, each Security Practice can be improved independently, though related activities can lead to optimizations.

Governance
Governance is centered on the processes and activities related to how an organization manages overall software development activities. More specifically, this includes concerns that cross-cut groups involved in development as well as business processes that are established at the organization level. (...more on page 10)

Construction
Construction concerns the processes and activities related to how an organization defines goals and creates software within development projects. In general, this will include product management, requirements gathering, high-level architecture specification, detailed design, and implementation. (...more on page 12)

Verification
Verification is focused on the processes and activities related to how an organization checks and tests artifacts produced throughout software development. This typically includes quality assurance work such as testing, but it can also include other review and evaluation activities. (...more on page 14)

Deployment
Deployment entails the processes and activities related to how an organization manages release of software that has been created. This can involve shipping products to end users, deploying products to internal or external hosts, and normal operations of software in the runtime environment. (...more on page 16)


Security Practices

Governance

Strategy & Metrics involves the overall strategic direction of the software assurance program and instrumentation of processes and activities to collect metrics about an organization’s security posture.

Policy & Compliance involves setting up a security and compliance control and audit framework throughout an organization to achieve increased assurance in software under construction and in operation.

Education & Guidance involves increasing security knowledge amongst personnel in software development through training and guidance on security topics relevant to individual job functions.

Construction

Threat Assessment involves accurately identifying and characterizing potential attacks upon an organization’s software in order to better understand the risks and facilitate risk management.

Security Requirements involves promoting the inclusion of security-related requirements during the software development process in order to specify correct functionality from inception.

Secure Architecture involves bolstering the design process with activities to promote secure-by-default designs and control over technologies and frameworks upon which software is built.

Verification

Design Review involves inspection of the artifacts created from the design process to ensure provision of adequate security mechanisms and adherence to an organization’s expectations for security.

Code Review involves assessment of an organization’s source code to aid vulnerability discovery and related mitigation activities as well as establish a baseline for secure coding expectations.

Security Testing involves testing the organization’s software in its runtime environment in order to both discover vulnerabilities and establish a minimum standard for software releases.

Deployment

Vulnerability Management involves establishing consistent processes for managing internal and external vulnerability reports to limit exposure and gather data to enhance the security assurance program.

Environment Hardening involves implementing controls for the operating environment surrounding an organization’s software to bolster the security posture of applications that have been deployed.

Operational Enablement involves identifying and capturing security-relevant information needed by an operator to properly configure, deploy, and run an organization’s software.

Maturity Levels

Each of the twelve Security Practices has three defined Maturity Levels and an implicit starting point at zero. The details for each level differ between the Practices, but they generally represent:

0 - Implicit starting point representing the activities in the Practice being unfulfilled
1 - Initial understanding and ad hoc provision of the Security Practice
2 - Increase efficiency and/or effectiveness of the Security Practice
3 - Comprehensive mastery of the Security Practice at scale

Notation

Throughout this document, the following capitalized terms will be reserved words that refer to the SAMM components defined in this section. If these terms appear without capitalization, they should be interpreted based on their context:

✦ Business Function, also as Function
✦ Security Practice, also as Practice
✦ Maturity Level, also as Level or Objective
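As an illustration only (not part of the SAMM specification), the structure described above can be sketched in a few lines of Python; the names are taken directly from the model, while the code itself is a hypothetical representation.

    # Illustrative sketch of the SAMM hierarchy: four Business Functions, each with
    # three Security Practices, every Practice scored on Maturity Levels 0 through 3.
    BUSINESS_FUNCTIONS = {
        "Governance":   ["Strategy & Metrics", "Policy & Compliance", "Education & Guidance"],
        "Construction": ["Threat Assessment", "Security Requirements", "Secure Architecture"],
        "Verification": ["Design Review", "Code Review", "Security Testing"],
        "Deployment":   ["Vulnerability Management", "Environment Hardening", "Operational Enablement"],
    }

    MATURITY_LEVELS = {
        0: "Implicit starting point; activities in the Practice unfulfilled",
        1: "Initial understanding and ad hoc provision of the Security Practice",
        2: "Increased efficiency and/or effectiveness of the Security Practice",
        3: "Comprehensive mastery of the Security Practice at scale",
    }

    # Twelve Practices in total, each one improved independently over time.
    assert sum(len(practices) for practices in BUSINESS_FUNCTIONS.values()) == 12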



Governance
Description of Security Practices

Strategy & Metrics

The Strategy & Metrics (SM) Practice is focused on establishing the framework within an organization for a software security assurance program. This is the most fundamental step in defining security goals in a way that’s both measurable and aligned with the organization’s real business risk.

By starting with lightweight risk profiles, an organization grows into more advanced risk classification schemes for application and data assets over time. With additional insight on relative risk measures, an organization can tune its project-level security goals and develop granular roadmaps to make the security program more efficient.

At the more advanced levels within this Practice, an organization draws upon many data sources, both internal and external, to collect metrics and qualitative feedback on the security program. This allows fine tuning of cost outlay versus the realized benefit at the program level.

Education & Guidance

The Education & Guidance (EG) Practice is focused on arming personnel involved in the software life-cycle with knowledge and resources to design, develop, and deploy secure software. With improved access to information, project teams will be better able to proactively identify and mitigate the specific security risks that apply to their organization.

One major theme for improvement across the Objectives is providing training for employees, either through instructor-led sessions or computer-based modules. As an organization progresses, a broad base of training is built by starting with developers and moving to other roles throughout the organization, culminating with the addition of role-based certification to ensure comprehension of the material.

In addition to training, this Practice also requires pulling security-relevant information into guidelines that serve as reference information to staff. This builds a foundation for establishing a baseline expectation for security practices in your organization, and later allows for incremental improvement once usage of the guidelines has been adopted.

Policy & Compliance

The Policy & Compliance (PC) Practice is focused on understanding and meeting external legal and regulatory requirements while also driving internal security standards to ensure compliance in a way that’s aligned with the business purpose of the organization.

A driving theme for improvement within this Practice is focus on project-level audits that gather information about the organization’s behavior in order to check that expectations are being met. By introducing routine audits that start out lightweight and grow in depth over time, organizational change is achieved iteratively.

In a sophisticated form, provision of this Practice entails organization-wide understanding of both internal standards and external compliance drivers while also maintaining low-latency checkpoints with project teams to ensure no project is operating outside expectations without visibility.


Governance
Activities overview

Strategy & Metrics (...more on page 34)

Level 1 objective: Establish unified strategic roadmap for software security within the organization
Level 1 activities: A. Estimate overall business risk profile; B. Build and maintain assurance program roadmap

Level 2 objective: Measure relative value of data and software assets and choose risk tolerance
Level 2 activities: A. Classify data and applications based on business risk; B. Establish and measure per-classification security goals

Level 3 objective: Align security expenditure with relevant business indicators and asset value
Level 3 activities: A. Conduct periodic industry-wide cost comparisons; B. Collect metrics for historic security spend

Policy & Compliance (...more on page 38)

Level 1 objective: Understand relevant governance and compliance drivers to the organization
Level 1 activities: A. Identify and monitor external compliance drivers; B. Build and maintain compliance guidelines

Level 2 objective: Establish security and compliance baseline and understand per-project risks
Level 2 activities: A. Build policies and standards for security and compliance; B. Establish project audit practice

Level 3 objective: Require compliance and measure projects against organization-wide policies and standards
Level 3 activities: A. Create compliance gates for projects; B. Adopt solution for audit data collection

Education & Guidance (...more on page 42)

Level 1 objective: Offer development staff access to resources around the topics of secure programming and deployment
Level 1 activities: A. Conduct technical security awareness training; B. Build and maintain technical guidelines

Level 2 objective: Educate all personnel in the software life-cycle with role-specific guidance on secure development
Level 2 activities: A. Conduct role-specific application security training; B. Utilize security coaches to enhance project teams

Level 3 objective: Mandate comprehensive security training and certify personnel for baseline knowledge
Level 3 activities: A. Create formal application security support portal; B. Establish role-based examination/certification


Construction
Description of Security Practices

Threat Assessment

The Threat Assessment (TA) Practice is centered on identification and understanding of the project-level risks based on the functionality of the software being developed and characteristics of the runtime environment. From details about threats and likely attacks against each project, the organization as a whole operates more effectively through better decisions about prioritization of initiatives for security. Additionally, decisions for risk acceptance are more informed, therefore better aligned to the business.

By starting with simple threat models and building to more detailed methods of threat analysis and weighting, an organization improves over time. Ultimately, a sophisticated organization would maintain this information in a way that is tightly coupled to the compensating factors and pass-through risks from external entities. This provides greater breadth of understanding for potential downstream impacts from security issues while keeping a close watch on the organization’s current performance against known threats.

Secure Architecture

The Secure Architecture (SA) Practice is focused on proactive steps for an organization to design and build secure software by default. By enhancing the software design process with reusable services and components, the overall security risk from software development can be dramatically reduced.

Beginning from simple recommendations about software frameworks and explicit consideration of secure design principles, an organization evolves toward consistently using design patterns for security functionality. Also, activities encourage project teams toward increased utilization of centralized security services and infrastructure.

As an organization evolves over time, sophisticated provision of this Practice entails organizations building reference platforms to cover the generic types of software they build. These serve as frameworks upon which developers can build custom software with less risk of vulnerabilities.

Security Requirements

The Security Requirements (SR) Practice is focused on proactively specifying the expected behavior of software with respect to security. Through addition of analysis activities at the project level, security requirements are initially gathered based on the high-level business purpose of the software. As an organization advances, more advanced techniques such as access control specifications are used to discover new security requirements that may not have been initially obvious to development.

In a sophisticated form, provision of this Practice also entails pushing the security requirements of the organization into its relationships with suppliers and then auditing projects to ensure all are adhering to expectations with regard to specification of security requirements.


Construction
Activities overview

Threat Assessment (...more on page 46)

Level 1 objective: Identify and understand high-level threats to the organization and individual projects
Level 1 activities: A. Build and maintain application-specific threat models; B. Develop attacker profile from software architecture

Level 2 objective: Increase accuracy of threat assessment and improve granularity of per-project understanding
Level 2 activities: A. Build and maintain abuse-case models per project; B. Adopt a weighting system for measurement of threats

Level 3 objective: Concretely tie compensating controls to each threat against internal and third-party software
Level 3 activities: A. Explicitly evaluate risk from third-party components; B. Elaborate threat models with compensating controls

Security Requirements (...more on page 50)

Level 1 objective: Consider security explicitly during the software requirements process
Level 1 activities: A. Derive security requirements from business functionality; B. Evaluate security and compliance guidance for requirements

Level 2 objective: Increase granularity of security requirements derived from business logic and known risks
Level 2 activities: A. Build an access control matrix for resources and capabilities; B. Specify security requirements based on known risks

Level 3 objective: Mandate security requirements process for all software projects and third-party dependencies
Level 3 activities: A. Build security requirements into supplier agreements; B. Expand audit program for security requirements

Secure Architecture (...more on page 54)

Level 1 objective: Insert consideration of proactive security guidance into the software design process
Level 1 activities: A. Maintain list of recommended software frameworks; B. Explicitly apply security principles to design

Level 2 objective: Direct the software design process toward known-secure services and secure-by-default designs
Level 2 activities: A. Identify and promote security services and infrastructure; B. Identify security design patterns from architecture

Level 3 objective: Formally control the software design process and validate utilization of secure components
Level 3 activities: A. Establish formal reference architectures and platforms; B. Validate usage of frameworks, patterns, and platforms


Verification
Description of Security Practices

Design Review

The Design Review (DR) Practice is focused on assessment of software design and architecture for security-related problems. This allows an organization to detect architecture-level issues early in software development and thereby avoid potentially large costs from refactoring later due to security concerns.

Beginning with lightweight activities to build understanding of the security-relevant details about an architecture, an organization evolves toward more formal inspection methods that verify completeness in provision of security mechanisms. At the organization level, design review services are built and offered to stakeholders.

In a sophisticated form, provision of this Practice involves detailed, data-level inspection of designs and enforcement of baseline expectations for conducting design assessments and reviewing findings before releases are accepted.

Code Review

The Code Review (CR) Practice is focused on inspection of software at the source code level in order to find security vulnerabilities. Code-level vulnerabilities are generally simple to understand conceptually, but even informed developers can easily make mistakes that leave software open to potential compromise.

To begin, an organization uses lightweight checklists and, for efficiency, only inspects the most critical software modules. However, as an organization evolves, it uses automation technology to dramatically improve coverage and efficacy of code review activities.

Sophisticated provision of this Practice involves deeper integration of code review into the development process to enable project teams to find problems earlier. This also enables organizations to better audit and set expectations for code review findings before releases can be made.

Security Testing

The Security Testing (ST) Practice is focused on inspection of software in the runtime environment in order to find security problems. These testing activities bolster the assurance case for software by checking it in the same context in which it is expected to run, thus making visible operational misconfigurations or errors in business logic that are difficult to otherwise find.

Starting with penetration testing and high-level test cases based on the functionality of software, an organization evolves toward usage of security testing automation to cover the wide variety of test cases that might demonstrate a vulnerability in the system.

In an advanced form, provision of this Practice involves customization of testing automation to build a battery of security tests covering application-specific concerns in detail. With additional visibility at the organization level, security testing enables organizations to set minimum expectations for security testing results before a project release is accepted.


Verification
Activities overview

Design Review (...more on page 58)

Level 1 objective: Support ad hoc reviews of software design to ensure baseline mitigations for known risks
Level 1 activities: A. Identify software attack surface; B. Analyze design against known security requirements

Level 2 objective: Offer assessment services to review software design against comprehensive best practices for security
Level 2 activities: A. Inspect for complete provision of security mechanisms; B. Deploy design review service for project teams

Level 3 objective: Require assessments and validate artifacts to develop detailed understanding of protection mechanisms
Level 3 activities: A. Develop data-flow diagrams for sensitive resources; B. Establish release gates for design review

Code Review (...more on page 62)

Level 1 objective: Opportunistically find basic code-level vulnerabilities and other high-risk security issues
Level 1 activities: A. Create review checklists from known security requirements; B. Perform point-review of high-risk code

Level 2 objective: Make code review during development more accurate and efficient through automation
Level 2 activities: A. Utilize automated code analysis tools; B. Integrate code analysis into development process

Level 3 objective: Mandate comprehensive code review process to discover language-level and application-specific risks
Level 3 activities: A. Customize code analysis for application-specific concerns; B. Establish release gates for code review

Security Testing (...more on page 66)

Level 1 objective: Establish process to perform basic security tests based on implementation and software requirements
Level 1 activities: A. Derive test cases from known security requirements; B. Conduct penetration testing on software releases

Level 2 objective: Make security testing during development more complete and efficient through automation
Level 2 activities: A. Utilize automated security testing tools; B. Integrate security testing into development process

Level 3 objective: Require application-specific security testing to ensure baseline security before deployment
Level 3 activities: A. Employ application-specific security testing automation; B. Establish release gates for security testing


Deployment
Description of Security Practices

Vulnerability Management

The Vulnerability Management (VM) Practice is focused on the processes within an organization with respect to handling vulnerability reports and operational incidents. By having these processes in place, an organization’s projects will have consistent expectations and increased efficiency for handling these events, rather than chaotic and uninformed responses.

Starting from lightweight assignment of roles in the event of an incident, an organization grows into a more formal incident response process that ensures visibility and tracking on issues that occur. Communications are also improved to enhance overall understanding of the processes.

In an advanced form, vulnerability management involves thorough dissecting of incidents and vulnerability reports to collect detailed metrics and other root-cause information to feed back into the organization’s downstream behavior.

Environment Hardening

The Environment Hardening (EH) Practice is focused on building assurance for the runtime environment that hosts the organization’s software. Since secure operation of an application can be deteriorated by problems in external components, hardening this underlying infrastructure directly improves the overall security posture of the software.

By starting with simple tracking and distributing of information about the operating environment to keep development teams better informed, an organization evolves to scalable methods for managing deployment of security patches and instrumenting the operating environment with early-warning detectors for potential security issues before damage is done.

As an organization advances, the operating environment is further reviewed and hardened by deployment of protection tools to add layers of defenses and safety nets to limit damage in case any vulnerabilities are exploited.

Operational Enablement

The Operational Enablement (OE) Practice is focused on gathering security critical information from the project teams building software and communicating it to the users and operators of the software. Without this information, even the most securely designed software carries undue risks since important security characteristics and choices will not be known at a deployment site.

Starting from lightweight documentation to capture the most impactful details for users and operators, an organization evolves toward building complete operational security guides that are delivered with each release.

In an advanced form, operational enablement also entails organization-level checks against individual project teams to ensure that information is being captured and shared according to expectations.


Deployment
Activities overview

Vulnerability Management (...more on page 70)

Level 1 objective: Understand high-level plan for responding to vulnerability reports or incidents
Level 1 activities: A. Identify point of contact for security issues; B. Create informal security response team(s)

Level 2 objective: Elaborate expectations for response process to improve consistency and communications
Level 2 activities: A. Establish consistent incident response process; B. Adopt a security issue disclosure process

Level 3 objective: Improve analysis and data gathering within response process for feedback into proactive planning
Level 3 activities: A. Conduct root cause analysis for incidents; B. Collect per-incident metrics

Environment Hardening (...more on page 74)

Level 1 objective: Understand baseline operational environment for applications and software components
Level 1 activities: A. Maintain operational environment specification; B. Identify and install critical security upgrades and patches

Level 2 objective: Improve confidence in application operations by hardening the operating environment
Level 2 activities: A. Establish routine patch management process; B. Monitor baseline environment configuration status

Level 3 objective: Validate application health and status of operational environment against known best practices
Level 3 activities: A. Identify and deploy relevant operations protection tools; B. Expand audit program for environment configuration

Operational Enablement (...more on page 78)

Level 1 objective: Enable communications between development teams and operators for critical security-relevant data
Level 1 activities: A. Capture critical security information for deployment; B. Document procedures for typical application alerts

Level 2 objective: Improve expectations for continuous secure operations through provision of detailed procedures
Level 2 activities: A. Create per-release change management procedures; B. Maintain formal operational security guides

Level 3 objective: Mandate communication of security information and validate artifacts for completeness
Level 3 activities: A. Expand audit program for operational information; B. Perform code signing for application components


Applying the Model

Putting it all to work


This section covers several important and useful applications of SAMM. Given the core design of the model itself, an organization can use SAMM as a benchmark to measure its security assurance program and create a scorecard. Using scorecards, an organization can demonstrate improvement through iterations of developing an assurance program. And most importantly, an organization can use SAMM roadmap templates to guide the build-out or improvement of a security assurance initiative.


Using the Maturity Levels

Each of the twelve Security Practices has three Maturity Levels. Each Level has several components that specify the critical factors for understanding and achieving the stated Level. Beyond that, these prescriptive details make it possible to use the definitions of the Security Practices even outside the context of using SAMM to build a software assurance program.

Objective
The Objective is a general statement that captures the assurance goal of attaining the associated Level. As the Levels increase for a given Practice, the Objectives characterize more sophisticated goals in terms of building assurance for software development and deployment.

Activities
The Activities are core requisites for attaining the Level. Some are meant to be performed organization-wide and some correspond to actions for individual project teams. In either case, the Activities capture the core security function, and organizations are free to determine how they fulfill the Activities.

Results
The Results characterize capabilities and deliverables obtained by achieving the given Level. In some cases these are specified concretely and in others, a more qualitative statement is made about increased capability.

Success Metrics
The Success Metrics specify example measurements that can be used to check if an organization is performing at the given Level. Data collection and management is left to the choice of each organization, but recommended data sources and thresholds are provided.

Costs
The Costs are qualitative statements about the expenses incurred by an organization attaining the given Level. While specific values will vary for each organization, these are meant to provide an idea of the one-time and ongoing costs associated with operating at a particular Level.

Personnel
These properties of a Level indicate the estimated ongoing overhead in terms of human resources for operating at the given Level.

✦ Developers - Individuals performing detailed design and implementation of the software
✦ Architects - Individuals performing high-level design work and large scale system engineering
✦ Managers - Individuals performing day-to-day management of development staff
✦ QA Testers - Individuals performing quality assurance testing and pre-release verification of software
✦ Security Auditors - Individuals with technical security knowledge related to software being produced
✦ Business Owners - Individuals performing key decision making on software and its business requirements
✦ Support Operations - Individuals performing customer support or direct technical operations support

Related Levels
The Related Levels are references to Levels within other Practices that have some potential overlaps depending upon the organization’s structure and progress in building an assurance program. Functionally, these indicate synergies or optimizations in Activity implementation if the Related Level is also a goal or already in place.
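As a reading aid only, the components above can be pictured as the fields of a per-Level record. The sketch below is hypothetical Python (the field names mirror this page; the example values for results, success metrics, and costs are invented for illustration, not quoted from SAMM).

    # Hypothetical record for one Maturity Level of a Security Practice.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class MaturityLevel:
        practice: str                # e.g. "Education & Guidance"
        level: int                   # 1, 2, or 3
        objective: str               # assurance goal for attaining this Level
        activities: List[str]        # core requisites (Activities A and B)
        results: List[str]           # capabilities and deliverables obtained
        success_metrics: List[str]   # example measurements
        costs: List[str]             # one-time and ongoing expense statements
        personnel: List[str]         # roles carrying the ongoing overhead
        related_levels: List[str] = field(default_factory=list)  # synergies in other Practices

    eg1 = MaturityLevel(
        practice="Education & Guidance",
        level=1,
        objective="Offer development staff access to resources around the topics of "
                  "secure programming and deployment",
        activities=["Conduct technical security awareness training",
                    "Build and maintain technical guidelines"],
        results=["Increased developer awareness (invented example)"],
        success_metrics=["Percentage of staff trained recently (invented example)"],
        costs=["Training course build-out or license (invented example)"],
        personnel=["Developers", "Managers"],
    )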


Conducting Assessments

By measuring an organization against the defined Security Practices, an overall picture of built-in security assurance activities is created. This type of assessment is useful for understanding the breadth of security activities currently in place at an organization. Further, it enables that organization to then utilize SAMM to create a future roadmap for iterative improvement.

The process of conducting an assessment is simply evaluating an organization to determine the Maturity Level at which it is performing. The extent to which an organization’s performance is checked will usually vary according to the drivers behind the assessment, but in general, there are two recommended styles:

✦ Lightweight - The assessment worksheets for each Practice are evaluated and scores are assigned based on answers. This type of assessment is usually sufficient for an organization that is trying to map their existing assurance program into SAMM and just wants to get a quick picture of where they stand.
✦ Detailed - After completion of the assessment worksheets, additional audit work is performed to check the organization to ensure the Activities prescribed by each Practice are in place. Additionally, since each Practice also specifies Success Metrics, that data should be collected to ensure that the organization is performing as expected.

Scoring an organization using the assessment worksheets is straightforward. After answering the questions, evaluate the answer column to determine the Level. It is indicated by affirmative answers on all questions above the markers to the right of the answer column.

Existing assurance programs might not always consist of activities that neatly fall on a boundary between Maturity Levels, e.g. an organization that assesses to a Level 1 for a given Practice might also have additional activities in place but not such that Level 2 is completed. For such cases, the organization’s score should be annotated with a “+” symbol to indicate there are additional assurances in place beyond those indicated by the Level obtained. For example, an organization that is performing all Level 1 Activities for Operational Enablement as well as one Level 2 or 3 Activity would be assigned a “1+” score. Likewise, an organization performing all Activities for a Security Practice, including some beyond the scope of SAMM, would be given a “3+” score.

[Figure: Assessment flow. Start by completing the assessment worksheets. For a lightweight assessment, assign a score per Practice and finish. For a detailed assessment, audit for performed Activities, check Success Metrics, and then adjust the score per Practice. Possible assessment scores: 0, 0+, 1, 1+, 2, 2+, 3, 3+.]
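The level-and-“+” scoring rule described above can be expressed as a short function. The sketch below is illustrative Python, not an official SAMM tool; it assumes the yes/no answers for one Practice have already been grouped by the Maturity Level markers on the worksheet (the “3+” case for activities beyond the scope of SAMM is not captured here, since a worksheet alone cannot record it).

    # Illustrative scoring of one Security Practice from grouped worksheet answers.
    # `answers` maps Maturity Level (1-3) to the yes/no answers for that Level's questions.
    from typing import Dict, List

    def score_practice(answers: Dict[int, List[bool]]) -> str:
        attained = 0
        for level in (1, 2, 3):
            level_answers = answers.get(level, [])
            # A Level is attained only when all of its questions (and those below) are "yes".
            if level_answers and all(level_answers):
                attained = level
            else:
                break
        # Annotate with "+" when some higher-Level activities are also in place.
        beyond = any(any(answers.get(level, [])) for level in range(attained + 1, 4))
        return f"{attained}{'+' if beyond else ''}"

    # All Level 1 questions "yes" plus one Level 2 activity yields "1+".
    print(score_practice({1: [True, True], 2: [True, False], 3: [False, False]}))  # prints: 1+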


Governance
Assessment worksheet

Strategy & Metrics (answer yes/no)
✦ Is there a software security assurance program already in place?
✦ Do most of the business stakeholders understand your organization’s risk profile?
✦ Is most of your development staff aware of future plans for the assurance program?
✦ Are most of your applications and resources categorized by risk?
✦ Are risk ratings used to tailor the required assurance activities?
✦ Does most of the organization know about what’s required based on risk ratings?
✦ Is per-project data for cost of assurance activities collected?
✦ Does your organization regularly compare your security spend with other organizations?

Policy & Compliance (answer yes/no)
✦ Do most project stakeholders know their project’s compliance status?
✦ Are compliance requirements specifically considered by project teams?
✦ Does the organization utilize a set of policies and standards to control software development?
✦ Are project teams able to request an audit for compliance with policies and standards?
✦ Are projects periodically audited to ensure a baseline of compliance with policies and standards?
✦ Does the organization systematically use audits to collect and control compliance evidence?

Education & Guidance (answer yes/no)
✦ Have most developers been given high-level security awareness training?
✦ Does each project team have access to secure development best practices and guidance?
✦ Are most roles in the development process given role-specific training and guidance?
✦ Are most stakeholders able to pull in security coaches for use on projects?
✦ Is security-related guidance centrally controlled and consistently distributed throughout the organization?
✦ Are most people tested to ensure a baseline skill-set for secure development practices?


Construction
Assessment worksheet

Threat Assessment (answer yes/no)
✦ Do most projects in your organization consider and document likely threats?
✦ Does your organization understand and document the types of attackers it faces?
✦ Do project teams regularly analyze functional requirements for likely abuses?
✦ Do project teams use a method of rating threats for relative comparison?
✦ Are stakeholders aware of relevant threats and ratings?
✦ Do project teams specifically consider risk from external software?
✦ Are all protection mechanisms and controls captured and mapped back to threats?

Security Requirements (answer yes/no)
✦ Do most project teams specify some security requirements during development?
✦ Do project teams pull requirements from best-practices and compliance guidance?
✦ Are most stakeholders reviewing access control matrices for relevant projects?
✦ Are project teams specifying requirements based on feedback from other security activities?
✦ Are most stakeholders reviewing vendor agreements for security requirements?
✦ Are the security requirements specified by project teams being audited?

Secure Architecture (answer yes/no)
✦ Are project teams provided with a list of recommended third-party components?
✦ Are most project teams aware of secure design principles and applying them?
✦ Do you advertise shared security services with guidance for project teams?
✦ Are project teams provided with prescriptive design patterns based on their application architecture?
✦ Are project teams building software from centrally controlled platforms and frameworks?
✦ Are project teams being audited for usage of secure architecture components?


Verification
Assessment worksheet

Design Review (answer yes/no)
✦ Do project teams document the attack perimeter of software designs?
✦ Do project teams check software designs against known security risks?
✦ Do most project teams specifically analyze design elements for security mechanisms?
✦ Are most project stakeholders aware of how to obtain a formal design review?
✦ Does the design review process incorporate detailed data-level analysis?
✦ Does routine project audit require a baseline for design review results?

Code Review (answer yes/no)
✦ Do most project teams have review checklists based on common problems?
✦ Are project teams generally performing review of selected high-risk code?
✦ Can most project teams access automated code analysis tools to find security problems?
✦ Do most stakeholders consistently require and review results from code reviews?
✦ Do project teams utilize automation to check code against application-specific coding standards?
✦ Does routine project audit require a baseline for code review results prior to release?

Security Testing (answer yes/no)
✦ Are projects specifying some security tests based on requirements?
✦ Do most projects perform penetration tests prior to release?
✦ Are most stakeholders aware of the security test status prior to release?
✦ Are projects using automation to evaluate security test cases?
✦ Do most projects follow a consistent process to evaluate and report on security tests to stakeholders?
✦ Are security test cases comprehensively generated for application-specific logic?
✦ Do routine project audits demand minimum standard results from security testing?


Deployment
Assessment worksheet

Vulnerability Management (answer yes/no)
✦ Do most projects have a point of contact for security issues?
✦ Does your organization have an assigned security response team?
✦ Are most project teams aware of their security point(s) of contact and response team(s)?
✦ Does the organization utilize a consistent process for incident reporting and handling?
✦ Are most project stakeholders aware of relevant security disclosures related to their software projects?
✦ Are most incidents inspected for root causes to generate further recommendations?
✦ Do most projects consistently collect and report data and metrics related to incidents?

Environment Hardening (answer yes/no)
✦ Do the majority of projects document some requirements for the operational environment?
✦ Do most projects check for security updates to third-party software components?
✦ Is a consistent process used to apply upgrades and patches to critical dependencies?
✦ Do most projects leverage automation to check application and environment health?
✦ Are stakeholders aware of options for additional tools to protect software while running in operations?
✦ Does routine audit check most projects for baseline environment health?

Operational Enablement (answer yes/no)
✦ Do you deliver security notes with the majority of software releases?
✦ Are security-related alerts and error conditions documented for most projects?
✦ Are most projects utilizing a change management process that’s well understood?
✦ Do project teams deliver an operational security guide with each product release?
✦ Are most projects being audited to check each release for appropriate operational security information?
✦ Is code signing routinely performed on software components using a consistent process?


Creating Scorecards

[Figure: Example scorecard showing before and after scores for each of the twelve Security Practices over a one-year interval.]

Based on the scores assigned to each Security Practice, an organization can create a scorecard to capture those values. Functionally, a scorecard can be the simple set of 12 scores for a particular time. However, selecting a time interval over which to generate a scorecard facilitates understanding of overall changes in the assurance program during the time frame.

Using interval scorecards is encouraged for several situations:

✦ Gap analysis - Capturing scores from detailed assessments versus expected performance levels
✦ Demonstrating improvement - Capturing scores from before and after an iteration of assurance program build-out
✦ Ongoing measurement - Capturing scores over consistent time frames for an assurance program that is already in place

The figure above shows an example scorecard for how an organization’s assurance program changed over the course of one year. If that organization had also saved the data about where they were planning on being at the end of the year, that would be another interesting data set to plot since it would help show the extent to which the plans had to change over the year.
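As a small illustration of interval scorecards, the sketch below (Python, with made-up scores that are not drawn from the example figure) records two scorecards and prints the per-Practice change between them.

    # Illustrative before/after scorecard comparison for the twelve Security Practices.
    PRACTICES = [
        "Strategy & Metrics", "Policy & Compliance", "Education & Guidance",
        "Threat Assessment", "Security Requirements", "Secure Architecture",
        "Design Review", "Code Review", "Security Testing",
        "Vulnerability Management", "Environment Hardening", "Operational Enablement",
    ]

    # Scores are strings as produced by an assessment ("0", "0+", "1", ... "3"); made-up values.
    before = dict.fromkeys(PRACTICES, "1")
    after = dict(before, **{"Code Review": "2", "Security Testing": "1+"})

    for practice in PRACTICES:
        if before[practice] != after[practice]:
            print(f"{practice}: {before[practice]} -> {after[practice]}")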


Building Assurance Programs

[Figure: Example roadmap showing phased Maturity Level improvements across the twelve Security Practices (Phase 1 through Phase 4).]

[Figure: Roadmap-building flow. Conduct an initial assessment; either select an appropriate existing roadmap template or create an empty roadmap, selecting Practices to improve and marking the selected improvements for each phase; then adjust the roadmap to the organization.]

One of the main uses of SAMM is to help organizations build software security assurance programs. That process is straightforward, and generally begins with an assessment if the organization is already performing some security assurance activities.

Several roadmap templates for common types of organizations are provided. Thus, many organizations can choose an appropriate match and then tailor the roadmap template to their needs. For other types of organizations, it may be necessary to build a custom roadmap.

Roadmaps (pictured above) consist of phases, shown as the vertical bars in the example, in which several Practices are each improved by one Level. Therefore, building a roadmap entails selection of which Practices to improve in each planned phase. Organizations are free to plan into the future as far as they wish, but are encouraged to iterate based on business drivers and organization-specific information to ensure the assurance goals are commensurate with their business goals and risk tolerance.

After a roadmap is established, the build-out of an assurance program is simple. An organization begins an improvement phase and works to achieve the stated Levels by performing the prescribed Activities. At the end of the phase, the roadmap should be adjusted based on what was actually accomplished, and then the next phase can begin.
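To illustrate the phased build-out described above, the following sketch (Python, with a hypothetical two-phase roadmap) raises each Practice selected for a phase by one Maturity Level, capped at Level 3.

    # Illustrative roadmap build-out: each phase improves the selected Practices by one Level.
    from typing import Dict, List

    def apply_phase(scorecard: Dict[str, int], phase: List[str]) -> Dict[str, int]:
        updated = dict(scorecard)
        for practice in phase:
            updated[practice] = min(updated[practice] + 1, 3)
        return updated

    # Hypothetical starting scores and phases for a small roadmap.
    scorecard = {"Code Review": 0, "Security Testing": 0,
                 "Security Requirements": 0, "Vulnerability Management": 0}
    roadmap = [
        ["Code Review", "Security Testing"],                    # Phase 1 (hypothetical)
        ["Security Requirements", "Vulnerability Management"],  # Phase 2 (hypothetical)
    ]

    for phase in roadmap:
        scorecard = apply_phase(scorecard, phase)
        print(scorecard)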


Independent Software Vendor
Roadmap template

[Figure: Independent Software Vendor roadmap template showing phased Maturity Level improvements across the twelve Security Practices (Phase 1 through Phase 4).]

Rationale

An Independent Software Vendor involves the core business function of building and selling software components and applications.

Initial drivers to limit common vulnerabilities affecting customers and users lead to early concentration on Code Review and Security Testing activities.

Shifting toward more proactive prevention of security errors in product specification, an organization adds activities for Security Requirements over time.

Also, to minimize the impact from any discovered security issues, the organization ramps up Vulnerability Management activities over time.

As the organization matures, knowledge transfer activities from Operational Enablement are added to better inform customers and users about secure operation of the software.

Additional Considerations

Outsourced DevelopmentFor organizations using external development resources, restric-tions on code access typically leads to prioritization of Security Requirements activities instead of Code Review activities. Addi-tionally, advancing Threat Assessment in earlier phases would allow the organization to better clarify security needs to the outsourced developers. Since expertise on software configuration will gener-ally be strongest within the outsourced group, contracts should be constructed to account for the activities related to Operational Enablement.

Internet-Connected Applications
Organizations building applications that use online resources face additional risk from the internet-facing infrastructure that hosts those systems. To account for this risk, organizations should add activities from Environment Hardening to their roadmaps.

Drivers and Embedded Development
For organizations building low-level drivers or software for embedded systems, security vulnerabilities in software design can be more damaging and costly to repair. Therefore, roadmaps should be modified to emphasize Secure Architecture and Design Review activities in earlier phases.

Organizations Grown by Acquisition
In an organization grown by acquisition, there can often be several project teams following different development models with varying degrees of security-related activities incorporated. An organization such as this may require a separate roadmap for each division or project team to account for varying starting points as well as project-specific concerns if a variety of software types are being developed.


Online Service Provider: Roadmap template

[Roadmap chart: target Levels for the twelve Security Practices across Phases 1-5]

Rationale

An Online Service Provider’s core business function is building web applications and other network-accessible interfaces.

Initial drivers to validate the overall soundness of design without stifling innovation lead to early concentration on Design Review and Security Testing activities.

Since critical systems will be network-facing, Environment Hardening activities are also added early and ramped up over time to account for risks from the hosting environment.

Though it can vary based on the core business of the organization, Policy & Compliance activities should be started early and then advanced according to the criticality of external compliance drivers.

As the organization matures, activities from Threat Assessment, Security Requirements, and Secure Architecture are slowly added to help bolster proactive security after some baseline expectations for security have been established.

Additional Considerations

Outsourced Development
For organizations using external development resources, restrictions on code access typically lead to prioritization of Security Requirements activities instead of Code Review activities. Additionally, advancing Threat Assessment in earlier phases would allow the organization to better clarify security needs to the outsourced developers. Since expertise on software configuration will generally be strongest within the outsourced group, contracts should be constructed to account for the activities related to Operational Enablement.

Online Payment Processing
Organizations required to be in compliance with the Payment Card Industry Data Security Standard (PCI-DSS) or other online payment standards should place activities from Policy & Compliance in earlier phases of the roadmap. This allows the organization to opportunistically establish activities that ensure compliance and enable the future roadmap to be tailored accordingly.

Web Services Platforms
For organizations building web services platforms, design errors can carry additional risks and be more costly to mitigate. Therefore, activities from Threat Assessment, Security Requirements, and Secure Architecture should be placed in earlier phases of the roadmap.

Organizations Grown by Acquisition
In an organization grown by acquisition, there can often be several project teams following different development models with varying degrees of security-related activities incorporated. An organization such as this may require a separate roadmap for each division or project team to account for varying starting points as well as project-specific concerns if a variety of software types are being developed.



Financial Services Organization: Roadmap template

[Roadmap chart: target Levels for the twelve Security Practices across Phases 1-5]

Rationale

A Financial Services Organization’s core business function is building systems to support financial transactions and processing. In general, this implies a greater concentration of internal and back-end systems that interface with disparate external data providers.

Initially, effort is focused on improving the Practices related to Governance since these are critical services that set the baseline for the assurance program and help meet compliance requirements for the organization.

Since building secure and reliable software proactively is an overall goal, Practices within Construction are started early on and ramped up sharply as the program matures.

Verification activities are also ramped up smoothly over the course of the roadmap to handle legacy systems without creating unrealistic expectations. Additionally, this helps ensure enough cycles are spent building out more proactive Practices.

Since a financial services organization often operates the software it builds, focus is given to the Practices within Deployment during the middle of the roadmap, after some initial Governance is in place but before heavy focus is given to the proactive Construction Practices.

Additional Considerations

Outsourced Development
For organizations using external development resources, restrictions on code access typically lead to prioritization of Security Requirements activities instead of Code Review activities. Additionally, advancing Threat Assessment in earlier phases would allow the organization to better clarify security needs to the outsourced developers. Since expertise on software configuration will generally be strongest within the outsourced group, contracts should be constructed to account for the activities related to Operational Enablement.

Web Services Platforms
For organizations building web services platforms, design errors can carry additional risks and be more costly to mitigate. Therefore, activities from Threat Assessment, Security Requirements, and Secure Architecture should be placed in earlier phases of the roadmap.

Organizations Grown by Acquisition
In an organization grown by acquisition, there can often be several project teams following different development models with varying degrees of security-related activities incorporated. An organization such as this may require a separate roadmap for each division or project team to account for varying starting points as well as project-specific concerns if a variety of software types are being developed.


Government Organization: Roadmap template

[Roadmap chart: target Levels for the twelve Security Practices across Phases 1-6]

Rationale

A Government Organization is a state-affiliated organization whose core business function is building software to support public sector projects.

Initially, Governance Practices are established, generally to understand the overall compliance burden for the organization in the context of the concrete roadmap for improvement.

Because of the risks of public exposure and the quantity of legacy code generally in place, early emphasis is given to Security Testing within the Verification Practices, and the more involved Code Review and Design Review Practices are developed later.

Similar emphasis is placed on the Construction and Deployment Practices. This helps establish the organization’s management of vulnerabilities and moves toward bolstering the security posture of the operating environment. At the same time, proactive security activities under Construction are built up to help prevent new issues in software under development.

Additional Considerations

Outsourced Development
For organizations using external development resources, restrictions on code access typically lead to prioritization of Security Requirements activities instead of Code Review activities. Additionally, advancing Threat Assessment in earlier phases would allow the organization to better clarify security needs to the outsourced developers. Since expertise on software configuration will generally be strongest within the outsourced group, contracts should be constructed to account for the activities related to Operational Enablement.

Web Services Platforms
For organizations building web services platforms, design errors can carry additional risks and be more costly to mitigate. Therefore, activities from Threat Assessment, Security Requirements, and Secure Architecture should be placed in earlier phases of the roadmap.

Regulatory Compliance
For organizations under heavy regulations that affect business processes, the build-out of the Policy & Compliance Practice should be adjusted to accommodate external drivers. Likewise, organizations under a lighter compliance load should take the opportunity to push back build-out of that Practice in favor of others.



The Security Practices

An explanation of the details


This section defines the building blocks of SAMM, the Maturity Levels under each Security Practice. For each Practice, the three Levels are covered in a summary table. Following that, the description for each Level includes detailed explanations of the required activities, results an organization can expect from attaining the Level, success metrics to gauge performance, required ongoing personnel investment, and additional associated costs.


Strategy & Metrics (Levels SM1, SM2, SM3)

Objective
SM1: Establish unified strategic roadmap for software security within the organization
SM2: Measure relative value of data and software assets and choose risk tolerance
SM3: Align security expenditure with relevant business indicators and asset value

Activities
SM1: A. Estimate overall business risk profile; B. Build and maintain assurance program roadmap
SM2: A. Classify data and applications based on business risk; B. Establish and measure per-classification security goals
SM3: A. Conduct periodic industry-wide cost comparisons; B. Collect metrics for historic security spend

Assessment
SM1:
✦ Is there a software security assurance program already in place?
✦ Do most of the business stakeholders understand your organization’s risk profile?
✦ Is most of your development staff aware of future plans for the assurance program?
SM2:
✦ Are most of your applications and resources categorized by risk?
✦ Are risk ratings used to tailor the required assurance activities?
✦ Does most of the organization know about what’s required based on risk ratings?
SM3:
✦ Is per-project data for cost of assurance activities collected?
✦ Does your organization regularly compare your security spend with other organizations?

Results
SM1:
✦ Concrete list of the most critical business-level risks caused by software
✦ Tailored roadmap that addresses the security needs for your organization with minimal overhead
✦ Organization-wide understanding of how the assurance program will grow over time
SM2:
✦ Customized assurance plans per project based on core value to the business
✦ Organization-wide understanding of security-relevance of data and application assets
✦ Better informed stakeholders with respect to understanding and accepting risks
SM3:
✦ Information to make informed case-by-case decisions on security expenditures
✦ Estimates of past loss due to security issues
✦ Per-project consideration of security expense versus loss potential
✦ Industry-wide due diligence with regard to security


Strategy & Metrics: Level 1 (SM1)

Objective: Establish unified strategic roadmap for software security within the organization

Activities

A. Estimate overall business risk profile
Interview business owners and stakeholders and create a list of worst-case scenarios across the organization’s various application and data assets. Based on the way in which your organization builds, uses, or sells software, the list of worst-case scenarios can vary widely, but common issues include data theft or corruption, service outages, monetary loss, reverse engineering, account compromise, etc.

After broadly capturing worst-case scenario ideas, collate and select the most important based on collected information and knowledge about the core business. Any number can be selected, but aim for at least 3 and no more than 7 to make efficient use of time and keep the exercise focused.

Elaborate a description of each of the selected items and document details of contributing worst-case scenarios, potential contributing factors, and potential mitigating factors for the organization.

The final business risk profile should be reviewed with business owners and other stakeholders for understanding.

B. Build and maintain assurance program roadmap
With an understanding of the main business risks to the organization, evaluate the current performance of the organization against each of the twelve Practices. Assign each Practice a score of 1, 2, or 3 corresponding to the highest Objective for which the organization passes all the cumulative success metrics. If no success metrics are being met, assign the Practice a score of 0.

Once a good understanding of current status is obtained, the next goal is to identify the Practices that will be improved in the next iteration. Select them based on business risk profile, other business drivers, compliance requirements, budget tolerance, etc. Once Practices are selected, the goals of the iteration are to achieve the next Objective under each.

Iterations of improvement on the assurance program should be approximately 3-6 months long, but an assurance strategy session should take place at least every 3 months to review progress on activities, performance against success metrics, and other business drivers that may require program changes.
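As a hedged illustration of the scoring step above, the sketch below assigns each Practice a score of 0 to 3 by taking the highest Level whose cumulative success metrics are all met. The metric names and answers are invented placeholders, not the model’s official metrics.

```python
# Illustrative scoring sketch: a Practice scores N if the success metrics
# for Levels 1..N are all met (cumulative); otherwise it scores lower.
# The metric checklists below are hypothetical placeholders.

assessment = {
    "Strategy & Metrics": {
        1: {"stakeholders briefed on risk profile": True,
            "roadmap strategy session held": True},
        2: {"applications classified by risk": False},
        3: {"industry cost comparison done": False},
    },
    "Code Review": {
        1: {"review checklists in use": True},
        2: {"code review during development": True},
        3: {"centralized reporting of review results": False},
    },
}

def practice_score(levels):
    """Return the highest Level whose metrics, and all lower ones, are met."""
    score = 0
    for level in sorted(levels):
        if all(levels[level].values()):
            score = level
        else:
            break
    return score

for practice, levels in assessment.items():
    print(f"{practice}: score {practice_score(levels)}")
```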

Results
✦ Concrete list of the most critical business-level risks caused by software
✦ Tailored roadmap that addresses the security needs for your organization with minimal overhead
✦ Organization-wide understanding of how the assurance program will grow over time

Success Metrics
✦ >80% of stakeholders briefed on business risk profile in past 6 months
✦ >80% of staff briefed on assurance program roadmap in past 3 months
✦ >1 assurance program strategy session in past 3 months

Costs
✦ Buildout and maintenance of business risk profile
✦ Quarterly evaluation of assurance program

Personnel
✦ Developers (1 day/yr)
✦ Architects (4 days/yr)
✦ Managers (4 days/yr)
✦ Business Owners (4 days/yr)
✦ QA Testers (1 day/yr)
✦ Security Auditor (4 days/yr)

Related Levels
✦ Policy & Compliance - 1
✦ Threat Assessment - 1
✦ Security Requirements - 2


Strategy & Metrics: Level 2 (SM2)

Objective: Measure relative value of data and software assets and choose risk tolerance

Activities

A. Classify data and applications based on business risk
Establish a simple classification system to represent risk-tiers for applications. In its simplest form, this can be a High/Medium/Low categorization. More sophisticated classifications can be used, but there should be no more than seven categories and they should roughly represent a gradient from high to low impact against business risks.

Working from the organization’s business risk profile, create project evaluation criteria that map each project to one of the risk categories. A similar but separate classification scheme should be created for data assets, and each item should be weighted and categorized based on potential impact to business risks.

Evaluate collected information about each application and assign each a risk category based upon overall evaluation criteria and the risk categories of data assets in use. This can be done centrally by a security group or by individual project teams through a customized questionnaire to gather the requisite information.

An ongoing process for application and data asset risk categorization should be established to assign categories to new assets and keep the existing information updated at least biannually.
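To make the classification step concrete, here is a minimal sketch assuming a three-tier High/Medium/Low scheme; the questionnaire fields and weights are invented examples, not prescribed criteria. The sketch also ensures an application is never rated below its most sensitive data asset.

```python
# Hypothetical project questionnaire -> risk-tier sketch.
# Tier order matters: higher index means higher business impact.
TIERS = ["Low", "Medium", "High"]

def classify_application(answers, data_asset_tiers):
    """Assign a risk tier from questionnaire answers and data-asset tiers.

    answers: dict of yes/no questions (invented examples below)
    data_asset_tiers: tiers already assigned to the data the app handles
    """
    score = 0
    score += 2 if answers.get("internet_facing") else 0
    score += 2 if answers.get("handles_financial_transactions") else 0
    score += 1 if answers.get("stores_personal_data") else 0

    questionnaire_tier = "High" if score >= 3 else "Medium" if score >= 1 else "Low"

    # An application is never rated below its most sensitive data asset.
    highest_data = max(data_asset_tiers, key=TIERS.index, default="Low")
    return max(questionnaire_tier, highest_data, key=TIERS.index)

if __name__ == "__main__":
    tier = classify_application(
        {"internet_facing": True, "stores_personal_data": True},
        ["Medium", "High"],
    )
    print("Application risk tier:", tier)   # -> High
```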

B. Establish and measure per-classification security goals
With a classification scheme for the organization’s application portfolio in place, direct security goals and assurance program roadmap choices can be made more granular.

The assurance program’s roadmap should be modified to account for each application risk category by specifying emphasis on particular Practices for each category. For each iteration of the assurance program, this would typically take the form of prioritizing higher-level Objectives for the highest-risk application tier and progressively less stringent Objectives for the lower-risk categories.

This process establishes the organization’s risk tolerance since active decisions must be made as to what specific Objectives are expected of applications in each risk category. By choosing to keep lower-risk applications at lower levels of performance with respect to the Security Practices, resources are saved in exchange for acceptance of a weighted risk. However, it is not necessary to arbitrarily build a separate roadmap for each risk category since that can lead to inefficiency in management of the assurance program itself.

Results
✦ Customized assurance plans per project based on core value to the business
✦ Organization-wide understanding of security-relevance of data and application assets
✦ Better informed stakeholders with respect to understanding and accepting risks

Additional Success Metrics
✦ >90% of applications and data assets evaluated for risk classification in past 12 months
✦ >80% of staff briefed on relevant application and data risk ratings in past 6 months
✦ >80% of staff briefed on relevant assurance program roadmap in past 3 months

Additional Costs
✦ Buildout or license of application and data risk categorization scheme
✦ Program overhead from more granular roadmap planning

Additional Personnel
✦ Architects (2 days/yr)
✦ Managers (2 days/yr)
✦ Business Owners (2 days/yr)
✦ Security Auditor (2 days/yr)

Related Levels
✦ Policy & Compliance - 2
✦ Threat Assessment - 2
✦ Design Review - 2


Strategy & Metrics: Level 3 (SM3)

Objective: Align security expenditure with relevant business indicators and asset value

Activities

A. Conduct periodic industry-wide cost comparisons
Research and gather information about security costs from intra-industry communication forums, business analyst and consulting firms, or other external sources. In particular, there are a few key factors that need to be identified.

First, use collected information to identify the average amount of security effort being applied by similar types of organizations in your industry. This can be done either top-down from estimates of total percentage of budget, revenue, etc., or it can be done bottom-up by identifying security-related activities that are considered normal for your type of organization. Overall, this can be hard to gauge for certain industries, so collect information from as many relevant sources as are accessible.

The next goal of researching security costs is to determine if there are potential cost savings on third-party security products and services that your organization currently uses. When weighing the decision of switching vendors, account for hidden costs such as retraining staff or other program overhead.

Overall, these cost-comparison exercises should be conducted at least annually prior to the subsequent assurance program strategy session. Comparison information should be presented to stakeholders in order to better align the assurance program with the business.

B. Collect metrics for historic security spend
Collect project-specific information on the cost of past security incidents, for instance time and money spent cleaning up a breach, monetary loss from system outages, fines and fees to regulatory agencies, project-specific one-off security expenditures for tools or services, etc.

Using the application risk categories and the respective prescribed assurance program roadmaps for each, a baseline security cost for each application can be initially estimated from the costs associated with the corresponding risk category.

Combine the application-specific cost information with the general cost model based on risk category, and then evaluate projects for outliers, i.e. sums disproportionate to the risk rating. These indicate either an error in risk evaluation/classification or the necessity to tune the organization’s assurance program to address root causes for security cost more effectively.

The tracking of security spend per project should be done quarterly at the assurance program strategy session, and the information should be reviewed and evaluated by stakeholders at least annually. Outliers and other unforeseen costs should be discussed for potential effect on the assurance program roadmap.
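One hedged way to spot the outliers described above is to compare each project’s reported security spend against the baseline for its risk category and flag large deviations. All project names, baselines, and figures below are invented.

```python
# Sketch: flag projects whose security spend is disproportionate to the
# baseline cost of their risk category. Baselines and spends are invented.

baseline_by_category = {"High": 120_000, "Medium": 60_000, "Low": 20_000}

projects = [
    {"name": "payments-api",  "category": "High",   "spend": 130_000},
    {"name": "intranet-wiki", "category": "Low",    "spend": 55_000},
    {"name": "reporting-ui",  "category": "Medium", "spend": 12_000},
]

def find_outliers(projects, baselines, tolerance=0.5):
    """Return projects whose spend deviates more than 50% from baseline."""
    outliers = []
    for p in projects:
        base = baselines[p["category"]]
        deviation = (p["spend"] - base) / base
        if abs(deviation) > tolerance:
            outliers.append((p["name"], f"{deviation:+.0%} vs. baseline"))
    return outliers

for name, note in find_outliers(projects, baseline_by_category):
    print(name, note)   # candidates for re-classification or program tuning
```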

Results
✦ Information to make informed case-by-case decisions on security expenditures
✦ Estimates of past loss due to security issues
✦ Per-project consideration of security expense versus loss potential
✦ Industry-wide due diligence with regard to security

Additional Success Metrics
✦ >80% of projects reporting security costs in past 3 months
✦ >1 industry-wide cost comparison in past 1 year
✦ >1 historic security spend evaluation in past 1 year

Additional Costs
✦ Buildout or license of industry intelligence on security programs
✦ Program overhead from cost estimation, tracking, and evaluation

Additional Personnel
✦ Architects (1 day/yr)
✦ Managers (1 day/yr)
✦ Business Owners (1 day/yr)
✦ Security Auditor (1 day/yr)

Related Levels
✦ Vulnerability Management - 1


Policy & Compliance (Levels PC1, PC2, PC3)

Objective
PC1: Understand relevant governance and compliance drivers to the organization
PC2: Establish security and compliance baseline and understand per-project risks
PC3: Require compliance and measure projects against organization-wide policies and standards

Activities
PC1: A. Identify and monitor external compliance drivers; B. Build and maintain compliance guidelines
PC2: A. Build policies and standards for security and compliance; B. Establish project audit practice
PC3: A. Create compliance gates for projects; B. Adopt solution for audit data collection

Assessment
PC1:
✦ Do most project stakeholders know their project’s compliance status?
✦ Are compliance requirements specifically considered by project teams?
PC2:
✦ Does the organization utilize a set of policies and standards to control software development?
✦ Are project teams able to request an audit for compliance with policies and standards?
PC3:
✦ Are projects periodically audited to ensure a baseline of compliance with policies and standards?
✦ Does the organization systematically use audits to collect and control compliance evidence?

Results
PC1:
✦ Increased assurance for handling third-party audit with positive outcome
✦ Alignment of internal resources based on priority of compliance requirements
✦ Timely discovery of evolving regulatory requirements that affect your organization
PC2:
✦ Awareness for project teams regarding expectations for both security and compliance
✦ Business owners that better understand specific compliance risks in their product lines
✦ Optimized approach for efficiently meeting compliance with opportunistic security improvement
PC3:
✦ Organization-level visibility of accepted risks due to non-compliance
✦ Concrete assurance for compliance at the project level
✦ Accurate tracking of past project compliance history
✦ Efficient audit process leveraging tools to cut manual effort


Policy & Compliance: Level 1 (PC1)

Objective: Understand relevant governance and compliance drivers to the organization

Activities

A. Identify and monitor external compliance drivers
While an organization might have a wide variety of compliance requirements, this activity is specifically oriented around those that either directly or indirectly affect the way in which the organization builds or uses software and/or data. Leverage internal staff focused on compliance if available.

Based on the organization’s core business, conduct research and identify third-party regulatory standards with which compliance is required or considered an industry norm. Possibilities include the Sarbanes-Oxley Act (SOX), the Payment Card Industry Data Security Standard (PCI-DSS), the Health Insurance Portability and Accountability Act (HIPAA), etc. After reading and understanding each third-party standard, collect specific requirements related to software and data and build a consolidated list that maps each driver (third-party standard) to each of its specific requirements for security. At this stage, try to limit the number of requirements by dropping anything considered optional or only recommended.

Conduct this research at least biannually to ensure the organization keeps updated on changes to third-party standards. Depending upon the industry and the importance of compliance, this activity can vary in effort and personnel involvement, but it should always be done explicitly.
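The consolidated list lends itself to simple structured data. In the sketch below, the driver names are real standards but the requirement wording is placeholder text rather than quotations; the helper simply inverts the mapping to show which drivers impose each requirement.

```python
# Sketch of a consolidated compliance-driver list: each external driver
# maps to the software/data requirements it imposes. Requirement wording
# here is placeholder text, not quoted from the standards.

compliance_drivers = {
    "PCI-DSS": [
        "Protect stored cardholder data",
        "Develop and maintain secure systems and applications",
    ],
    "HIPAA": [
        "Implement access controls for electronic health information",
        "Maintain audit controls on systems handling health records",
    ],
    "SOX": [
        "Control changes to systems that affect financial reporting",
    ],
}

def consolidated_requirements(drivers):
    """Invert the mapping: requirement -> list of drivers that demand it."""
    by_requirement = {}
    for driver, requirements in drivers.items():
        for req in requirements:
            by_requirement.setdefault(req, []).append(driver)
    return by_requirement

for req, driven_by in consolidated_requirements(compliance_drivers).items():
    print(f"- {req}  [{', '.join(driven_by)}]")
```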

B. Build and maintain compliance guidelines
Based upon the consolidated list of software and data-related requirements from compliance drivers, elaborate the list by creating a corresponding response statement for each requirement. Sometimes called control statements, each response should capture the concept of what the organization does to ensure the requirement is met (or to note why it does not apply).

Since typical audit practice often involves checking a control statement for sufficiency and then measuring the organization against the control statement itself, it is critical that they accurately represent actual organizational practices. Also, many requirements can be met by instituting simple, lightweight process elements to cover baseline compliance prior to evolving the organization for better assurance down the road.

Working from the consolidated list, identify major gaps to feed the future planning efforts with regard to building the assurance program. Communicate information about compliance gaps with stakeholders to ensure awareness of the risk from non-compliance.

Update and review control statements with stakeholders at least biannually. Depending on the number of compliance drivers, it may make sense to perform updates more often.

Results
✦ Increased assurance for handling third-party audit with positive outcome
✦ Alignment of internal resources based on priority of compliance requirements
✦ Timely discovery of evolving regulatory requirements that affect your organization

Success Metrics
✦ >1 compliance discovery meeting in past 6 months
✦ Compliance checklist completed and updated within past 6 months
✦ >1 compliance review meeting with stakeholders in past 6 months

Costs
✦ Initial creation and ongoing maintenance of compliance checklist

Personnel
✦ Architects (1 day/yr)
✦ Managers (2 days/yr)
✦ Business Owners (1-2 days/yr)

Related Levels
✦ Strategy & Metrics - 1


Policy & Compliance: Level 2 (PC2)

Objective: Establish security and compliance baseline and understand per-project risks

Activities

A. Build policies and standards for security and compliance
Beginning with the current compliance guidelines, review regulatory standards and note any optional or recommended security requirements. Also, the organization should conduct a small amount of research to discover any potential future changes in compliance requirements that are relevant.

Augment the list with any additional requirements based on known business drivers for security. Often it is simplest to consult existing guidance being provided to development staff and gather a set of best practices.

Group common/similar requirements and rewrite each group as more generalized/simplified statements that meet all the compliance drivers as well as provide some additional security value. Work through this process for each grouping with the goal of building a set of internal policies and standards that can be directly mapped back to compliance drivers and best practices.

It is important for the set of policies and standards not to contain requirements that are too difficult or excessively costly for project teams to comply with. A useful heuristic is that approximately 80% of projects should be able to comply with minimal disruption. This requires that a good communications program be set up to advertise the new policies/standards and assist teams with compliance if needed.

B. Establish project audit practice
Create a simple audit process for project teams to request and receive an audit against internal standards. Audits are typically performed by security auditors but can also be conducted by security-savvy staff as long as they are knowledgeable about the internal standards.

Based upon any known business risk indicators, projects can be prioritized concurrently with audit queue triage such that high-risk software is assessed sooner or more frequently. Additionally, low-risk projects can have internal audit requirements loosened to make the audit practice more cost-effective.

Overall, each active project should undergo an audit at least biannually. Generally, subsequent audits after the initial one will be simpler to perform if sufficient audit information about the application is retained.

Advertise this service to business owners and other stakeholders so that they may request an audit for their projects. Detailed pass/fail results per requirement from the internal standards should be delivered to project stakeholders for evaluation. Where practical, audit results should also contain explanations of impact and remediation recommendations.
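A small sketch of the triage idea above, assuming a High/Medium/Low project risk rating and example audit intervals per tier: the queue is ordered by how overdue each project’s next audit is. Names, dates, and intervals are invented.

```python
# Sketch: order the audit queue so high-risk projects are assessed sooner
# and more often. Audit intervals per risk tier are example policy values.
from datetime import date, timedelta

AUDIT_INTERVAL = {"High": timedelta(days=90),
                  "Medium": timedelta(days=180),
                  "Low": timedelta(days=365)}

projects = [
    {"name": "payments-api",  "risk": "High",   "last_audit": date(2009, 1, 15)},
    {"name": "intranet-wiki", "risk": "Low",    "last_audit": date(2008, 6, 1)},
    {"name": "reporting-ui",  "risk": "Medium", "last_audit": date(2009, 3, 1)},
]

def audit_queue(projects, today):
    """Return projects ordered by how overdue their next audit is."""
    def overdue(p):
        due = p["last_audit"] + AUDIT_INTERVAL[p["risk"]]
        return (today - due).days
    return sorted(projects, key=overdue, reverse=True)

for p in audit_queue(projects, today=date(2009, 7, 1)):
    print(p["name"], p["risk"])
```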

Results
✦ Awareness for project teams regarding expectations for both security and compliance
✦ Business owners that better understand specific compliance risks in their product lines
✦ Optimized approach for efficiently meeting compliance with opportunistic security improvement

Additional Success Metrics
✦ >75% of staff briefed on policies and standards in past 6 months
✦ >80% of stakeholders aware of compliance status against policies and standards

Additional Costs
✦ Internal standards buildout or license
✦ Per-project overhead from compliance with internal standards and audit

Additional Personnel
✦ Architects (1 day/yr)
✦ Managers (1 day/yr)
✦ Security Auditors (2 days/project/yr)

Related Levels
✦ Education & Guidance - 1 & 3
✦ Strategy & Metrics - 2
✦ Security Requirements - 1 & 3
✦ Secure Architecture - 3
✦ Code Review - 3
✦ Design Review - 3
✦ Environment Hardening - 3


Policy & Compliance: Level 3 (PC3)

Objective: Require compliance and measure projects against organization-wide policies and standards

Activities

A. Create compliance gates for projects
Once an organization has established internal standards for security, the next level of enforcement is to set particular points in the project life-cycle where a project cannot pass until it is audited against the internal standards and found to be in compliance.

Usually, the compliance gate is placed at the point of software release such that a release cannot be published until the compliance check is passed. It is important to provide enough time for the audit to take place and remediation to occur, so generally the audit should begin earlier, for instance when a release is given to QA.

Even with a firm compliance gate, legacy or other specialized projects may not be able to comply, so an exception approval process must also be created. No more than about 20% of all projects should have exception approval.
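A minimal sketch of such a gate, assuming audit results are recorded per internal-standard requirement and that approved exceptions are tracked separately: the release is blocked unless every requirement passes or has an exception. The requirement identifiers are placeholders.

```python
# Sketch of a release-time compliance gate: block the release unless the
# latest audit passed every internal-standard requirement, or an approved
# exception is on file. Requirement IDs below are placeholders.

audit_results = {          # requirement id -> pass/fail from latest audit
    "IS-001 input validation": True,
    "IS-014 transport encryption": True,
    "IS-022 audit logging": False,
}

approved_exceptions = {"IS-022 audit logging"}   # granted via the exception process

def release_allowed(results, exceptions):
    failures = [req for req, passed in results.items() if not passed]
    blocking = [req for req in failures if req not in exceptions]
    if blocking:
        print("Release blocked; remediate or obtain exception for:")
        for req in blocking:
            print("  -", req)
        return False
    if failures:
        print("Release allowed with approved exceptions:", ", ".join(failures))
    return True

if __name__ == "__main__":
    release_allowed(audit_results, approved_exceptions)
```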

B. Adopt solution for audit data collection
Organizations conducting regular audits of project teams generate a large amount of audit data over time. Automation should be used to assist in collecting that data, to manage collation for storage and retrieval, and to limit individual access to sensitive audit data.

For many concrete requirements from the internal standards, existing tools such as code analyzers, application penetration testing tools, monitoring software, etc. can be customized and leveraged to automate compliance checks. The purpose of automating compliance checks is both to improve the efficiency of audits and to enable more staff to self-check for compliance before a formal audit takes place. Additionally, automated checks are less error-prone and allow for lower latency in discovery of problems.

Information storage features should allow centralized access to current and historic audit data per project. Automation solutions must also provide detailed access control features to limit access to approved individuals with valid business purpose for accessing the audit data.

All instructions and procedures related to accessing compliance data as well as requesting access privileges should be advertised to project teams. Additional time may initially be required from security auditors to bootstrap project teams.

Results
✦ Organization-level visibility of accepted risks due to non-compliance
✦ Concrete assurance for compliance at the project level
✦ Accurate tracking of past project compliance history
✦ Efficient audit process leveraging tools to cut manual effort

Additional Success Metrics
✦ >80% of projects in compliance with policies and standards as seen by audit
✦ <50% time per audit as compared to manual

Additional Costs
✦ Buildout or license of tools to automate audit against internal standards
✦ Ongoing maintenance of audit gates and exception process

Additional Personnel
✦ Developers (1 day/yr)
✦ Architects (1 day/yr)
✦ Managers (1 day/yr)

Related Levels
✦ Education & Guidance - 3
✦ Code Review - 2
✦ Security Testing - 2


Education & Guidance (Levels EG1, EG2, EG3)

Objective
EG1: Offer development staff access to resources around the topics of secure programming and deployment
EG2: Educate all personnel in the software life-cycle with role-specific guidance on secure development
EG3: Mandate comprehensive security training and certify personnel for baseline knowledge

Activities
EG1: A. Conduct technical security awareness training; B. Build and maintain technical guidelines
EG2: A. Conduct role-specific application security training; B. Utilize security coaches to enhance project teams
EG3: A. Create formal application security support portal; B. Establish role-based examination/certification

Assessment
EG1:
✦ Have most developers been given high-level security awareness training?
✦ Does each project team have access to secure development best practices and guidance?
EG2:
✦ Are most roles in the development process given role-specific training and guidance?
✦ Are most stakeholders able to pull in security coaches for use on projects?
EG3:
✦ Is security-related guidance centrally controlled and consistently distributed throughout the organization?
✦ Are most people tested to ensure a baseline skill-set for secure development and deployment practices?

Results
EG1:
✦ Increased developer awareness on the most common problems at the code level
✦ Maintain software with rudimentary security best-practices in place
✦ Set baseline for security know-how among technical staff
✦ Enable qualitative security checks for baseline security knowledge
EG2:
✦ End-to-end awareness of the issues that lead to security vulnerabilities at the product, design, and code levels
✦ Build plans to remediate vulnerabilities and design flaws in ongoing projects
✦ Enable qualitative security checkpoints at requirements, design, and development stages
✦ Deeper understanding of security issues encourages more proactive security planning
EG3:
✦ Efficient remediation of vulnerabilities in both ongoing and legacy code bases
✦ Quickly understand and mitigate against new attacks and threats
✦ Judge security-savvy of staff and measure against a common standard
✦ Establish fair incentives toward security awareness


Education & Guidance: Level 1 (EG1)

Objective: Offer development staff access to resources around the topics of secure programming and deployment

Activities

A. Conduct technical security awareness training
Either internally or externally sourced, conduct security training for technical staff that covers the basic tenets of application security. Generally, this can be accomplished via instructor-led training in 1-2 days or via computer-based training with modules taking about the same amount of time per developer.

Course content should cover both conceptual and technical information. Appropriate topics include high-level best practices surrounding input validation, output encoding, error handling, logging, authentication, and authorization. Additional coverage of commonplace software vulnerabilities is also desirable, such as a Top 10 list appropriate to the software being developed (web applications, embedded devices, client-server applications, back-end transaction systems, etc.). Wherever possible, use code samples and lab exercises in the specific programming language(s) that apply.

To roll out such training, it is recommended to mandate annual security training and then hold courses (either instructor-led or computer-based) as often as required based on development head-count.

B. Build and maintain technical guidelines
For development staff, assemble a list of approved documents, web pages, and technical notes that provide technology-specific security advice. These references can be assembled from many publicly available resources on the Internet. In cases where very specialized or proprietary technologies permeate the development environment, utilize senior, security-savvy staff to build security notes over time to create such a knowledge base in an ad hoc fashion.

Ensure management is aware of the resources and briefs incoming staff about their expected usage. Try to keep the guidelines lightweight and up-to-date to avoid clutter and irrelevance. Once a comfort level has been established, they can be used as a qualitative checklist to ensure that the guidelines have been read, understood, and followed in the development process.

Results
✦ Increased developer awareness on the most common problems at the code level
✦ Maintain software with rudimentary security best-practices in place
✦ Set baseline for security know-how among technical staff
✦ Enable qualitative security checks for baseline security knowledge

Success Metrics
✦ >50% of development staff briefed on security issues within past 1 year
✦ >75% of senior development/architect staff briefed on security issues within past 1 year
✦ Launch technical guidance within 3 months of first training

Costs
✦ Training course buildout or license
✦ Ongoing maintenance of technical guidance

Personnel
✦ Developers (1-2 days/yr)
✦ Architects (1-2 days/yr)

Related Levels
✦ Policy & Compliance - 2
✦ Security Requirements - 1
✦ Secure Architecture - 1


Education & Guidance: Level 2 (EG2)

Objective: Educate all personnel in the software life-cycle with role-specific guidance on secure development

Activities

A. Conduct role-specific application security training
Conduct security training for staff that highlights application security in the context of each role’s job function. Generally, this can be accomplished via instructor-led training in 1-2 days or via computer-based training with modules taking about the same amount of time per person.

For managers and requirements specifiers, course content should feature security requirements planning, vulnerability and incident management, threat modeling, and misuse/abuse case design.

Tester and auditor training should focus on training staff to understand and more effectively analyze software for security-relevant issues. As such, it should feature techniques for code review, architecture and design analysis, runtime analysis, and effective security test planning.

Expand technical training targeting developers and architects to include other relevant topics such as security design patterns, tool-specific training, threat modeling, and software assessment techniques.

To roll out such training, it is recommended to mandate annual security awareness training and periodic specialized-topic training. Courses should be available (either instructor-led or computer-based) as often as required based on head-count per role.

B. Utilize security coaches to enhance project teams
Using either internal or external experts, make security-savvy staff available to project teams for consultation. Further, this coaching resource should be advertised internally to ensure that staff are aware of its availability.

The coaching staff can be created by recruiting experienced individuals within the organization to spend some percentage of their time, around 10% maximum, performing coaching activities. The coaches should communicate with one another to ensure they are aware of each other’s areas of expertise and route questions accordingly for efficiency.

While coaches can be used at any point in the software life-cycle, appropriate times to use them include initial product conception, before completion of functional or detailed design specification(s), when issues arise during development, test planning, and when operational security incidents occur.

Over time, the internal network of coaching resources can be used as points-of-contact for communicating security-relevant information throughout the organization as well as being local resources that have greater familiarity with the ongoing project teams than a purely centralized security team might.

Results
✦ End-to-end awareness of the issues that lead to security vulnerabilities at the product, design, and code levels
✦ Build plans to remediate vulnerabilities and design flaws in ongoing projects
✦ Enable qualitative security checkpoints at requirements, design, and development stages
✦ Deeper understanding of security issues encourages more proactive security planning

Additional Success Metrics
✦ >60% of development staff trained within past 1 year
✦ >50% of management/analyst staff trained within past 1 year
✦ >80% of senior development/architect staff trained within past 1 year
✦ >3.0 Likert on usefulness of training courses

Additional Costs
✦ Training library build-out or license
✦ Security-savvy staff for hands-on coaching

Additional Personnel
✦ Developers (2 days/yr)
✦ Architects (2 days/yr)
✦ Managers (1-2 days/yr)
✦ Business Owners (1-2 days/yr)
✦ QA Testers (1-2 days/yr)
✦ Security Auditors (1-2 days/yr)

Related Levels
✦ Vulnerability Management - 1
✦ Design Review - 2
✦ Secure Architecture - 2


Education & Guidance: Level 3 (EG3)

Objective: Mandate comprehensive security training and certify personnel for baseline knowledge

Activities

A. Create formal application security support portal
Building upon written resources on topics relevant to application security, create and advertise a centralized repository (usually an internal web site). The guidelines themselves can be created in any way that makes sense for the organization, but an approval board and straightforward change control processes must be established.

Beyond static content in the form of best-practices lists, tool-specific guides, FAQs, and other articles, the support portal should feature interactive components such as mailing lists, web-based forums, or wikis to allow internal resources to cross-communicate security-relevant topics and have the information cataloged for future reference.

The content should be cataloged and easily searchable based upon several common factors such as platform, programming language, pertinence to specific third-party libraries or frameworks, life-cycle stage, etc. Project teams creating software should align themselves early in product development to the specific guidelines that they will follow. In product assessments, the list of applicable guidelines and product-related discussions should be used as audit criteria.

B. Establish role-based examination/certification
Either per role or per training class/module, create and administer aptitude exams that test people for comprehension and utilization of security knowledge. Typically, exams should be created based on the role-based curricula and target a minimum passing score around 75% correct. While staff should be required to take applicable training or refresher courses annually, certification exams should be required biannually at a minimum.

Based upon pass/fail criteria or exceptional performance, staff should be ranked into tiers such that other security-related activities could require individuals of a particular certification level to sign off before the activity is complete, e.g. an uncertified developer cannot pass a design into implementation without explicit approval from a certified architect. This provides granular visibility on a per-project basis for tracking security decisions with individual accountability. Overall, this provides a foundation for rewarding or penalizing staff for making good business decisions regarding application security.
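As a hedged illustration, the sketch below grades an exam against an answer key and maps the score to a certification tier that other activities could check before sign-off. The 75% pass mark follows the text above; the tier boundaries and questions are invented.

```python
# Sketch: grade a role-based exam and map the result to a certification
# tier. The 75% pass mark follows the text; tier cut-offs are examples.

def grade(answers, answer_key):
    """Return the fraction of correctly answered questions."""
    correct = sum(1 for q, a in answer_key.items() if answers.get(q) == a)
    return correct / len(answer_key)

def certification_tier(score, pass_mark=0.75):
    if score < pass_mark:
        return "not certified"
    if score >= 0.90:
        return "certified - senior"     # e.g. may approve others' sign-offs
    return "certified"

if __name__ == "__main__":
    answer_key = {"q1": "b", "q2": "d", "q3": "a", "q4": "c"}
    answers = {"q1": "b", "q2": "d", "q3": "a", "q4": "a"}
    score = grade(answers, answer_key)
    print(f"score {score:.0%} -> {certification_tier(score)}")
```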

Results
✦ Efficient remediation of vulnerabilities in both ongoing and legacy code bases
✦ Quickly understand and mitigate against new attacks and threats
✦ Judge security-savvy of staff and measure against a common standard
✦ Establish fair incentives toward security awareness

Additional Success Metrics
✦ >80% of staff certified within past 1 year

Additional Costs
✦ Certification examination build-out or license
✦ Ongoing maintenance and change control for application security support portal
✦ Human-resources and overhead cost for implementing employee certification

Additional Personnel
✦ Developers (1 day/yr)
✦ Architects (1 day/yr)
✦ Managers (1 day/yr)
✦ Business Owners (1 day/yr)
✦ QA Testers (1 day/yr)
✦ Security Auditors (1 day/yr)

Related Levels
✦ Policy & Compliance - 2 & 3


Threat Assessment (Levels TA1, TA2, TA3)

Objective
TA1: Identify and understand high-level threats to the organization and individual projects
TA2: Increase accuracy of threat assessment and improve granularity of per-project understanding
TA3: Concretely tie compensating controls to each threat against internal and third-party software

Activities
TA1: A. Build and maintain application-specific threat models; B. Develop attacker profile from software architecture
TA2: A. Build and maintain abuse-case models per project; B. Adopt a weighting system for measurement of threats
TA3: A. Explicitly evaluate risk from third-party components; B. Elaborate threat models with compensating controls

Assessment
TA1:
✦ Do most projects in your organization consider and document likely threats?
✦ Does your organization understand and document the types of attackers it faces?
TA2:
✦ Do project teams regularly analyze functional requirements for likely abuses?
✦ Do project teams use a method of rating threats for relative comparison?
✦ Are stakeholders aware of relevant threats and ratings?
TA3:
✦ Do project teams specifically consider risk from external software?
✦ Are all protection mechanisms and controls captured and mapped back to threats?

Results
TA1:
✦ High-level understanding of factors that may lead to negative outcomes
✦ Increased awareness of threats amongst project teams
✦ Inventory of threats for your organization
TA2:
✦ Granular understanding of likely threats to individual projects
✦ Framework for better tradeoff decisions within project teams
✦ Ability to prioritize development efforts within a project team based on risk weighting
TA3:
✦ Deeper consideration of full threat profile for each software project
✦ Detailed mapping of assurance features to established threats against each software project
✦ Artifacts to document due diligence based on business function of each software project


Threat Assessment: Level 1 (TA1)

Objective: Identify and understand high-level threats to the organization and individual projects

Activities

A. Build and maintain application-specific threat models
Based purely on the business purpose of each software project and the business risk profile (if available), identify likely worst-case scenarios for the software under development in each project team. This can be conducted using simple attack trees or through a more formal threat modeling process such as Microsoft’s STRIDE, Trike, etc.

To build attack trees, identify each worst-case scenario in one sentence and label these as the high-level goals of an attacker. From each attacker goal identified, identify preconditions that must hold in order for each goal to be realized. This information should be captured in branches underneath each goal, where each branch is either a logical AND or a logical OR of the statements contained underneath. An AND branch indicates that every directly attached child node must be true in order to realize the parent node. An OR branch indicates that any one of the directly attached child nodes must be true in order to achieve the parent node.

Regardless of the threat modeling approach, review each current and historic functional requirement to augment the attack tree to indicate security failures relevant to each. Brainstorm by iteratively dissecting each failure scenario into all the possible ways in which an attacker might be able to reach one of the goals. After initial creation, the threat model for an application should be updated when significant changes to the software are made. This assessment should be conducted with senior developers and architects as well as one or more security auditors.
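The AND/OR semantics above map naturally onto a small tree structure. The sketch below (goal and precondition wording invented for illustration) encodes an attack tree and reports whether the attacker goal is reachable given a set of preconditions assumed to hold.

```python
# Sketch of an attack tree with AND/OR branches. A leaf is "true" if its
# precondition holds; an AND node requires all children, an OR node any one.
# Goal and precondition texts are invented examples.

tree = ("OR", "Attacker reads other users' data", [
    ("AND", "Steal a session", [
        ("LEAF", "Session token exposed in URL", None),
        ("LEAF", "No token expiry", None),
    ]),
    ("LEAF", "SQL injection in search form", None),
])

def achievable(node, true_preconditions):
    kind, label, children = node
    if kind == "LEAF":
        return label in true_preconditions
    results = [achievable(child, true_preconditions) for child in children]
    return all(results) if kind == "AND" else any(results)

if __name__ == "__main__":
    holds = {"Session token exposed in URL"}   # AND branch incomplete
    print(achievable(tree, holds))             # -> False
    holds.add("No token expiry")
    print(achievable(tree, holds))             # -> True
```

The same structure is revisited under Threat Assessment 3, where compensating controls are propagated through the tree.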

B. Develop attacker profile from software architecture
Initially, conduct an assessment to identify all likely threats to the organization based on software projects. For this assessment, consider threats to be limited to agents of malicious intent and omit other risks such as known vulnerabilities, potential weaknesses, etc.

Begin by generally considering external agents and their corresponding motivations for attack. To this list, add internal roles that could cause damage and their motivations for insider attack. Based on the architecture of the software project(s) under consideration, it can be more efficient to conduct this analysis once per architecture type instead of for each project individually, since applications of similar architecture and business purpose will generally be susceptible to similar threats.

This assessment should be conducted with business owners and other stakeholders but also include one or more security auditors for additional perspective on threats. In the end, the goal is to have a concise list of threat agents and their corresponding motivations for attack.

Results
✦ High-level understanding of factors that may lead to negative outcomes
✦ Increased awareness of threats amongst project teams
✦ Inventory of threats for your organization

Success Metrics
✦ >50% of project stakeholders briefed on the threat models of relevant projects within past 12 months
✦ >75% of project stakeholders briefed on attacker profiles for relevant architectures

Costs
✦ Buildout and maintenance of project artifacts for threat models

Personnel
✦ Business Owners (1 day/yr)
✦ Developers (1 day/yr)
✦ Architects (1 day/yr)
✦ Security Auditors (2 days/yr)
✦ Managers (1 day/yr)

Related Levels
✦ Strategy & Metrics - 1
✦ Security Requirements - 2


Threat Assessment: Level 2 (TA2)

Objective: Increase accuracy of threat assessment and improve granularity of per-project understanding

Activities

A. Build and maintain abuse-case models per project
Further considering the threats to the organization, conduct a more formal analysis to determine potential misuse or abuse of functionality. Typically, this process begins with identification of normal usage scenarios, e.g. use-case diagrams if available.

If a formal abuse-case technique isn’t used, generate a set of abuse-cases for each scenario by starting with a statement of normal usage and brainstorming ways in which the statement might be negated, in whole or in part. The simplest way to get started is to insert the word “no” or “not” into the usage statement in as many ways as possible, typically around nouns and verbs. Each usage scenario should generate several possible abuse-case statements.

Further elaborate the abuse-case statements to include any application-specific concerns based on the business function of the software. The ultimate goal is for the completed set of abuse statements to form a model for usage patterns that should be disallowed by the software. If desired, these abuse cases can be combined with existing threat models.

After initial creation, abuse-case models should be updated for active projects during the design phase. For existing projects, new requirements should be analyzed for potential abuse, and abuse-cases should be built opportunistically for established functionality where practical.

B. Adopt a weighting system for measurement of threats
Based on the established attacker profiles, identify a rating system to allow relative comparison between the threats. Initially, this can be a simple high-medium-low rating based upon business risk, but any scale can be used provided that there are no more than 5 categories.

After identification of a rating system, build evaluation criteria that allow each threat to be assigned a rating. In order to do this properly, additional factors about each threat must be considered beyond motivation. Important factors include capital and human resources, inherent access privilege, technical ability, relevant goals on the threat model(s), likelihood of successful attack, etc.

After assigning each threat a rating, use this information to prioritize risk mitigation activities within the development life-cycle. Once built for a project team, the weighting system should be updated during design of new features or refactoring efforts.
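A hedged sketch of one possible weighting system: each threat agent receives simple 1 to 3 scores on a few of the factors above, and the weighted total is bucketed into a High/Medium/Low rating. The factors, weights, and agents are invented examples rather than a prescribed scheme.

```python
# Sketch: rate threat agents by combining a few weighted factors into a
# relative High/Medium/Low rating. Weights and scores are example values.

FACTOR_WEIGHTS = {"resources": 2, "access": 3, "skill": 2, "motivation": 3}

threat_agents = {
    "external fraudster": {"resources": 2, "access": 1, "skill": 2, "motivation": 3},
    "malicious insider":  {"resources": 1, "access": 3, "skill": 2, "motivation": 2},
    "script kiddie":      {"resources": 1, "access": 1, "skill": 1, "motivation": 1},
}

def rate(scores):
    total = sum(FACTOR_WEIGHTS[f] * s for f, s in scores.items())
    maximum = sum(w * 3 for w in FACTOR_WEIGHTS.values())
    ratio = total / maximum
    return "High" if ratio >= 0.7 else "Medium" if ratio >= 0.4 else "Low"

for agent, scores in sorted(threat_agents.items()):
    print(f"{agent:20s} {rate(scores)}")
```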

Results
✦ Granular understanding of likely threats to individual projects
✦ Framework for better tradeoff decisions within project teams
✦ Ability to prioritize development efforts within a project team based on risk weighting

Additional Success Metrics
✦ >75% of project teams with identified and rated threats
✦ >75% of project stakeholders briefed on threat and abuse models of relevant projects within past 6 months

Additional Costs
✦ Project overhead from maintenance of threat models and attacker profiles

Additional Personnel
✦ Security Auditor (1 day/yr)
✦ Business Owner (1 day/yr)
✦ Managers (1 day/yr)

Related Levels
✦ Strategy & Metrics - 2
✦ Secure Architecture - 2


Threat Assessment: Level 3 (TA3)

Objective: Concretely tie compensating controls to each threat against internal and third-party software

Activities

A. Explicitly evaluate risk from third-party components
Conduct an assessment of your software code-base and identify any components that are of external origin. Typically, these will include open-source projects, purchased COTS software, and online services which your software uses.

For each identified component, elaborate attacker profiles for the software project that account for potential compromise of third-party components. Using the newly identified attacker profiles, update software threat models to incorporate any likely risks from new attacker goals or capabilities.

In addition to threat scenarios, also consider ways in which vulnerabilities or design flaws in the third-party software might affect your code and design. Elaborate your threat models accordingly with the potential risks from vulnerabilities and knowledge of the updated attacker profile.

After being initially conducted for a project, this analysis must be updated and reviewed during the design phase of each development cycle. This activity should be conducted by a security auditor with relevant technical and business stakeholders.

B. Elaborate threat models with compensating controls
Conduct an assessment to formally identify factors that directly prevent preconditions for compromise represented by the threat models. These mitigating factors are the compensating controls that formally address the direct risks from software. Factors can be technical features in the software itself, but can also be process elements in the development life-cycle, infrastructure features, etc.

If using attack trees, the logical relationship represented by each branch will be either an AND or an OR. Therefore, by mitigating against just one precondition on an AND branch, the parent and all connected leaf nodes can be marked as mitigated. However, all child nodes on an OR node must be prevented before the parent can be marked as mitigated.
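The AND/OR propagation rule above can be captured mechanically. The following sketch uses a simplified, hypothetical attack-tree representation (not a prescribed format): any controlled child of an AND node mitigates the parent, while every child of an OR node must be mitigated before the parent counts as mitigated.

# Simplified attack-tree sketch: a node is either a leaf precondition or an
# AND/OR combination of child nodes. A compensating control on any child of
# an AND node mitigates the parent; every child of an OR node must be
# mitigated before the parent counts as mitigated.

def is_mitigated(node):
    if "children" not in node:                 # leaf precondition
        return node.get("control") is not None
    child_states = [is_mitigated(child) for child in node["children"]]
    if node["type"] == "AND":
        return any(child_states)
    return all(child_states)                   # OR node

# Hypothetical goal: "steal order data" requires reaching the admin interface
# AND exploiting a SQL injection flaw; controlling either precondition blocks it.
tree = {
    "goal": "steal order data",
    "type": "AND",
    "children": [
        {"goal": "reach admin interface", "control": "network segmentation"},
        {"goal": "exploit SQL injection", "control": None},
    ],
}
print(is_mitigated(tree))   # True: one AND precondition is controlled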

Regardless of threat modeling technique, identify compensating controls and annotate the threat models directly. The goal is to maximize coverage in terms of controls that mark parts of the threat model as mitigated. For any viable paths remaining, identify potential compensating controls for feedback into organizational strategy.

After being initially conducted for a project, this analysis must be updated and reviewed during the design phase of each development cycle. This activity should be conducted by a security auditor with relevant technical and business stakeholders.

Objective: Concretely tie compensating controls to each threat against internal and third-party software

Results
✦ Deeper consideration of full threat profile for each software project ✦ Detailed mapping of assurance features to established threats against each software project ✦ Artifacts to document due diligence based on business function of each software project

Additional Success Metrics
✦ >80% of project teams with updated threat models prior to every implementation cycle ✦ >80% of project teams with updated inventory of third-party components prior to every release ✦ >50% of all security incidents identified a priori by threat models in past 12 months

Additional Costs
✦ Project overhead from maintenance of detailed threat models and expanded attacker profiles ✦ Discovery of all third-party dependencies

Additional Personnel
✦ Business Owners (1 day/yr) ✦ Developers (1 day/yr) ✦ Architects (1 day/yr) ✦ Security Auditors (2 days/yr) ✦ Managers (1 day/yr)

Related Levels
✦ Security Requirements - 2 & 3

Threat Assessment - 3 (TA3)

Security Requirements (SR1 / SR2 / SR3)

SR1 - Objective: Consider security explicitly during the software requirements process
Activities: A. Derive security requirements from business functionality; B. Evaluate security and compliance guidance for requirements
Assessment: ✦ Do most project teams specify some security requirements during development? ✦ Do project teams pull requirements from best-practices and compliance guidance?
Results: ✦ High-level alignment of development effort with business risks ✦ Ad hoc capturing of industry best-practices for security as explicit requirements ✦ Awareness amongst stakeholders of measures being taken to mitigate risk from software

SR2 - Objective: Increase granularity of security requirements derived from business logic and known risks
Activities: A. Build an access control matrix for resources and capabilities; B. Specify security requirements based on known risks
Assessment: ✦ Are most stakeholders reviewing access control matrices for relevant projects? ✦ Are project teams specifying requirements based on feedback from other security activities?
Results: ✦ Detailed understanding of attack scenarios against business logic ✦ Prioritized development effort for security features based on likely attacks ✦ More educated decision-making for trade-offs between features and security efforts ✦ Stakeholders that can better avoid functional requirements that inherently have security flaws

SR3 - Objective: Mandate security requirements process for all software projects and third-party dependencies
Activities: A. Build security requirements into supplier agreements; B. Expand audit program for security requirements
Assessment: ✦ Are most stakeholders reviewing vendor agreements for security requirements? ✦ Are the security requirements specified by project teams being audited?
Results: ✦ Formally set baseline for security expectations from external code ✦ Centralized information on security effort undertaken by each project team ✦ Ability to align resources to projects based on application risk and desired security requirements

Activities

A. Derive security requirements from business functionality
Conduct a review of functional requirements that specify the business logic and overall behavior for each software project. After gathering requirements for a project, conduct an assessment to derive relevant security requirements. Even if software is being built by a third party, these requirements, once identified, should be included with the functional requirements delivered to vendors.

For each functional requirement, a security auditor should lead stakeholders through the process of explicitly noting any expectations with regard to security. Typically, questions to clarify for each requirement include expectations for data security, access control, transaction integrity, criticality of business function, separation of duties, uptime, etc.

It is important to ensure that all security requirements follow the same principles used for writing good requirements in general. In particular, they should be specific, measurable, and reasonable.

Conduct this process for all new requirements on active projects. For existing features, it is recommended to conduct the same process as a gap analysis to fuel future refactoring for security.

B. Evaluate security and compliance guidance for requirements
Determine industry best-practices that project teams should treat as requirements. These can be chosen from publicly available guidelines, internal or external guidelines/standards/policies, or established compliance requirements.

It is important not to bring too many best-practice requirements into each development iteration, since there is a time trade-off with design and implementation. The recommended approach is to slowly add best-practices over successive development cycles to bolster the software's overall assurance profile over time.

For existing systems, refactoring for security best practices can be a complex undertaking. Where possible, add security requirements opportunistically when adding new features. At a minimum, conducting the analysis to identify applicable best practices should be done to help fuel future planning efforts.

This review should be performed by a security auditor with input from business stakeholders. Senior developers, architects, and other technical stakeholders should also be involved to bring design and implementation-specific knowledge into the decision process.

Objective: Consider security explicitly during the software requirements process

Results
✦ High-level alignment of development effort with business risks ✦ Ad hoc capturing of industry best-practices for security as explicit requirements ✦ Awareness amongst stakeholders of measures being taken to mitigate risk from software

Success Metrics
✦ >50% of project teams with explicitly defined security requirements

Costs
✦ Project overhead from addition of security requirements to each development cycle

Personnel
✦ Security Auditor (2 days/yr) ✦ Business Owner (1 day/yr) ✦ Managers (1 day/yr) ✦ Architects (1 day/yr)

Related Levels
✦ Education & Guidance - 1 ✦ Policy & Compliance - 2 ✦ Design Review - 1 ✦ Code Review - 1 ✦ Security Testing - 1

Security Requirements - 1 (SR1)

Activities

A. Build an access control matrix for resources and capabilities
Based upon the business purpose of the application, identify user and operator roles. Additionally, build a list of resources and capabilities by gathering all relevant data assets and application-specific features that are guarded by any form of access control.

In a simple matrix with roles on one axis and resources on the other, consider the relationships between each role and each resource, and note in each intersection the correct behavior of the system in terms of access control according to stakeholders.

For data resources, it is important to note access rights in terms of creation, read access, update, and deletion. For resources that are features, gradation of access rights will likely be application-specific, but at a minimum note if the role should be permitted access to the feature.
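One lightweight way to capture such a matrix is as a mapping from role and resource pairs to the permitted create/read/update/delete operations, as in the hypothetical sketch below; the roles, resources, and grants are illustrative assumptions, and anything not listed is treated as denied and flagged for stakeholder review.

# Hypothetical permission matrix: roles on one axis, resources on the other,
# with allowed CRUD operations recorded at each intersection. Anything not
# listed is implicitly denied.

PERMISSIONS = {
    ("customer", "own_order"): {"create", "read"},
    ("support",  "own_order"): {"read", "update"},
    ("admin",    "own_order"): {"create", "read", "update", "delete"},
    ("customer", "audit_log"): set(),          # explicit deny, for clarity
    ("admin",    "audit_log"): {"read"},
}

def is_allowed(role, resource, operation):
    """Return True only if the matrix explicitly grants the operation."""
    return operation in PERMISSIONS.get((role, resource), set())

print(is_allowed("customer", "own_order", "delete"))   # False
print(is_allowed("admin", "audit_log", "read"))        # True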

This permission matrix will serve as an artifact to document the correct access control rights for the business logic of the overall system. As such, it should be created by the project teams with input from business stakeholders. After initial creation, it should be updated by business stakeholders before every release, usually toward the beginning of the design phase.

B. Specify security requirements based on known risks
Explicitly review existing artifacts that indicate organization or project-specific security risk in order to better understand the overall risk profile for the software. When available, draw on resources such as the high-level business risk profile, individual application threat models, findings from design review, code review, security testing, etc.

In addition to review of existing artifacts, use abuse-case models for an application to serve as fuel for identification of concrete security requirements that directly or indirectly mitigate the abuse scenarios.

This process should be conducted by business owners and security auditors as needed. Ultimately, the notion of risks leading to new security requirements should become a built-in step in the planning phase whereby newly discovered risks are specifically assessed by project teams.

Objective: Increase granularity of security requirements derived from business logic and known risks

Results
✦ Detailed understanding of attack scenarios against business logic ✦ Prioritized development effort for security features based on likely attacks ✦ More educated decision-making for tradeoffs between features and security efforts ✦ Stakeholders that can better avoid functional requirements that inherently have security flaws

Additional Success Metrics
✦ >75% of all projects with updated abuse-case models within past 6 months

Additional Costs
✦ Project overhead from buildout and maintenance of abuse-case models

Additional Personnel
✦ Security Auditor (2 days/yr) ✦ Managers (1 day/yr) ✦ Architects (2 days/yr) ✦ Business Owners (1 day/yr)

Related Levels
✦ Threat Assessment - 1 & 3 ✦ Strategy & Metrics - 1

Security Requirements - 2 (SR2)

Activities

A. Build security requirements into supplier agreements
Beyond the kinds of security requirements already identified by previous analysis, additional security benefits can be derived from third-party agreements. Typically, requirements and perhaps high-level design will be developed internally, while detailed design and implementation are often left up to suppliers.

Based on the specific division of labor for each externally developed component, identify specific security activities and technical assessment criteria to add to the vendor contracts. Commonly, this is a set of activities from the Design Review, Code Review, and Security Testing Practices.

Modifications of agreement language should be handled on a case-by-case basis with each supplier, since adding requirements will generally mean an increase in cost. The cost of each potential security activity should be balanced against the benefit of the activity as per the usage of the component or system being considered.

B. Expand audit program for security requirements
Incorporate checks for completeness of security requirements into routine project audits. Since this can be difficult to gauge without project-specific knowledge, the audit should focus on checking project artifacts such as requirements or design documentation for evidence that the proper types of analysis were conducted.

In particular, each functional requirement should be annotated with security requirements based on business drivers as well as expected abuse scenarios. The overall project requirements should contain a list of requirements generated from best-practices in guidelines and standards. Additionally, there should be a clear list of unfulfilled security requirements and an estimated timeline for their provision in future releases.

This audit should be performed during every development iteration, ideally toward the end of the requirements process, but it must be performed before a release can be made.

Objective: Mandate security requirements process for all software projects and third-party dependencies

Results
✦ Formally set baseline for security expectations from external code ✦ Centralized information on security effort undertaken by each project team ✦ Ability to align resources to projects based on application risk and desired security requirements

Additional Success Metrics
✦ >80% of projects passing security requirements audit in past 6 months ✦ >80% of vendor agreements analyzed for contractual security requirements in past 12 months

Additional Costs
✦ Increased cost of outsourced development from additional security requirements ✦ Ongoing project overhead from release gates for security requirements

Additional Personnel
✦ Security Auditor (2 days/yr) ✦ Managers (2 days/yr) ✦ Business Owners (1 day/yr)

Related Levels
✦ Threat Assessment - 3 ✦ Policy & Compliance - 2

Security Requirements - 3 (SR3)

Secure Architecture (SA1 / SA2 / SA3)

SA1 - Objective: Insert consideration of proactive security guidance into the software design process
Activities: A. Maintain list of recommended software frameworks; B. Explicitly apply security principles to design
Assessment: ✦ Are project teams provided with a list of recommended third-party components? ✦ Are most project teams aware of secure design principles and applying them?
Results: ✦ Ad hoc prevention of unexpected dependencies and one-off implementation choices ✦ Stakeholders aware of increased project risk due to libraries and frameworks chosen ✦ Established protocol within development for proactively applying security mechanisms to a design

SA2 - Objective: Direct the software design process toward known-secure services and secure-by-default designs
Activities: A. Identify and promote security services and infrastructure; B. Identify security design patterns from architecture
Assessment: ✦ Do you advertise shared security services with guidance for project teams? ✦ Are project teams provided with prescriptive design patterns based on their application architecture?
Results: ✦ Detailed mapping of assets to user roles to encourage better compartmentalization in design ✦ Reusable design building blocks for provision of security protections and functionality ✦ Increased confidence for software projects from use of established design techniques for security

SA3 - Objective: Formally control the software design process and validate utilization of secure components
Activities: A. Establish formal reference architectures and platforms; B. Validate usage of frameworks, patterns, and platforms
Assessment: ✦ Are project teams building software from centrally controlled platforms and frameworks? ✦ Are project teams being audited for usage of secure architecture components?
Results: ✦ Customized application development platforms that provide built-in security protections ✦ Organization-wide expectations for proactive security effort in development ✦ Stakeholders better able to make tradeoff decisions based on business need for secure design

Activities

A. Maintain list of recommended software frameworks
Across software projects within the organization, identify the third-party software libraries and frameworks that are commonly in use. Generally, this need not be an exhaustive search for dependencies; rather, focus on capturing the high-level components that are most often used.

From the list of components, group them into functional categories based on the core features provided by the third-party component. Also, note the usage prevalence of each component across project teams to weight the reliance upon the third-party code. Using this weighted list as a guide, create a list of components to be advertised across the development organization as recommended components.

Several factors should contribute to decisions for inclusion on the recommended list. Although a list can be created without conducting specific research, it is advisable to inspect each candidate for incident history, track record for responding to vulnerabilities, appropriateness of functionality for the organization, excessive complexity in usage of the third-party component, etc.

This list should be created by senior developers and architects, but also include input from managers and security auditors. After creation, this list of recommended components matched against functional categories should be advertised to the development organization. Ultimately, the goal is to provide well-known defaults for project teams.

B. Explicitly apply security principles to design
During design, technical staff on the project team should use a short list of guiding security principles as a checklist against detailed system designs. Typically, security principles include defense in depth, securing the weakest link, use of secure defaults, simplicity in design of security functionality, secure failure, balance of security and usability, running with least privilege, avoidance of security by obscurity, etc.

In particular for perimeter interfaces, the design team should consider each principle in the context of the overall system and identify features that can be added to bolster security at each such interface. Generally, these should be limited such that they only take a small amount of extra effort beyond the normal implementation cost of functional requirements; anything larger should be noted and scheduled for future releases.

While this process should be conducted by each project team after receiving security awareness training, it is helpful to involve more security-savvy staff to aid in making design decisions.

Objective: Insert consideration of proactive security guidance into the software design process

Results
✦ Ad hoc prevention of unexpected dependencies and one-off implementation choices ✦ Stakeholders aware of increased project risk due to libraries and frameworks chosen ✦ Established protocol within development for proactively applying security mechanisms to a design

Success Metrics
✦ >80% of development staff briefed on software framework recommendations in past 1 year ✦ >50% of projects self-reporting application of security principles to design

Costs
✦ Buildout, maintenance, and awareness of software framework recommendations ✦ Ongoing project overhead from analysis and application of security principles

Personnel
✦ Architects (2-4 days/yr) ✦ Developers (2-4 days/yr) ✦ Security Auditors (2-4 days/yr) ✦ Managers (2 days/yr)

Related Levels
✦ Education & Guidance - 1

Secure Architecture - 1 (SA1)

Activities

A. Identify and promote security services and infrastructure
Organizations should identify shared infrastructure or services with security functionality. These will typically include single-sign-on services, corporate directory systems, access control or entitlements services, and authentication systems. By collecting and evaluating reusable systems, assemble a list of such resources and categorize them by the security mechanism they fulfill. It is also helpful to consider each resource in terms of why a development team would want to integrate with it, i.e., the benefits of using the shared resource.

If multiple resources exist in each category, an organization should select and standardize on one or more shared services per category. Because future software development will rely on these selected services, each should be thoroughly audited to ensure the baseline security posture is understood. For each selected service, design guidance should be created for development teams to understand how to integrate with the system. After such guidance is assembled, it should be made available to development teams through training, mentorship, guidelines, and standards.

The benefits of doing this include promotion of known-secure systems, simplified security guidance for project design teams, and clearer paths to building assurance around the applications utilizing the shared security services.

B. Identify security design patterns from architecture
Each software project at an organization should be categorized in terms of its generic architecture type. Common categories include client-server applications, embedded systems, desktop applications, web-facing applications, web services platforms, transactional middleware systems, mainframe applications, etc. Depending on your organization's specialty, more detailed categories may need to be developed based upon language, processor architecture, or even era of deployment.

For each generic software architecture type, a set of general design patterns representing sound methods of implementing security functionality can be derived and applied to the individual designs of an organization's software projects. These security design patterns represent general definitions of generic design elements; they can be researched or purchased, and it is often even more effective if these patterns are customized to be more specific to your organization. Example patterns include a single-sign-on subsystem, a cross-tier delegation model, a hardened interface design, a separation-of-duties authorization model, a centralized logging pattern, etc.
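As one hypothetical illustration of such a pattern, the sketch below shows the shape of a centralized security-logging building block that project teams could reuse instead of inventing per-project logging. The event fields, logger name, and sink are assumptions for illustration, not a prescribed design.

# Sketch of a reusable "centralized logging" design pattern: every project
# emits security events through one shared interface so that event format
# and required fields stay consistent across the organization.

import json
import logging
from datetime import datetime, timezone

security_log = logging.getLogger("org.security.audit")   # hypothetical name
logging.basicConfig(level=logging.INFO)

def log_security_event(event_type, actor, outcome, **details):
    """Record a security-relevant event in a single, consistent format."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event": event_type,        # e.g. "login", "privilege_change"
        "actor": actor,
        "outcome": outcome,         # "success" or "failure"
        "details": details,
    }
    security_log.info(json.dumps(record))

log_security_event("login", actor="jdoe", outcome="failure", reason="bad password")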

The process of identification of applicable and appropriate patterns should be carried out by architects, senior developers, and other technical stakeholders during the design phase.

Objective: Direct the software design process toward known-secure services and secure-by-default designs

Results
✦ Detailed mapping of assets to user roles to encourage better compartmentalization in design ✦ Reusable design building blocks for provision of security protections and functionality ✦ Increased confidence for software projects from use of established design techniques for security

Additional Success Metrics
✦ >80% of projects with updated permission matrix in past 6 months ✦ >80% of project teams briefed on applicable security patterns in past 6 months

Additional Costs
✦ Buildout or license of applicable security patterns ✦ Ongoing project overhead from maintenance of permission matrix

Additional Personnel
✦ Architects (2-4 days/yr) ✦ Developers (1-2 days/yr) ✦ Managers (1-2 days/yr) ✦ Business Owners (1 day/yr) ✦ Security Auditors (1-2 days/yr)

Related Levels
✦ Education & Guidance - 1

Secure Architecture - 2 (SA2)

Activities

A. Establish formal reference architectures and platforms
After promoting integration with shared security services and working with security patterns specific to each type of architecture, a collection of code implementing these pieces of functionality should be selected from project teams and used as the basis for a shared code-base. This shared code-base can initially start as a collection of commonly recommended libraries that each project needs to use, and it can grow over time into one or more software frameworks representing reference platforms upon which project teams build their software. Examples of reference platforms include frameworks for model-view-controller web applications, libraries supporting transactional back-end systems, frameworks for web services platforms, scaffolding for client-server applications, frameworks for middleware with pluggable business logic, etc.

Another method of building initial reference platforms is to select a particular project early in its life-cycle and have security-savvy staff work with the project team to build the security functionality in a generic way so that it can be extracted from the project and utilized elsewhere in the organization.

Regardless of approach to creation, reference platforms have advantages in terms of speeding audit and security-related reviews, increasing efficiency in development, and lowering maintenance overhead.

Architects, senior developers and other technical stakeholders should participate in design and creation of reference platforms. After creation, a team must maintain ongoing support and updates.

B. Validate usage of frameworks, patterns, and platforms
During routine audits of projects, conduct additional analysis of project artifacts to measure usage of recommended frameworks, design patterns, shared security services, and reference platforms. Though conducted during routine audits, the goal of this activity is to collect feedback from project teams as much as to measure their individual proactive security effort.

Overall, it is important to verify several factors with project teams. Identify use of non-recommended frameworks to determine if there may be a gap in recommendations versus the organization’s functionality needs. Examine unused or incorrectly used design patterns and reference platform modules to determine if updates are needed. Additionally, there may be more or different functionality that project teams would like to see implemented in the reference platforms as the organization evolves.

This analysis can be conducted by any security-savvy technical staff. Metrics collected from each project should be collated for analysis by managers and stakeholders.

Objective: Formally control the software design process and validate utilization of secure components

Results
✦ Customized application development platforms that provide built-in security protections ✦ Organization-wide expectations for proactive security effort in development ✦ Stakeholders better able to make tradeoff decisions based on business need for secure design

Additional Success Metrics
✦ >50% of active projects using reference platforms ✦ >80% of projects reporting framework, pattern, and platform usage feedback in past 6 months ✦ >3.0 Likert on usefulness of guidance/platforms reported by project teams

Additional Costs
✦ Buildout or license of reference platform(s) ✦ Ongoing maintenance and support of reference platforms ✦ Ongoing project overhead from usage validation during audit

Additional Personnel
✦ Managers (1 day/yr) ✦ Business Owners (1 day/yr) ✦ Architects (3-4 days/yr) ✦ Developers (2-3 days/yr) ✦ Security Auditors (2 days/yr)

Related Levels
✦ Policy & Compliance - 2 ✦ Design Review - 3 ✦ Code Review - 3 ✦ Security Testing - 3

Secure Architecture - 3 (SA3)

Design Review (DR1 / DR2 / DR3)

DR1 - Objective: Support ad hoc reviews of software design to ensure baseline mitigations for known risks
Activities: A. Identify software attack surface; B. Analyze design against known security requirements
Assessment: ✦ Do project teams document the attack perimeter of software designs? ✦ Do project teams check software designs against known security risks?
Results: ✦ High-level understanding of security implications from perimeter architecture ✦ Enable development teams to self-check designs for security best-practices ✦ Lightweight process for conducting project-level design reviews

DR2 - Objective: Offer assessment services to review software design against comprehensive best practices for security
Activities: A. Inspect for complete provision of security mechanisms; B. Deploy design review service for project teams
Assessment: ✦ Do most project teams specifically analyze design elements for security mechanisms? ✦ Are most project stakeholders aware of how to obtain a formal design review?
Results: ✦ Formally offered assessment service to consistently review architecture for security ✦ Pinpoint security flaws in maintenance-mode and legacy systems ✦ Deeper understanding amongst project stakeholders on how the software provides assurance protections

DR3 - Objective: Require assessments and validate artifacts to develop detailed understanding of protection mechanisms
Activities: A. Develop data-flow diagrams for sensitive resources; B. Establish release gates for design review
Assessment: ✦ Does the design review process incorporate detailed data-level analysis? ✦ Does routine project audit require a baseline for design review results?
Results: ✦ Granular view of weak points in a system design to encourage better compartmentalization ✦ Organization-level awareness of project standing against baseline security expectations for architecture ✦ Comparisons between projects for efficiency and progress toward mitigating known flaws

Activities

A. Identify software attack surface
For each software project, create a simplified view of the overall architecture. Typically, this should be created based on project artifacts such as high-level requirements and design documents, interviews with technical staff, or module-level review of the code base. It is important to capture the high-level modules in the system, but a good rule of thumb for granularity is to ensure that the diagram of the whole system under review fits onto one page.

From the single-page architecture view, analyze each component in terms of accessibility of the interfaces from authorized users, anonymous users, operators, application-specific roles, etc. The components providing the interfaces should also be considered in the context of the one-page view to find points of functional delegation or data pass-through to other components on the diagram. Group interfaces and components with similar accessibility profiles and capture this as the software attack surface.

For each interface, further elaborate the one-page diagram to note any security-related functionality. Based on the identified interface groups comprising the attack surface, check the model for design-level consistency in how interfaces with similar access are secured. Any breaks in consistency can be noted as assessment findings.
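Grouping interfaces by who can reach them can be as simple as the sketch below; the interfaces and roles are hypothetical placeholders for a real architecture view, and the grouped output is what gets recorded as the attack surface.

# Sketch: group interfaces with similar accessibility profiles. Interface
# names and role lists are hypothetical placeholders.
from collections import defaultdict

INTERFACES = {
    "/login":        {"anonymous"},
    "/search":       {"anonymous", "customer"},
    "/orders":       {"customer"},
    "/admin/users":  {"operator"},
    "/admin/config": {"operator"},
}

def attack_surface_groups(interfaces):
    """Map each accessibility profile to the interfaces sharing it."""
    groups = defaultdict(list)
    for name, roles in interfaces.items():
        groups[frozenset(roles)].append(name)
    return groups

for profile, members in attack_surface_groups(INTERFACES).items():
    print(sorted(profile), "->", members)
# e.g. ['operator'] -> ['/admin/users', '/admin/config']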

This analysis should be conducted by security-savvy technical staff, either within the project team or external to it. Typically, after initial creation, the diagram and attack surface analysis only need to be updated during the design phase when additions or changes are made to the edge system interfaces.

B. Analyze design against known security requirements
Security requirements, whether formally specified or informally known, should be identified and collected. Additionally, identify and include any security assumptions upon which safe operation of the system relies.

Review each item on the list of known security requirements against the one-page diagram of the system architecture. Elaborate the diagram to show the design-level features that address each security requirement. Separate, granular diagrams can be created to simplify capturing this information if the system is large and/or complex. The overall goal is to verify that each known security requirement has been addressed by the system design. Any security requirements that are not clearly provided at the design level should be noted as assessment findings.

This analysis should be conducted by security-savvy technical staff with input from architects, developers, managers, and business owners as needed. It should be updated during the design phase when there are changes in security requirements or high-level system design.

Objective: Support ad hoc reviews of software design to ensure baseline mitigations for known risks

Results
✦ High-level understanding of security implications from perimeter architecture ✦ Enable development teams to self-check designs for security best-practices ✦ Lightweight process for conducting project-level design reviews

Success Metrics
✦ >50% of projects with updated attack surface analysis in past 12 months ✦ >50% of projects with updated security requirements design-level analysis in past 12 months

Costs
✦ Buildout and maintenance of architecture diagrams for each project ✦ Ongoing project overhead from attack surface and security requirement design inspection

Personnel
✦ Architects (2-3 days/yr) ✦ Developers (1-2 days/yr) ✦ Managers (1 day/yr) ✦ Security Auditor (1 day/yr)

Related Levels
✦ Security Requirements - 1

Design Review - 1 (DR1)

Activities

A. Inspect for complete provision of security mechanisms
For each interface on a module in the high-level architecture diagram, formally iterate through the list of security mechanisms and analyze the system for their provision. This type of analysis should be performed on both internal interfaces (e.g., between tiers) and external ones (e.g., those comprising the attack surface).

The six main security mechanisms to consider are authentication, authorization, input validation, output encoding, error handling, and logging. Where relevant, also consider the mechanisms of cryptography and session management. For each interface, determine where in the system design each mechanism is provided and note any missing or unclear features as findings.
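The per-interface inspection can be tracked as a simple coverage matrix. In the hypothetical sketch below, every interface is checked against the mechanism list and any mechanism without a mapped design element is reported as a finding; the interface data shown is purely illustrative.

# Sketch: check each interface against the list of security mechanisms and
# report gaps as findings. Interface data is an illustrative example.

MECHANISMS = ["authentication", "authorization", "input validation",
              "output encoding", "error handling", "logging"]

coverage = {   # where each mechanism is provided, per interface (example data)
    "public REST API": {"authentication": "API gateway",
                        "authorization": "service layer",
                        "input validation": "request schema",
                        "logging": "central audit log"},
    "internal batch interface": {"authentication": "mutual TLS",
                                 "logging": "central audit log"},
}

findings = []
for interface, provided in coverage.items():
    for mechanism in MECHANISMS:
        if mechanism not in provided:
            findings.append(f"{interface}: no design element for {mechanism}")

print("\n".join(findings))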

This analysis should be conducted by security-savvy staff with assistance from the project team for application-specific knowledge, and it should be performed once per release, usually toward the end of the design phase. After the initial analysis, the findings should be updated in subsequent releases based on changes being made during the development cycle.

B. Deploy design review service for project teams
Institute a process whereby project stakeholders can request a design review. This service may be provided centrally within the organization or distributed across existing staff, but all reviewers must be trained on performing the reviews completely and consistently.

The review service should be centrally managed in that the review request queue should be triaged by senior managers, architects, and stakeholders who are familiar with the overall business risk profile for the organization. This allows prioritization of project reviews in alignment with overall business risk.

During a design review, the review team should work with project teams to collect information sufficient to formulate an understanding of the attack surface, match project-specific security requirements to design elements, and verify security mechanisms at module interfaces.

Objective: Offer assessment services to review software design against comprehensive best practices for security

Results
✦ Formally offered assessment service to consistently review architecture for security ✦ Pinpoint security flaws in maintenance-mode and legacy systems ✦ Deeper understanding amongst project stakeholders on how the software provides assurance protections

Additional Success Metrics
✦ >80% of stakeholders briefed on status of review requests in past 6 months ✦ >75% of projects undergoing design review in past 12 months

Additional Costs
✦ Buildout, training, and maintenance of design review team ✦ Ongoing project overhead from review activities

Additional Personnel
✦ Architects (1-2 days/yr) ✦ Developers (1 day/yr) ✦ Managers (1 day/yr) ✦ Security Auditors (2-3 days/yr)

Related Levels
✦ Education & Guidance - 2 ✦ Strategy & Metrics - 2

Design Review - 2 (DR2)

Activities

A. Develop data-flow diagrams for sensitive resources
Based on the business function of the software project, conduct analysis to identify details on system behavior around high-risk functionality. Typically, high-risk functionality will correlate to features implementing creation, access, update, and deletion of sensitive data. Beyond data, high-risk functionality also includes project-specific business logic that is critical in nature, either from a denial-of-service or compromise perspective.

For each identified data source or business function, select and use a standardized notation to capture relevant software modules, data sources, actors, and messages that flow amongst them. It is often helpful to start with a high-level design diagram and iteratively flesh out relevant detail while removing elements that do not correspond to the sensitive resource.

With data-flow diagrams created for a project, conduct analysis over them to determine internal choke-points in the design. Generally, these will be individual software modules that handle data with differing sensitivity levels or those that gate access to several business functions of various levels of business criticality.
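Choke-point analysis over a data-flow diagram can be approximated by looking for modules that handle flows of more than one sensitivity level, as in the hypothetical sketch below; the flow list and sensitivity labels are illustrative assumptions.

# Sketch: given data flows annotated with sensitivity, flag modules that
# handle more than one sensitivity level as internal choke-points.
from collections import defaultdict

flows = [  # (source, destination, data description, sensitivity) - example data
    ("web tier", "order service", "order details", "internal"),
    ("order service", "payment gateway", "card number", "restricted"),
    ("order service", "analytics", "order totals", "public"),
    ("web tier", "cdn", "static assets", "public"),
]

levels_by_module = defaultdict(set)
for source, destination, _, sensitivity in flows:
    levels_by_module[source].add(sensitivity)
    levels_by_module[destination].add(sensitivity)

choke_points = [m for m, levels in levels_by_module.items() if len(levels) > 1]
print(choke_points)   # ['web tier', 'order service'] handle mixed sensitivity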

B. Establish release gates for design review
Having established a consistent design review program, the next step of enforcement is to set a particular point in the software development life-cycle where a project cannot pass until a design review is conducted and findings are reviewed and accepted. In order to accomplish this, a baseline level of expectations should be set, e.g., no project with open high-severity findings is allowed to pass, and all other findings must be accepted by the business owner.

Generally, design reviews should occur toward the end of the design phase to aid early detection of security issues, but they must occur before releases can be made from the project team.

For legacy systems or inactive projects, an exception process should be created to allow those projects to continue operations, but with an explicitly assigned timeframe for each to be reviewed to illuminate any hidden vulnerabilities in the existing systems. Exceptions should be limited to no more than 20% of all projects.

Objective: Require assessments and validate artifacts to develop detailed understanding of protection mechanisms

Results
✦ Granular view of weak points in a system design to encourage better compartmentalization ✦ Organization-level awareness of project standing against baseline security expectations for architecture ✦ Comparisons between projects for efficiency and progress toward mitigating known flaws

Additional Success Metrics
✦ >80% of projects with updated data-flow diagrams in past 6 months ✦ >75% of projects passing design review audit in past 6 months

Additional Costs
✦ Ongoing project overhead from maintenance of data-flow diagrams ✦ Organization overhead from project delays caused by failed design review audits

Additional Personnel
✦ Developers (2 days/yr) ✦ Architects (1 day/yr) ✦ Managers (1-2 days/yr) ✦ Business Owners (1-2 days/yr) ✦ Security Auditors (2-3 days/yr)

Related Levels
✦ Secure Architecture - 3 ✦ Code Review - 3

Design Review - 3 (DR3)

Code Review (CR1 / CR2 / CR3)

CR1 - Objective: Opportunistically find basic code-level vulnerabilities and other high-risk security issues
Activities: A. Create review checklists from known security requirements; B. Perform point-review of high-risk code
Assessment: ✦ Do most project teams have review checklists based on common problems? ✦ Are project teams generally performing review of selected high-risk code?
Results: ✦ Inspection for common code vulnerabilities that lead to likely discovery or attack ✦ Lightweight review for coding errors that lead to severe security impact ✦ Basic code-level due diligence for security assurance

CR2 - Objective: Make code review during development more accurate and efficient through automation
Activities: A. Utilize automated code analysis tools; B. Integrate code analysis into development process
Assessment: ✦ Can most project teams access automated code analysis tools to find security problems? ✦ Do most stakeholders consistently require and review results from code reviews?
Results: ✦ Development enabled to consistently self-check for code-level security vulnerabilities ✦ Routine analysis results to compile historic data on per-team secure coding habits ✦ Stakeholders aware of unmitigated vulnerabilities to support better tradeoff analysis

CR3 - Objective: Mandate comprehensive code review process to discover language-level and application-specific risks
Activities: A. Customize code analysis for application-specific concerns; B. Establish release gates for code review
Assessment: ✦ Do project teams utilize automation to check code against application-specific coding standards? ✦ Does routine project audit require a baseline for code review results prior to release?
Results: ✦ Increased confidence in accuracy and applicability of code analysis results ✦ Organization-wide baseline for secure coding expectations ✦ Project teams with an objective goal for judging code-level security

Activities

A. Create review checklists from known security requirements
From the known security requirements for a project, derive a lightweight code review checklist for security. These can be checks specific to the security concerns surrounding the functional requirements or checks for secure coding best practices based on the implementation language, platform, typical technology stack, etc. Due to these variations, a set of checklists is often needed to cover the different types of software development within an organization.

Regardless of whether the checklists are created from publicly available resources or purchased, technical stakeholders such as development managers, architects, developers, and security auditors should review them for efficacy and feasibility. It is important to keep the lists short and simple, aiming to catch high-priority issues that are straightforward to find in code either manually or with simple search tools. Code analysis automation tools may also be used to achieve this same end, but they should be customized to reduce the overall set of security checks to a small, valuable set in order to make the scan and review process efficient.
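A checklist of this kind can stay as simple as a handful of named checks backed by plain text searches, as in the sketch below; the patterns and the project directory are illustrative assumptions that would be tuned to the organization's languages and coding standards.

# Sketch: a lightweight review checklist implemented with simple pattern
# searches over source files. The checks are illustrative examples only,
# not a complete or authoritative list.
import re
from pathlib import Path

CHECKLIST = {
    "possible hard-coded secret": re.compile(r"(password|secret)\s*=\s*['\"]", re.I),
    "raw SQL string building":    re.compile(r"execute\(.*(%|\+|format\()", re.I),
    "TODO on security control":   re.compile(r"#\s*TODO.*(auth|crypto|valid)", re.I),
}

def review(path):
    """Report checklist hits with file name and line number for manual review."""
    hits = []
    for source_file in Path(path).rglob("*.py"):
        lines = source_file.read_text(errors="ignore").splitlines()
        for number, line in enumerate(lines, 1):
            for issue, pattern in CHECKLIST.items():
                if pattern.search(line):
                    hits.append(f"{source_file}:{number}: {issue}")
    return hits

print("\n".join(review("src")))   # "src" is a placeholder project directory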

Developers should be briefed on the goals of checklists appropriate to their job function.

B. Perform point-review of high-risk code
Since code-level vulnerabilities can have dramatically increased impacts if they occur in security-critical parts of software, project teams should review high-risk modules for common vulnerabilities. Common examples of high-risk functionality include authentication modules, access control enforcement points, session management schemes, external interfaces, input validators and data parsers, etc.

Utilizing the code review checklists, the analysis can be performed as a normal part of the development process where members of the project team are assigned modules to review when changes are made. Security auditors and automated review tools can also be utilized for the review.

During development cycles where high-risk code is being changed and reviewed, development managers should triage the findings and prioritize remediation appropriately with input from other project stakeholders.

Objective: Opportunistically find basic code-level vulnerabilities and other high-risk security issues

Results
✦ Inspection for common code vulnerabilities that lead to likely discovery or attack ✦ Lightweight review for coding errors that lead to severe security impact ✦ Basic code-level due diligence for security assurance

Success Metrics
✦ >80% of project teams briefed on relevant code review checklists in past 6 months ✦ >50% of project teams performing code review on high-risk code in past 6 months ✦ >3.0 Likert on usefulness of code review checklists reported by developers

Costs
✦ Buildout or license of code review checklists ✦ Ongoing project overhead from code review activities of high-risk code

Personnel
✦ Developers (2-4 days/yr) ✦ Architects (1-2 days/yr) ✦ Managers (1-2 days/yr) ✦ Business Owners (1 day/yr)

Related Levels
✦ Security Requirements - 1

Code Review - 1 (CR1)

Activities

A. Utilize automated code analysis tools
Many security vulnerabilities at the code level are complex to understand and require careful inspection for discovery. However, there are many useful automation solutions available to automatically analyze code for bugs and vulnerabilities.

There are both commercial and open-source products available to cover popular programming languages and frameworks. Selection of an appropriate code analysis solution is based on several factors, including depth and accuracy of inspection, product usability and usage model, expandability and customization features, applicability to the organization's architecture and technology stack(s), etc.

Utilize input from security-savvy technical staff as well as developers and development managers in the selection process, and review overall results with stakeholders.

B. Integrate code analysis into development process
Once a code analysis solution is selected, it must be integrated into the development process to encourage project teams to utilize its capabilities. An effective way to accomplish this is to set up the infrastructure for the scans to run automatically at build time or from code in the project's code repository. In this fashion, results are available earlier, enabling development teams to self-check along the way before release.

A potential problem with legacy systems or large ongoing projects is that code scanners will typically report findings in modules that were not updated in the release. If automatic scanning is set up to run periodically, an effective strategy to avoid review overhead is to limit consideration of findings to those that have been added, removed, or changed since the previous scan. It is critical not to ignore the rest of the results, however, so development managers should take input from security auditors, stakeholders, and the project team to formulate a concrete plan for addressing the remaining findings.
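Limiting attention to the delta between scans can be done with a small comparison step over the scanner's exported findings. The sketch below assumes a hypothetical export format with a file, rule, and line field per finding; it is not tied to any particular tool.

# Sketch: compare the current scan export with the previous one and surface
# only new findings for the release under review. The (file, rule, line)
# finding format is an assumed export format, not a real scanner API.

def finding_key(finding):
    # Line numbers shift between scans, so key on file and rule only.
    return (finding["file"], finding["rule"])

def new_findings(previous, current):
    seen = {finding_key(f) for f in previous}
    return [f for f in current if finding_key(f) not in seen]

previous_scan = [{"file": "billing.py", "rule": "SQL_INJECTION", "line": 120}]
current_scan = [
    {"file": "billing.py", "rule": "SQL_INJECTION", "line": 124},   # pre-existing
    {"file": "export.py",  "rule": "PATH_TRAVERSAL", "line": 40},   # new this cycle
]
for finding in new_findings(previous_scan, current_scan):
    print(finding)   # only the export.py finding is raised for this release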

If unaddressed findings from code review remain at release, these must be reviewed and accepted by project stakeholders.

Objective: Make code review during development more accurate and efficient through automation

Results
✦ Development enabled to consistently self-check for code-level security vulnerabilities ✦ Routine analysis results to compile historic data on per-team secure coding habits ✦ Stakeholders aware of unmitigated vulnerabilities to support better tradeoff analysis

Additional Success Metrics
✦ >50% of projects with code review and stakeholder sign-off in past 6 months ✦ >80% of projects with access to automated code review results in past 1 month

Additional Costs
✦ Research and selection of code analysis solution ✦ Initial cost and maintenance of automation integration ✦ Ongoing project overhead from automated code review and mitigation

Additional Personnel
✦ Developers (1-2 days/yr) ✦ Architects (1 day/yr) ✦ Managers (1-2 days/yr) ✦ Security Auditors (3-4 days/yr)

Related Levels

Code Review - 2 (CR2)

Activities

A. Customize code analysis for application-specific concerns
Code scanning tools are powered by a built-in knowledge base of rules to check code based on language APIs and commonly used libraries, but have limited ability to understand custom APIs and designs in order to apply analogous checks. However, through customization, a code scanner can be a powerful, generic analysis engine for finding organization- and project-specific security concerns.

While details vary between tools in terms of ease and power of custom analysis, code scanner customization generally involves specifying checks to be performed at specific APIs and function call sites. Checks can include analysis for adherence to internal coding standards, unchecked tainted data being passed to custom interfaces, tracking and verification of sensitive data handling, correct usage of an internal API, etc.
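Where a tool supports it, a custom check is conceptually a rule evaluated at particular call sites. The sketch below imitates that idea with Python's standard ast module standing in for a commercial rule engine; run_report_query and sanitize are hypothetical internal APIs used only to illustrate the technique.

# Sketch of a custom call-site check, using Python's ast module to stand in
# for a scanner's rule engine. `run_report_query` and `sanitize` are
# hypothetical internal APIs.
import ast

SOURCE = """
run_report_query(sanitize(user_filter))
run_report_query(user_filter)
"""

def unsanitized_calls(source):
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == "run_report_query"):
            argument = node.args[0] if node.args else None
            wrapped = (isinstance(argument, ast.Call)
                       and isinstance(argument.func, ast.Name)
                       and argument.func.id == "sanitize")
            if not wrapped:
                findings.append(f"line {node.lineno}: unsanitized report query input")
    return findings

print(unsanitized_calls(SOURCE))   # flags only the second, unwrapped call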

Checkers for usage of shared code-bases are an effective place to begin scanner customizations since the created checkers can be utilized across multiple projects. To customize a tool for a code-base, a security auditor should inspect both code and high-level design to identify candidate checkers to discuss with development staff and stakeholders for implementation.

B. Establish release gates for code review
To set a code-level security baseline for all software projects, a particular point in the software development life-cycle should be established as a checkpoint where a minimum standard for code review results must be met in order to make a release.

To begin, this standard should be straightforward to meet, for example by choosing one or two vulnerability types and setting the standard that no project may pass with any corresponding findings. Over time, this baseline standard should be improved by adding additional criteria for passing the checkpoint.
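A starting gate of this kind reduces to a small pass/fail check over the review results; the vulnerability types chosen in the sketch below are assumptions that an organization would set for itself and tighten over time.

# Sketch of an initial code review release gate: block the release if any
# open finding matches the (assumed) baseline vulnerability types.

BLOCKING_TYPES = {"SQL_INJECTION", "COMMAND_INJECTION"}   # example baseline

def gate_passes(findings):
    """Return (passed, blockers): passed is True when no blocking finding is open."""
    blockers = [f for f in findings
                if f["type"] in BLOCKING_TYPES and f["status"] == "open"]
    return len(blockers) == 0, blockers

findings = [
    {"type": "SQL_INJECTION", "status": "open", "file": "reports.py"},
    {"type": "XSS",           "status": "open", "file": "profile.py"},
]
passed, blockers = gate_passes(findings)
print("release allowed:", passed)   # False until the SQL injection is fixed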

Generally, the code review checkpoint should occur toward the end of the implementation phase, but must occur before release.

For legacy systems or inactive projects, an exception process should be created to allow those projects to continue operations, but with an explicitly assigned timeframe for mitigation of findings. Exceptions should be limited to no more than 20% of all projects.

Objective: Mandate comprehensive code review process to discover language-level and application-specific risks

Results
✦ Increased confidence in accuracy and applicability of code analysis results ✦ Organization-wide baseline for secure coding expectations ✦ Project teams with an objective goal for judging code-level security

Additional Success Metrics
✦ >50% of projects using code analysis customizations ✦ >75% of projects passing code review audit in past 6 months

Additional Costs
✦ Buildout and maintenance of custom code review checks ✦ Ongoing project overhead from code review audit ✦ Organization overhead from project delays caused by failed code review audits

Additional Personnel
✦ Architects (1 day/yr) ✦ Developers (1 day/yr) ✦ Security Auditors (1-2 days/yr) ✦ Business Owners (1 day/yr) ✦ Managers (1 day/yr)

Related Levels
✦ Policy & Compliance - 2 ✦ Secure Architecture - 3

Code Review - 3 (CR3)

Security Testing (ST1 / ST2 / ST3)

ST1 - Objective: Establish process to perform basic security tests based on implementation and software requirements
Activities: A. Derive test cases from known security requirements; B. Conduct penetration testing on software releases
Assessment: ✦ Are projects specifying some security tests based on requirements? ✦ Do most projects perform penetration tests prior to release? ✦ Are most stakeholders aware of the security test status prior to release?
Results: ✦ Independent verification of expected security mechanisms surrounding critical business functions ✦ High-level due diligence toward security testing ✦ Ad hoc growth of a security test suite for each software project

ST2 - Objective: Make security testing during development more complete and efficient through automation
Activities: A. Utilize automated security testing tools; B. Integrate security testing into development process
Assessment: ✦ Are projects using automation to evaluate security test cases? ✦ Do most projects follow a consistent process to evaluate and report on security tests to stakeholders?
Results: ✦ Deeper and more consistent verification of software functionality for security ✦ Development teams enabled to self-check and correct problems before release ✦ Stakeholders better aware of open vulnerabilities when making risk acceptance decisions

ST3 - Objective: Require application-specific security testing to ensure baseline security before deployment
Activities: A. Employ application-specific security testing automation; B. Establish release gates for security testing
Assessment: ✦ Are security test cases comprehensively generated for application-specific logic? ✦ Do routine project audits demand minimum standard results from security testing?
Results: ✦ Organization-wide baseline for expected application performance against attacks ✦ Customized security test suites to improve accuracy of automated analysis ✦ Project teams aware of objective goals for attack resistance

Activities

A. Derive test cases from known security requirements
From the known security requirements for a project, identify a set of test cases to check the software for correct functionality. Typically, these test cases are derived from security concerns surrounding the functional requirements and business logic of the system, but should also include generic tests for common vulnerabilities based on the implementation language or technology stack.

Often, it is most effective to use the project team's time to build application-specific test cases and utilize publicly available resources or purchased knowledge bases to select applicable general test cases for security. Although not required, automated security testing tools can also be utilized to cover the general security test cases.
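A derived test case often ends up as an ordinary automated test that the project's normal test runner executes. The sketch below is hypothetical: it assumes a requirement that only administrators may export customer data, a placeholder staging URL, and the widely used requests library; none of these are prescribed by the model.

# Sketch: security test cases derived from the requirement "only
# administrators may export customer data". URL, endpoint, and token
# are placeholders; the tests live in the project's normal test suite.
import requests

BASE_URL = "https://staging.example.internal"   # placeholder environment

def test_export_requires_authentication():
    response = requests.get(f"{BASE_URL}/admin/export", allow_redirects=False)
    # Anonymous access must be refused, not silently served.
    assert response.status_code in (401, 403)

def test_export_denied_for_ordinary_user():
    response = requests.get(f"{BASE_URL}/admin/export",
                            headers={"Authorization": "Bearer ordinary-user-token"},
                            allow_redirects=False)
    assert response.status_code == 403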

This test case planning should occur during the requirements and/or design phases, but must occur before final testing prior to release. Candidate test cases should be reviewed for applicability, efficacy, and feasibility by relevant development, security, and quality assurance staff.

B. Conduct penetration testing on software releases
Using the set of security test cases identified for each project, penetration testing should be conducted to evaluate the system's performance against each case. It is common for this to occur during the testing phase prior to release.

Penetration testing cases should include both application-specific tests to check soundness of business logic and common vulnerability tests to check the design and implementation. Once specified, security test cases can be executed by security-savvy quality assurance or development staff, but first-time execution of security test cases for a project team should be monitored by a security auditor to assist and coach team members.

Prior to release or deployment, stakeholders must review the results of security tests and explicitly accept the risks indicated by any security tests that are still failing at release time. Where such risks are accepted, a concrete timeline should be established to address the gaps over time.

Objective: Establish process to perform basic security tests based on implementation and software requirements

Results
✦ Independent verification of expected security mechanisms surrounding critical business functions ✦ High-level due diligence toward security testing ✦ Ad hoc growth of a security test suite for each software project

Success Metrics
✦ >50% of projects specifying security test cases in past 12 months ✦ >50% of stakeholders briefed on project status against security tests in past 6 months

Costs
✦ Buildout or license of security test cases ✦ Ongoing project overhead from maintenance and evaluation of security test cases

Personnel
✦ QA Testers (1-2 days/yr) ✦ Security Auditor (1-2 days/yr) ✦ Developers (1 day/yr) ✦ Architects (1 day/yr) ✦ Business Owners (1 day/yr)

Related Levels
✦ Security Requirements - 1

Security Testing - 1 (ST1)

Activities

A. Utilize automated security testing tools
In order to test for security issues, a potentially large number of input cases must be checked against each software interface, which can make effective security testing using manual test case implementation and execution unwieldy. Thus, automated security test tools should be used to automatically test software, resulting in more efficient security testing and higher quality results.

Both commercial and open-source products are available and should be reviewed for appropriateness for the organization. Selecting a suitable tool depends on several factors, including robustness and accuracy of built-in security test cases, efficacy at testing architecture types important to the organization, ability to customize or add test cases, quality and usability of findings to the development organization, etc.

Utilize input from security-savvy technical staff as well as development and quality assurance staff in the selection process, and review overall results with stakeholders.

B. Integrate security testing into development process
With tools to run automated security tests, projects within the organization should routinely run security tests and review results during development. In order to make this scalable with low overhead, security testing tools should be configured to automatically run on a routine basis, e.g. nightly or weekly, and findings should be inspected as they occur.

Conducting security tests as early as the requirements or design phases can be beneficial. While traditionally used for functional test cases, this type of test-driven development approach involves identifying and running relevant security test cases early in the development cycle, usually during design. With the automatic execution of security test cases, projects enter the implementation phase with a number of failing tests for the not-yet-implemented functionality, and implementation is complete when all the tests pass. This provides a clear, upfront goal for developers early in the development cycle, thus lowering the risk of release delays due to security concerns or forced acceptance of risk in order to meet project deadlines.

For each project release, results from automated and manual security tests should be presented to management and business stakeholders for review. If there are unaddressed findings that remain as accepted risks for the release, stakeholders and development managers should work together to establish a concrete timeframe for addressing them.
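A minimal sketch of what such a scheduled security-test job might look like is shown below. It assumes the project keeps its security test cases under a tests/security directory and that pytest is available; the directory, report location, and scheduling mechanism (e.g., a nightly cron job or CI server) are illustrative assumptions rather than part of SAMM.

```python
# A minimal sketch of a nightly security-test job; paths are hypothetical.
import datetime
import pathlib
import subprocess
import sys

REPORT_DIR = pathlib.Path("reports/security")


def run_security_tests() -> int:
    REPORT_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.date.today().isoformat()
    report = REPORT_DIR / f"security-tests-{stamp}.xml"
    # Run only the security test suite and keep a machine-readable report
    # so findings can be inspected as they occur.
    result = subprocess.run(
        [sys.executable, "-m", "pytest", "tests/security",
         f"--junitxml={report}"],
        check=False,
    )
    return result.returncode


if __name__ == "__main__":
    # A non-zero exit code signals failing security tests to the scheduler
    # without blocking day-to-day development work.
    raise SystemExit(run_security_tests())
```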

Make security testing during development more complete and efficient through automation

Results

✦ Deeper and more consistent verification of software functionality for security
✦ Development teams enabled to self-check and correct problems before release
✦ Stakeholders better aware of open vulnerabilities when making risk acceptance decisions

Add'l Success Metrics

✦ >50% of projects with security testing and stakeholder sign-off in past 6 months
✦ >80% of projects with access to automated security testing results in past 1 month

Add'l Costs

✦ Research and selection of automated security testing solution
✦ Initial cost and maintenance of automation integration
✦ Ongoing project overhead from automated security testing and mitigation

Add'l Personnel

✦ Developers (1 day/yr)
✦ Architects (1 day/yr)
✦ Managers (1-2 days/yr)
✦ Security Auditors (2 days/yr)
✦ QA Testers (3-4 days/yr)

Related Levels

Security Testing 2ST

Page 69: Software Assurance Maturity Model


Activities

A. Employ application-specific security testing automation
Through either customization of security testing tools, enhancements to generic test case execution tools, or buildout of custom test harnesses, project teams should formally iterate through security requirements and build a set of automated checkers to test the security of the implemented business logic.

Additionally, many automated security testing tools can be greatly improved in accuracy and depth of coverage if they are customized to understand more detail about the specific software interfaces in the project under test. Further, organization-specific concerns from compliance or technical standards can be codified as a reusable, central test battery to make audit data collection and per-project management visibility simpler.

Project teams should focus on buildout of granular security test cases based on the business functionality of their software, and an organization-level team led by a security auditor should focus on specification of automated tests for compliance and internal standards.

B. Establish release gates for security testing
To prevent software from being released with easily found security bugs, a particular point in the software development life-cycle should be identified as a checkpoint where an established set of security test cases must pass in order to make a release from the project. This establishes a baseline for the kinds of security tests all projects are expected to pass.

Since adding too many test cases initially can result in an overhead cost bubble, begin by choosing one or two security issues and include a wide variety of test cases for each with the expectation that no project may pass if any test fails. Over time, this baseline should be improved by selecting additional security issues and adding a variety of corresponding test cases.

Generally, this security testing checkpoint should occur toward the end of implementation or testing, but must occur before release.

For legacy systems or inactive projects, an exception process should be created to allow those projects to continue operations, but with an explicitly assigned timeframe for mitigation of findings. Exceptions should be limited to no more than 20% of all projects.
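One possible shape for such a release gate is sketched below, under the assumption that the baseline security tests emit a JUnit-style XML report; the report path is a hypothetical placeholder, and an organization's actual gate would be wired into its own build or release tooling.

```python
# A minimal sketch of a release gate that blocks a release when any baseline
# security test case is failing. The report path is a hypothetical example.
import sys
import xml.etree.ElementTree as ET

DEFAULT_REPORT = "reports/security/security-tests-latest.xml"


def security_gate(report_path: str) -> int:
    root = ET.parse(report_path).getroot()
    # Works whether the report root is <testsuite> or a wrapping <testsuites>.
    failures = sum(
        int(suite.get("failures", 0)) + int(suite.get("errors", 0))
        for suite in root.iter("testsuite")
    )
    if failures:
        print(f"RELEASE BLOCKED: {failures} baseline security test(s) failing")
        return 1
    print("Security gate passed: all baseline security test cases are green")
    return 0


if __name__ == "__main__":
    raise SystemExit(
        security_gate(sys.argv[1] if len(sys.argv) > 1 else DEFAULT_REPORT))
```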

Require application-specific security testing to ensure baseline security before deployment

Results

✦ Organization-wide baseline for expected application performance against attacks
✦ Customized security test suites to improve accuracy of automated analysis
✦ Project teams aware of objective goals for attack resistance

Add'l Success Metrics

✦ >50% of projects using security testing customizations
✦ >75% of projects passing all security tests in past 6 months

Add'l Costs

✦ Buildout and maintenance of customizations to security testing automation
✦ Ongoing project overhead from security testing audit process
✦ Organization overhead from project delays caused by failed security testing audits

Add'l Personnel

✦ Architects (1 day/yr)
✦ Developers (1 day/yr)
✦ Security Auditors (1-2 days/yr)
✦ QA Testers (1-2 days/yr)
✦ Business Owners (1 day/yr)
✦ Managers (1 day/yr)

Related Levels

✦ Policy & Compliance - 2
✦ Secure Architecture - 3

Security Testing 3ST

Page 70: Software Assurance Maturity Model


Vulnerability Management

Objective
1VM: Understand high-level plan for responding to vulnerability reports or incidents
2VM: Elaborate expectations for response process to improve consistency and communications
3VM: Improve analysis and data gathering within response process for feedback into proactive planning

Activities
1VM: A. Identify point of contact for security issues
     B. Create informal security response team(s)
2VM: A. Establish consistent incident response process
     B. Adopt a security issue disclosure process
3VM: A. Conduct root cause analysis for incidents
     B. Collect per-incident metrics

Assessment
1VM: ✦ Do most projects have a point of contact for security issues?
     ✦ Does your organization have an assigned security response team?
     ✦ Are most project teams aware of their security point(s) of contact and response team(s)?
2VM: ✦ Does the organization utilize a consistent process for incident reporting and handling?
     ✦ Are most project stakeholders aware of relevant security disclosures related to their software projects?
3VM: ✦ Are most incidents inspected for root causes to generate further recommendations?
     ✦ Do most projects consistently collect and report data and metrics related to incidents?

Results
1VM: ✦ Lightweight process in place to handle high-priority vulnerabilities or incidents
     ✦ Framework for stakeholder notification and reporting of events with security impact
     ✦ High-level due diligence for handling security issues
2VM: ✦ Communications plan for dealing with vulnerability reports from third-parties
     ✦ Clear process for releasing security patches to software operators
     ✦ Formal process for tracking, handling, and internally communicating about incidents
3VM: ✦ Detailed feedback for organizational improvement after each incident
     ✦ Rough cost estimation from vulnerabilities and compromises
     ✦ Stakeholders better able to make tradeoff decisions based on historic incident trends

Page 71: Software Assurance Maturity Model


Activities

A. Identify point of contact for security issues
For each division within the organization or for each project team, establish a point of contact to serve as a communications hub for security information. While generally this responsibility will not claim much time from the individuals, the purpose of having a predetermined point of contact is to add structure and governance for vulnerability management.

Examples of events that would involve the point of contact include receipt of a vulnerability report from an external entity, compromise or other security failure of software in the field, internal discovery of high-risk vulnerabilities, etc. In the case of such an event, the closest point of contact would step in as an extra resource and advisor to the affected project team(s) to provide technical guidance and brief other stakeholders on the progress of mitigation efforts.

The point of contact should be chosen from security-savvy technical or management staff with a breadth of knowledge over the software projects in the organization. A list of these assigned security points of contact should be centrally maintained and updated at least every six months. Additionally, publishing and advertising this list allows staff within the organization to request help and work directly with one another on security problems.

B. Create informal security response team(s)
From the list of individuals assigned responsibility as a security point of contact or from dedicated security personnel, select a small group to serve as a centralized technical security response team. The responsibilities of the team will include directly taking ownership of security incidents or vulnerability reports and being responsible for triage, mitigation, and reporting to stakeholders.

Given their responsibility when tapped, members of the security response team are also responsible for executive briefings and upward communication during an incident. Most of the time, the security response team will likely not be operating in this capacity, but they must be flexible enough to respond quickly, or a smooth process must exist for deferring an incident to another team member.

The response team should hold a meeting at least annually to brief security points of contact on the response process and high-level expectations for security-related reporting from project teams.

Understand high-level plan for responding to vulnerability reports or incidents

Results

✦ Lightweight process in place to handle high-priority vulnerabilities or incidents
✦ Framework for stakeholder notification and reporting of events with security impact
✦ High-level due diligence for handling security issues

Success Metrics

✦ >50% of the organization briefed on closest security point of contact in past 6 months
✦ >1 meeting of security response team and points of contact in past 12 months

Costs

✦ Ongoing variable project overhead from staff filling the security point of contact roles
✦ Identification of appropriate security response team

Personnel

✦ Security Auditors (1 day/yr)
✦ Architects (1 day/yr)
✦ Managers (1 day/yr)
✦ Business Owners (1 day/yr)

Related Levels

✦ Education & Guidance - 2
✦ Strategy & Metrics - 3

Vulnerability Management 1VM

Page 72: Software Assurance Maturity Model


Activities

A. Establish consistent incident response process
Extending from the informal security response team, explicitly document the organization’s incident response process as well as the procedures that team members are expected to follow. Additionally, each member of the security response team must be trained on this material at least annually.

There are several tenets of a sound incident response process, including initial triage to prevent additional damage, change management and patch application, managing project personnel and others involved in the incident, forensic evidence collection and preservation, limiting communication about the incident to stakeholders, well-defined reporting to stakeholders and/or communications trees, etc.

With development teams, the security responders should work together to conduct the technical analysis to verify facts and assumptions about each incident or vulnerability report. Likewise, when project teams detect an incident or high-risk vulnerability, they should follow an internal process that puts them in contact with a member of the security response team.

B. Adopt a security issue disclosure process
For most organizations, it is undesirable to let news of a security problem become public, but there are several important ways in which internal-to-external communications on security issues should be fulfilled.

The first and most common is through creation and deployment of security patches for the software produced by the organization. Generally, if all software projects are only used internally, then this becomes less critical, but for all contexts where the software is being operated by parties external to the organization, a patch release process must exist. It should provide for several factors including change management and regression testing prior to patch release, announcement to operators/users with assigned criticality category for the patch, sparse technical details so that an exploit cannot be directly derived, etc.

Another avenue for external communications is with third parties that report security vulnerabilities in an organization’s software. By adopting and externally posting the expected process with timeframes for response, vulnerability reporters are encouraged to follow responsible disclosure practices.

Lastly, many states and countries legally require external communications for incidents involving theft of personally identifiable information and other sensitive data types. Should this type of incident occur, the security response team should work with managers and business stakeholders to determine appropriate next steps.

Elaborate expectations for response process to improve consistency and communications

Results

✦ Communications plan for dealing with vulnerability reports from third-parties
✦ Clear process for releasing security patches to software operators
✦ Formal process for tracking, handling, and internally communicating about incidents

Add'l Success Metrics

✦ >80% of project teams briefed on incident response process in past 6 months
✦ >80% of stakeholders briefed on security issue disclosures in past 6 months

Add'l Costs

✦ Ongoing organization overhead from incident response process

Add'l Personnel

✦ Security Auditors (3-5 days/yr)
✦ Managers (1-2 days/yr)
✦ Business Owners (1-2 days/yr)
✦ Support/Operators (1-2 days/yr)

Related Levels

Vulnerability Management 2VM

Page 73: Software Assurance Maturity Model


Activities

A. Conduct root cause analysis for incidents
Though potentially time consuming, the incident response process should be augmented to include additional analysis to identify the key, underlying security failures. These root causes can be technical problems such as code-level vulnerabilities, configuration errors, etc., or they can be people/process problems such as social engineering, failure to follow procedures, etc.

Once a root cause is identified for an incident, it should be used as a tool to find other potential weaknesses in the organization where an analogous incident could have occurred. For each identified weakness, additional recommendations for proactive mitigations should be communicated as part of closing out the original incident response effort.

Any recommendations based on root cause analysis should be reviewed by management and relevant business stakeholders in order to either schedule mitigation activities or note the accepted risks.

B. Collect per-incident metrics
By having a centralized process to handle all compromise and high-priority vulnerability reports, an organization is enabled to take measurements of trends over time to determine impact and efficiency of initiatives for security assurance.

Records of past incidents should be stored and reviewed at least every 6 months. Group similar incidents and simply tally the overall count for each type of problem. Additional measurements to take from the incidents include frequency of software projects affected by incidents, system downtime and cost from loss of use, human resources taken in handling and cleanup of the incident, estimates of long-term costs such as regulatory fines or brand damage, etc. For root causes that were technical in nature, it is also helpful to identify what kind of proactive, review, or operational practice might have detected the problem earlier or lessened the damage.

This information is concrete feedback into the program planning process since it represents the real security impact that the organization has felt over time.
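A minimal sketch of the kind of collation described above is shown below; the incident record format, categories, and sample values are illustrative assumptions, not a prescribed schema.

```python
# A minimal sketch of tallying per-incident metrics for a semi-annual review.
# The record fields and categories are hypothetical examples.
from collections import Counter
from dataclasses import dataclass


@dataclass
class Incident:
    category: str            # e.g. "sql-injection", "social-engineering"
    project: str
    downtime_hours: float
    cleanup_person_days: float


def summarize(incidents: list[Incident]) -> None:
    by_type = Counter(i.category for i in incidents)
    affected_projects = {i.project for i in incidents}
    total_downtime = sum(i.downtime_hours for i in incidents)
    total_effort = sum(i.cleanup_person_days for i in incidents)

    print("Incidents by type:", dict(by_type))
    print("Projects affected:", len(affected_projects))
    print(f"Total downtime: {total_downtime:.1f} hours")
    print(f"Cleanup effort: {total_effort:.1f} person-days")


if __name__ == "__main__":
    summarize([
        Incident("sql-injection", "billing", 4.0, 3.5),
        Incident("sql-injection", "portal", 1.0, 1.0),
        Incident("lost-credentials", "portal", 0.0, 0.5),
    ])
```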

Improve analysis and data gathering within response process for feedback into proactive planning

Results

✦ Detailed feedback for organizational improvement after each incident
✦ Rough cost estimation from vulnerabilities and compromises
✦ Stakeholders better able to make tradeoff decisions based on historic incident trends

Add'l Success Metrics

✦ >80% of incidents documented with root causes and further recommendations in past 6 months
✦ >80% of incidents collated for metrics in the past 6 months

Add'l Costs

✦ Ongoing organization overhead from conducting deeper research and analysis of incidents
✦ Ongoing organization overhead from collection and review of incident metrics

Add'l Personnel

✦ Security Auditors (3 days/yr)
✦ Managers (2 days/yr)
✦ Business Owners (2 days/yr)

Related Levels

✦ Strategy & Metrics - 3

Vulnerability Management 3VM

Page 74: Software Assurance Maturity Model


Environment Hardening

Objective
1EH: Understand baseline operational environment for applications and software components
2EH: Improve confidence in application operations by hardening the operating environment
3EH: Validate application health and status of operational environment against known best practices

Activities
1EH: A. Maintain operational environment specification
     B. Identify and install critical security upgrades and patches
2EH: A. Establish routine patch management process
     B. Monitor baseline environment configuration status
3EH: A. Identify and deploy relevant operations protection tools
     B. Expand audit program for environment configuration

Assessment
1EH: ✦ Do the majority of projects document some requirements for the operational environment?
     ✦ Do most projects check for security updates to third-party software components?
2EH: ✦ Is a consistent process used to apply upgrades and patches to critical dependencies?
     ✦ Do most projects leverage automation to check application and environment health?
3EH: ✦ Are stakeholders aware of options for additional tools to protect software while running in operations?
     ✦ Does routine audit check most projects for baseline environment health?

Results
1EH: ✦ Clear understanding of operational expectations within the development team
     ✦ High-priority risks from underlying infrastructure mitigated on a well-understood timeline
     ✦ Software operators with a high-level plan for security-critical maintenance of infrastructure
2EH: ✦ Granular verification of security characteristics of systems in operations
     ✦ Formal expectations on timelines for infrastructure risk mitigation
     ✦ Stakeholders consistently aware of current operations status of software projects
3EH: ✦ Reinforced operational environment with layered checks for security
     ✦ Established and measured goals for operational maintenance and performance
     ✦ Reduced likelihood of successful attack via flaws in external dependencies

Page 75: Software Assurance Maturity Model


Activities

A. Maintain operational environment specification
For each project, a concrete definition of the expected operating platforms should be created and maintained. Depending on the organization, this specification should be jointly created with development staff, stakeholders, support and operations groups, etc.

Begin this specification by capturing all details that must be true about the operating environment based upon the business function of the software. These can include factors such as processor architecture, operating system versions, prerequisite software, conflicting software, etc. Further, note any known user- or operator-configurable options about the operating environment that affect the way in which the software will behave.

Additionally, identify any relevant assumptions about the operating environment that were made in design and implementation of the project and capture those assumptions in the specification.

This specification should be reviewed and updated at least every 6 months for active projects or more often if changes are being made to the software design or the expected operating environment.
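For illustration only, a specification of this kind could be kept in a simple machine-readable form so it is easy to review and diff; the fields and values below are hypothetical examples, not a required format.

```python
# A hypothetical, minimal operational environment specification kept with the
# project. All field names and values are illustrative examples only.
OPERATIONAL_ENVIRONMENT_SPEC = {
    "last_reviewed": "2009-06-01",            # reviewed at least every 6 months
    "processor_architectures": ["x86", "x86_64"],
    "operating_systems": ["Windows Server 2003 SP2", "RHEL 5.x"],
    "prerequisite_software": ["Java 6 runtime", "PostgreSQL 8.3"],
    "conflicting_software": ["legacy monitoring agent v1.x"],
    "configurable_options": {
        # operator-configurable settings that affect security behavior
        "tls_required": True,
        "session_timeout_minutes": 30,
    },
    "design_assumptions": [
        "application is deployed behind the corporate reverse proxy",
        "database is not reachable from untrusted networks",
    ],
}

if __name__ == "__main__":
    for key, value in OPERATIONAL_ENVIRONMENT_SPEC.items():
        print(f"{key}: {value}")
```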

B. Identify and install critical security upgrades and patches
Most applications are software that runs on top of another large stack of software composed of built-in programming language libraries, third-party components and development frameworks, base operating systems, etc. Because security flaws contained in any module in that large software stack affect the overall security of the organization’s software, critical security updates for elements of the technology stack must be installed.

As such, regular research or ongoing monitoring of high-risk dependencies should be performed to stay abreast of the latest fixes to security flaws. Upon identification of a critical upgrade or patch that would impact the security posture of the software project, plans should be made to get affected users and operators to update their installations. Depending on the type of software project, details on doing this can vary.

Understand baseline operational environment for applications and software components

Results

✦ Clear understanding of operational expectations within the development team
✦ High-priority risks from underlying infrastructure mitigated on a well-understood timeline
✦ Software operators with a high-level plan for security-critical maintenance of infrastructure

Success Metrics

✦ >50% of projects with updated operational environment specification in past 6 months
✦ >50% of projects with updated list of relevant critical security patches in past 6 months

Costs

✦ Ongoing project overhead from buildout and maintenance of operational environment specification
✦ Ongoing project overhead from monitoring and installing critical security updates

Personnel

✦ Developers (1-2 days/yr)
✦ Architects (1-2 days/yr)
✦ Managers (2-4 days/yr)
✦ Support/Operators (3-4 days/yr)

Related Levels

✦ Operational Enablement - 2

Environment Hardening 1EH

Page 76: Software Assurance Maturity Model


Activities

A. Establish routine patch management process
Moving to a more formal process than ad hoc application of critical upgrades and patches, an ongoing process should be created in the organization to consistently apply updates to software dependencies in the operating environment.

In the most basic form, the process should aim to make guarantees for the time lapse between the release and the application of security upgrades and patches. To make this process efficient, organizations typically accept higher latency on lower-priority updates, e.g. a maximum of 2 days for critical patches, ranging up to a maximum of 30 days for low-priority patches.

This activity should be primarily conducted by support and operations staff, but routine meetings with development should also be conducted to keep the whole project abreast of past changes and scheduled upgrades.

Additionally, development staff should share a list of third-party components upon which the software project internally depends so that support and operations staff can monitor those as well to cue development teams on when an upgrade is required.
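A small sketch of how such latency guarantees might be made explicit and checked is shown below; the severity levels and day limits are illustrative policy choices, not values prescribed by SAMM.

```python
# A minimal sketch of patch-latency guarantees; severity names and maximum
# day counts are hypothetical policy values.
import datetime

MAX_DAYS_BY_SEVERITY = {"critical": 2, "high": 7, "medium": 14, "low": 30}


def patch_due_date(released: datetime.date, severity: str) -> datetime.date:
    """Latest acceptable date to apply a vendor patch of the given severity."""
    return released + datetime.timedelta(days=MAX_DAYS_BY_SEVERITY[severity])


if __name__ == "__main__":
    release = datetime.date(2009, 3, 1)
    for severity in MAX_DAYS_BY_SEVERITY:
        print(f"{severity} patch released {release} -> "
              f"apply by {patch_due_date(release, severity)}")
```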

B. Monitor baseline environment configuration status
Given the complexity of monitoring and managing patches alone across the variety of components composing the infrastructure for a software project, automation tools should be utilized to automatically monitor systems for soundness of configuration.

There are both commercial and open-source tools available to provide this type of functionality, so project teams should select a solution based on appropriateness to the organization’s needs. Typical selection criteria include ease of deployment and customization, applicability to the organization’s platforms and technology stacks, built-in features for change management and alerting, metrics collection and trend tracking, etc.

In addition to host and platform checks, monitoring automation should be customized to perform application-specific health checks and configuration verifications. Support and operations personnel should work with architects and developers to determine the optimal amount of monitoring for a given software project.

Ultimately, after a solution is deployed for monitoring the environment’s configuration status, unexpected alerts or configuration changes should be collected and regularly reviewed by project stakeholders as often as weekly but at least once per quarter.
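As a sketch of the application-specific checks mentioned above, a monitoring job could compare a host's configuration against an agreed baseline and report any drift; the configuration file path, sections, and expected values below are hypothetical.

```python
# A minimal sketch of an application-specific configuration baseline check.
# The file path and expected settings are hypothetical examples.
import configparser

EXPECTED = {
    ("security", "tls_required"): "true",
    ("security", "session_timeout_minutes"): "30",
    ("logging", "audit_log_enabled"): "true",
}


def check_configuration(path: str) -> list[str]:
    """Return a list of deviations from the expected baseline."""
    parser = configparser.ConfigParser()
    parser.read(path)
    deviations = []
    for (section, option), expected in EXPECTED.items():
        actual = parser.get(section, option, fallback="<missing>")
        if actual != expected:
            deviations.append(
                f"{section}.{option}: expected {expected!r}, found {actual!r}")
    return deviations


if __name__ == "__main__":
    for problem in check_configuration("/etc/myapp/app.ini"):
        print("CONFIG DRIFT:", problem)
```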

Improve confidence in application operations by hardening the operating environment

Results

✦ Granular verification of security characteristics of systems in operations
✦ Formal expectations on timelines for infrastructure risk mitigation
✦ Stakeholders consistently aware of current operations status of software projects

Add'l Success Metrics

✦ >80% of project teams briefed on patch management process in past 12 months
✦ >80% of stakeholders aware of current patch status in past 6 months

Add'l Costs

✦ Ongoing organization overhead from patch management and monitoring
✦ Buildout or license of infrastructure monitoring tools

Add'l Personnel

✦ Architects (1-2 days/yr)
✦ Developers (1-2 days/yr)
✦ Business Owners (1-2 days/yr)
✦ Managers (1-2 days/yr)
✦ Support/Operators (3-4 days/yr)

Related Levels

Environment Hardening 2EH

Page 77: Software Assurance Maturity Model


Activities

A. Identify and deploy relevant operations protection tools
In order to build a better assurance case for software in its operating environment, additional tools can be used to enhance the security posture of the overall system. Operational environments can vary dramatically, thus the appropriateness of a given protection technology should be considered in the project context.

Commonly used protection tools include web application firewalls, XML security gateways for web services, anti-tamper and obfuscation packages for client/embedded systems, network intrusion detection/prevention systems for legacy infrastructure, forensic log aggregation tools, host-based integrity verification tools, etc.

Based on organization- and project-specific knowledge, technical stakeholders should work with support and operations staff to identify and recommend selected operations protection tools to business stakeholders. If deemed a valuable investment in terms of risk reduction versus cost of implementation, stakeholders should agree on plans for a pilot, widespread rollout, and ongoing maintenance.

B. Expand audit program for environment configuration
When conducting routine project-level audits, expand the review to include inspection of artifacts related to hardening the operating environment. Beyond an up-to-date specification for the operational environment, audits should inspect current patch status and historic data since the previous audit. By tapping into monitoring tools, audits can also verify key factors about application configuration management and historic changes. Audits should also inspect the usage of operations protection tools against those available for the software’s architecture type.

Audits for infrastructure can occur at any point after a project’s initial release and deployment, but should occur at least every 6 months. For legacy systems or projects without active development, infrastructure audits should still be conducted and reviewed by business stakeholders. An exception process should be created to allow special-case projects to continue operations, but with an explicitly assigned timeframe for mitigation of findings. Exceptions should be limited to no more than 20% of all projects.

Validate application health and status of operational environment against known best practices

Results

✦ Reinforced operational environment with layered checks for security
✦ Established and measured goals for operational maintenance and performance
✦ Reduced likelihood of successful attack via flaws in external dependencies

Add'l Success Metrics

✦ >80% of stakeholders briefed on relevant operations protection tools in past 6 months
✦ >75% of projects passing infrastructure audits in past 6 months

Add'l Costs

✦ Research and selection of operations protection solutions
✦ Buildout or license of operations protection tools
✦ Ongoing operations overhead from maintenance of protection tools
✦ Ongoing project overhead from infrastructure-related audits

Add'l Personnel

✦ Business Owners (1 day/yr)
✦ Managers (1-2 days/yr)
✦ Support/Operators (3-4 days)

Related Levels

✦ Policy & Compliance - 2

Environment Hardening 3EH

Page 78: Software Assurance Maturity Model


Operational Enablement

Objective
1OE: Enable communications between development teams and operators for critical security-relevant data
2OE: Improve expectations for continuous secure operations through provision of detailed procedures
3OE: Mandate communication of security information and validate artifacts for completeness

Activities
1OE: A. Capture critical security information for deployment
     B. Document procedures for typical application alerts
2OE: A. Create per-release change management procedures
     B. Maintain formal operational security guides
3OE: A. Expand audit program for operational information
     B. Perform code signing for application components

Assessment
1OE: ✦ Do you deliver security notes with the majority of software releases?
     ✦ Are security-related alerts and error conditions documented for most projects?
2OE: ✦ Are most projects utilizing a change management process that’s well understood?
     ✦ Do project teams deliver an operational security guide with each product release?
3OE: ✦ Are most projects being audited to check each release for appropriate operational security information?
     ✦ Is code signing routinely performed on software components using a consistent process?

Results
1OE: ✦ Ad hoc improvements to software security posture through better understanding of correct operations
     ✦ Operators and users aware of their role in ensuring secure deployment
     ✦ Improved communications between software developers and users for security-critical information
2OE: ✦ Detailed guidance for security-relevant changes delivered with software releases
     ✦ Updated information repository on secure operating procedures per application
     ✦ Alignment of operations expectations among developers, operators, and users
3OE: ✦ Organization-wide understanding of expectations for security-relevant documentation
     ✦ Stakeholders better able to make tradeoff decisions based on feedback from deployment and operations
     ✦ Operators and/or users able to independently verify integrity of software releases

Page 79: Software Assurance Maturity Model


Activities

A. Capture critical security information for deployment
With software-specific knowledge, project teams should identify any security-relevant configuration and operations information and communicate it to users and operators. This enables the actual security posture of software at deployment sites to function in the same way that designers in the project team intended.

This analysis should begin with architects and developers building a list of security features built into the software. From that list, information about configuration options and their security impact should be captured as well. For projects that offer several different deployment models, information about the security ramifications of each should be noted to better inform users and operators about the impact of their choices.

Overall, the list should be lightweight and aim to capture the most critical information. Once initially created, it should be reviewed by the project team and business stakeholders for agreement. Additionally, it is effective to review this list with select operators or users in order to ensure the information is understandable and actionable. Project teams should review and update this information with every release, but must do so at least every 6 months.

B. Document procedures for typical application alerts
With specific knowledge of the ways in which the software behaves, project teams should identify the most important error and alert messages that require user/operator attention. For each identified event, information related to appropriate user/operator actions in response to the event should be captured.

From the potentially large set of events that the software might generate, select the highest priority set based on relevance in terms of the business purpose of the software. This should include any security-related events, but also may include critical errors and alerts related to software health and configuration status.

For each event, actionable advice should be captured to inform users and operators of required next steps and potential root causes of the event. These procedures must be reviewed by the project team and updated at least every 6 months or at every major product release, and can be updated more frequently, e.g. with each release.
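As a sketch only, event codes can be paired with their documented operator procedure at the point where alerts are raised, so the captured advice stays attached to the alert itself; the event codes, guide section references, and logger name below are hypothetical.

```python
# A minimal sketch of pairing security-relevant alerts with their documented
# operator procedures. Event codes and guide references are hypothetical.
import logging

logger = logging.getLogger("myapp.security")

ALERT_PROCEDURES = {
    "AUTH-LOCKOUT": "Verify whether the lockout is expected; see ops guide section 4.2.",
    "AUDIT-LOG-FULL": "Rotate and archive the audit log before writes are disabled; section 5.1.",
    "CONFIG-TAMPER": "Treat as a potential incident and contact the security point of contact.",
}


def raise_alert(code: str, detail: str) -> None:
    """Log an operator-facing alert together with its documented next step."""
    procedure = ALERT_PROCEDURES.get(
        code, "No documented procedure; escalate to support.")
    logger.warning("%s: %s | Operator action: %s", code, detail, procedure)


if __name__ == "__main__":
    logging.basicConfig(level=logging.WARNING)
    raise_alert("AUTH-LOCKOUT", "5 failed logins for user 'admin' within 2 minutes")
```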

Enable communications between development teams and operators for critical security-relevant data

Results

✦ Ad hoc improvements to software security posture through better understanding of correct operations
✦ Operators and users aware of their role in ensuring secure deployment
✦ Improved communications between software developers and users for security-critical information

Success Metrics

✦ >50% of projects with updated deployment security information in past 6 months
✦ >50% of projects with operational procedures for events updated in past 6 months

Costs

✦ Ongoing project overhead from maintenance of deployment security information
✦ Ongoing project overhead from maintenance of critical operating procedures

Personnel

✦ Developers (1-2 days/yr)
✦ Architects (1-2 days/yr)
✦ Managers (1 day/yr)
✦ Support/Operators (1 day/yr)

Related Levels

Operational Enablement 1OE

Page 80: Software Assurance Maturity Model


Activities

A. Create per-release change management procedures
To more formally update users and operators on relevant changes in the software, each release must include change management procedures relevant to upgrade and first-time installation. Overall, the goal is to capture the expected accompanying steps that ensure the deployment will be successful and not incur excessive downtime or degradation of security posture.

To build these procedures during development, project teams should set up a lightweight internal process for capturing relevant items that would impact deployments. It is effective to have this process in place early in the development cycle so that this information can be retained as soon as it is identified during the requirements, design, and implementation phases.

Before each release, the project team should review the list as a whole for completeness and feasibility. For some projects, extensive change procedures accompanying a given release may warrant special handling, such as building automated upgrade scripts to prevent errors during deployment.

B. Maintain formal operational security guides
Starting from the information captured on critical software events and the procedures for handling each, project teams should build and maintain formal guides that capture all the security-relevant information that users and operators need to know.

Initially, this guide should be built from the known information about the system, such as security-related configuration options, event handling procedures, installation and upgrade guides, operational environment specifications, security-related assumptions about the deployment environment, etc. Extending this, the formal operational security guide should elaborate on each of these in more detail, such that the majority of users and operators will find answers to the questions they are likely to have. For large or complex systems, this can be challenging, so project teams should work with business stakeholders to determine the appropriate level of documentation. Additionally, project teams should document any recommendations for deployments that would enhance security.

The operational security guide, after initial creation, should be reviewed by project teams and updated with each release.

Improve expectations for continuous secure operations through provision of detailed procedures

Results

✦ Detailed guidance for security-relevant changes delivered with software releases
✦ Updated information repository on secure operating procedures per application
✦ Alignment of operations expectations among developers, operators, and users

Add'l Success Metrics

✦ >50% of projects with updated change management procedures in past 6 months
✦ >80% of stakeholders briefed on status of operational security guides in past 6 months

Add'l Costs

✦ Ongoing project overhead from maintenance of change management procedures
✦ Ongoing project overhead from maintenance of operational security guides

Add'l Personnel

✦ Developers (1-2 days/yr)
✦ Architects (1-2 days/yr)
✦ Managers (1 day/yr)
✦ Support/Operators (1 day/yr)

Related Levels

✦ Environment Hardening - 1

Operational Enablement 2OE

Page 81: Software Assurance Maturity Model


Activities

A. Expand audit program for operational information
When conducting routine project-level audits, expand the review to include inspection of artifacts related to operational enablement for security. Projects should be checked to ensure they have an updated and complete operational security guide as relevant to the specifics of the software.

These audits should begin toward the end of the development cycle close to release, but must be completed and passed before a release can be made. For legacy systems or inactive projects, this type of audit should be conducted and a one-time effort should be made to address findings and verify audit compliance, after which additional audits for operational enablement are no longer required.

Audit results must be reviewed with business stakeholders prior to release. An exception process should be created to allow projects failing an audit to continue with a release, but these projects should have a concrete timeline for mitigation of findings. Exceptions should be limited to no more than 20% of all active projects.

B. Perform code signing for application components
Though often used with special-purpose software, code signing allows users and operators to perform integrity checks on software such that they can cryptographically verify the authenticity of a module or release. By signing software modules, the project team enables deployments to operate with a greater degree of assurance against any corruption or modification of the deployed software in its operating environment.

Signing code incurs overhead for management of signing credentials for the organization. An organization must follow safe key management processes to ensure the ongoing confidentiality of the signing keys. When dealing with any cryptographic keys, project stakeholders must also consider plans for dealing with common operational problems related to cryptography such as key rotation, key compromise, or key loss.

Since code signing is not appropriate for everything, architects and developers should work with security auditors and business stakeholders to determine which parts of the software should be signed. As projects evolve, this list should be reviewed with each release, especially when adding new modules or making changes to previously signed components.
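A simplified sketch of signing and verifying a release artifact is shown below using the third-party 'cryptography' package. It is illustrative only: real deployments would typically use platform-specific signing tools (for example, jarsigner or Authenticode) and keep the signing key in dedicated, protected storage rather than generating it in place as done here.

```python
# A minimal sketch of detached signing and verification of a release artifact
# with RSA-PSS. Key handling is deliberately simplified for illustration.
import pathlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

PSS_PADDING = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                          salt_length=padding.PSS.MAX_LENGTH)


def sign_artifact(private_key: rsa.RSAPrivateKey, data: bytes) -> bytes:
    return private_key.sign(data, PSS_PADDING, hashes.SHA256())


def verify_artifact(public_key: rsa.RSAPublicKey,
                    data: bytes, signature: bytes) -> bool:
    try:
        public_key.verify(signature, data, PSS_PADDING, hashes.SHA256())
        return True
    except InvalidSignature:
        return False


if __name__ == "__main__":
    # For illustration only; a real signing key would never be generated here.
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    artifact = pathlib.Path(__file__).read_bytes()  # stand-in for a release module
    signature = sign_artifact(key, artifact)
    print("signature valid:", verify_artifact(key.public_key(), artifact, signature))
```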

Mandate communication of security information and validate artifacts for completeness

Results

✦ Organization-wide understanding of expectations for security-relevant documentation
✦ Stakeholders better able to make tradeoff decisions based on feedback from deployment and operations
✦ Operators and/or users able to independently verify integrity of software releases

Add'l Success Metrics

✦ >80% of projects with updated operational security guide in last 6 months
✦ >80% of stakeholders briefed on code signing options and status in past 6 months

Add'l Costs

✦ Ongoing project overhead from audit of operational guides
✦ Ongoing organization overhead from management of code signing credentials
✦ Ongoing project overhead from identification and signing of code modules

Add'l Personnel

✦ Developers (1 day/yr)
✦ Architects (1 day/yr)
✦ Managers (1 day/yr)
✦ Security Auditors (1-2 days/yr)

Related Levels

Operational Enablement 3OE

Page 82: Software Assurance Maturity Model

Case Studies
A walkthrough of example scenarios

Page 83: Software Assurance Maturity Model

This section features a selection of scenarios in which the application of SAMM is explained in the context of a specific business case. Using the roadmap templates as a guide, the case studies tell the story of how an organization might adapt best practices and take into account organization-specific risks when building a security assurance program.

Page 84: Software Assurance Maturity Model


VirtualWare
Case Study: Medium-sized Independent Software Vendor

Business Profile

VirtualWare is a leader within their market for providing integrated virtualized application platforms to help organizations consolidate their application interfaces into a single environment. Their technology is provided as a server application and desktop client built for multiple environments including Microsoft, Apple and Linux platforms.

The organization is of medium size (200-1000 employees) and has a global presence with branch offices in most major countries.

Organization

VirtualWare has been developing their core software platform for over 8 years. During this time they have had limited risk from common web vulnerabilities due to minimal usage of web interfaces. Most of the VirtualWare platforms are run through either server-based systems or thick clients running on the desktop.

Recently VirtualWare started a number of new project streams, which deliver their client and server interfaces via web technology. Given the extent of common attacks seen on the web, the organization has been driven to review its software security strategy and ensure that it adequately addresses possible threats going forward.

Previously, the organization had undertaken basic reviews of the application code and had been more focused on performance and functionality than on security. VirtualWare developers have been using a number of code quality analysis tools to identify bugs and address them within the code.

With this in mind, the upper management team has set a strategic objective to review the current status of the security of their applications and determine the best method of identifying, removing, and preventing vulnerabilities in them.

Environment

VirtualWare develops their virtualization technology on a mixture of Java, C++ and Microsoft .NET technology. Their core application virtualization technology has been written in C++ and has had a number of reviews for bugs and security, but currently no formal process exists for identifying and fixing known or unknown security bugs.

VirtualWare has chosen to support their web technology on Java, although the back-end systems are built using Microsoft and C++ technologies. The development team focused on the new web interfaces is primarily composed of Java developers.

VirtualWare employs over 300 developers, with staff broken up into teams based on the projects that they work on. There are 12 teams with around 20–40 developers per team. Within each team there is minimal experience with software security, and although senior developers perform basic assessments of their code, security is not considered a critical goal within the organization.

Each team within VirtualWare adopts a different development model. Currently the two primary methodologies used are Agile SCRUM and iterative Waterfall style approaches. There is minimal to no guidance from the IT department or project architects on software security.

Page 85: Software Assurance Maturity Model


Key Challenges

✦ Rapid release of application features to ensure they maintain their competitive edge over rivals
✦ Limited experience with software security concepts; currently minimal effort is associated with security-related tasks
✦ Developers leave the organization and are replaced with less experienced developers
✦ Multiple technologies used within applications, with legacy applications that have not been updated since originally built
✦ No understanding of existing security posture or risks facing the organization

VirtualWare wanted to focus on ensuring that their new web applications would be delivered securely to their customers. Therefore, the initial focus of the security assurance program was on education and awareness for their development teams, as well as providing some base technical guidance on secure coding and testing standards.

The organization previously had received bug requests and security vulnerability reports through their [email protected] address. However, as this was a general support address, existing requests were not always filtered down to the appropriate teams within the organization and handled correctly. The need to implement a formal security vulnerability response program was also identified by VirtualWare.

Implementation Strategy

The adoption of a security assurance program within an organization is a long-term strategy that significantly impacts the culture of developers and the processes the business uses to develop and deliver applications. The adoption of this strategy is set over a 12-month period, and due to the size of the organization it will be relatively easy to implement in that period.

[Roadmap diagram: the twelve SAMM Security Practices (Strategy & Metrics, Policy & Compliance, Education & Guidance, Threat Assessment, Security Requirements, Secure Architecture, Design Review, Code Review, Security Testing, Vulnerability Management, Environment Hardening, Operational Enablement) mapped across implementation Phases 1 through 4.]

Page 86: Software Assurance Maturity Model


Phase 1 (Months 0–3) – Awareness & Planning

VirtualWare previously identified that they had limited knowledge and awareness of application security threats to their organization and limited secure coding experience. The first phase of the deployment within VirtualWare focused on training developers and implementing guidance and programs to identify current security vulnerabilities.

Development teams within VirtualWare had limited experience in secure coding techniques; therefore, an initial training program on defensive programming techniques was developed for the developers within the organization.

With over 300 developers and multiple languages supported within the organization, one of the key challenges for VirtualWare was to provide an education program that was technical enough to teach developers the basics of secure coding concepts. The objective of this initial education course was primarily coding techniques and testing tools. The course developed and delivered within the organization lasted for 1 day and covered the basics of secure coding.

VirtualWare was aware that they had a number of applications with vulnerabilities and no real strategy for identifying existing vulnerabilities and addressing the risks in a reasonable timeframe. A basic risk assessment methodology was adopted, and the organization undertook a review of the existing application platforms.

This phase also included introducing a number of concepts to the development teams to enhance their security tooling. The development teams already had a number of tools available to perform code quality assessments, and additional investigation into code review and security testing tools was performed.

Target Objectives

During this phase of the project, VirtualWare implemented the following SAMM Practices & Activities.

1SM: A. Estimate overall business risk profile
     B. Build and maintain assurance program roadmap
1EG: A. Conduct technical security awareness training
     B. Build and maintain technical guidelines
1SR: A. Derive security requirements from business functionality
     B. Evaluate security and compliance guidance for requirements
1CR: A. Create review checklists from known security requirements
     B. Perform point-review of high-risk code
1ST: A. Derive test cases from known security requirements
     B. Conduct penetration testing on software releases
1VM: A. Identify point of contact for security issues
     B. Create informal security response team(s)

Page 87: Software Assurance Maturity Model


To achieve these maturity levels, VirtualWare implemented a number of programs during this phase of the roll-out. The following initiatives were adopted:

✦ A 1-day secure coding course (high-level) for all developers;
✦ Build a technical guidance whitepaper for application security covering the technologies used within the organization;
✦ Create a risk process, perform high-level business risk assessments for the application platforms, and review business risk;
✦ Prepare initial technical guidelines and standards for developers;
✦ Perform short code reviews on application platforms that present significant risk to the organization;
✦ Develop test and use cases for projects and evaluate the cases against the applications;
✦ Appoint a dedicated role for application security initiatives;
✦ Generate a draft strategic roadmap for the next phase of the assurance program.

Due to the limited amount of in-house expertise within VirtualWare, the company engaged a third-party security consulting group to assist with the creation of the training program and with writing the threat modeling approach and strategic roadmap for the organization.

One of the key challenges faced during this phase was getting all 300 developers through a one-day training course. To achieve this, VirtualWare ran 20 course days, with only a small number of developers from each team attending the course at one time. This reduced the overall impact on staff resources during the training period.

During this phase of the project, VirtualWare invested significant resources into the adoption of a risk review process and into reviewing the business risk to the organization. Although considerable effort was focused on these tasks, they were critical to ensuring that the next steps implemented by VirtualWare were in line with the business risks faced by the organization.

VirtualWare management received positive feedback from most developers within the organization on the training program. Although the course was not deeply detailed, developers felt that the initial training provided some basic skills that could immediately assist them day to day in writing secure code.

Implementation Costs

A significant amount of internal resources and costs were invested in this phase of the project. There were three different types of costs associated with this phase.

Internal Resource Requirements
Internal resource effort used in the creation of content, workshops, and review of application security initiatives within this phase. Effort is shown in total days per role.

Training Resource Requirements (training per person for the period)
Each developer within VirtualWare was required to attend a training course, and therefore every developer had a single day allocated to the application security program.

Outsourced Resources
Due to the lack of in-house knowledge within VirtualWare, external resources were used to assist with the creation of content and to create and deliver the training program to the developers.

Internal Resource Requirements (total days per role):
✦ Developer: 14 days
✦ Architect: 10 days
✦ Manager: 8 days
✦ Business Owner: 8 days
✦ QA Tester: 3 days
✦ Security Auditor: 9 days

Training Resource Requirements (per person):
✦ Developer: 1 day

Outsourced Resources:
✦ Consultant (Security): 15 days
✦ Consultant (Training): 22 days

Page 88: Software Assurance Maturity Model


Phase 2 (Months 3–6) – Education & Testing

VirtualWare identified in Phase 1 that a number of their applications contained vulnerabilities that might be exploited by external threats. Therefore, one of the key objectives of this phase was to implement basic testing and review capabilities to identify the vulnerabilities and address them in the code.

The introduction of automated tools to assist with code coverage and finding weaknesses was identified as one of the biggest challenges in this phase of the implementation. In the past, developers had used automated tools with great difficulty, and therefore implementing new tools was seen as a significant challenge.

To ensure a successful rollout of the automation tools within the organization, VirtualWare proceeded with a staged rollout. The tools were given to senior team leaders first, with other developers coming online over a period of time. Teams were encouraged to adopt the tools; however, no formal process was put in place for their use.

This phase of the implementation also saw the introduction of a more formal education and awareness program. Developers from the previous training requested more specific training in the areas of web services and data validation, so a new 6-hour course was developed with these two focus areas. VirtualWare also implemented additional training programs for Architects and Managers and adopted an awareness campaign within the organization.

Target Objectives

During this phase of the project, VirtualWare implemented the following SAMM Practices & Activities.

2SM: A. Classify data and applications based on business risk
     B. Establish and measure per-classification security goals
2EG: A. Conduct role-specific application security training
     B. Utilize security coaches to enhance project teams
1TA: A. Build and maintain application-specific threat models
     B. Develop attacker profile from software architecture
2CR: A. Utilize automated code analysis tools
     B. Integrate code analysis into development process
2ST: A. Utilize automated security testing tools
     B. Integrate security testing into development process
1OE: A. Capture critical security information for deployment
     B. Document procedures for typical application alerts
1DR: A. Identify software attack surface
     B. Analyze design against known security requirements

Page 89: Software Assurance Maturity Model


To achieve these maturity levels, VirtualWare implemented a number of programs during this phase of the roll-out. The following initiatives were adopted:

✦ Additional education and training courses for QA Testers, Managers, and Architects;
✦ Conduct data asset classification and set security goals;
✦ Develop the risk assessment methodology into a threat modeling approach with attack trees and profiles;
✦ Review and identify security requirements per application platform;
✦ Introduce automated tools to assist with code coverage and security analysis of existing applications and new code bases;
✦ Review and enhance existing penetration testing programs;
✦ Enhance the existing software development life-cycle to support security testing as a part of the development process.

VirtualWare adapted the existing application security training program to provide a shorter, less technical version as a Business Application Security awareness program. This four-hour course was extended to Managers and Business Owners across the organization.

A high-level review of the existing code review and penetration testing programs identified that the process was inadequate and needed to be enhanced to provide better testing and better results on application security vulnerabilities. The team set out to implement a new program of penetration testing and code reviews. As part of this program, each senior developer on a project team was allocated approximately four days to perform a high-level source code review of their application.

VirtualWare management understood that the infrastructure and applications are tightly integrated, so during this phase the operational side of the application platforms (the infrastructure) was reviewed. This review covered the infrastructure requirements and the integration points between the recommended deployment hardware and the application interfaces.

During this phase the project team reviewed the strategic roadmap and methodology for application security. The objective of this review and update was to formally classify data assets and set the appropriate level of business risk associated with those data assets and applications. From this, the project team was able to set security goals for these applications.

Implementation Costs

A significant amount of internal resources and cost was invested in this phase of the project. There were three different types of costs associated with this phase.

Internal Resource Requirements
Internal resource effort used in the creation of content, workshops, and review of application security initiatives within this phase. Effort is shown in total days per role.

Training Resource Requirements (training per person for the period)
Additional personnel within VirtualWare were required to attend a training course, and therefore several roles had time allocated to training on application security.

Outsourced Resources
Due to the lack of in-house knowledge within VirtualWare, external resources were used to assist with the creation of content and to create and deliver the training program to the developers.

Internal Resource Requirements:
Developer: 8 days
Architect: 10 days
Manager: 8 days
Business Owner: 5 days
QA Tester: 3 days
Security Auditor: 15 days
Support Operations: 2 days

Training Resource Requirements (per person):
Manager: 1/2 day
Architect: 1 day
Business Owner: 1/2 day

Outsourced Resources:
Consultant (Security): 22 days
Consultant (Training): 5 days


Phase 3 (Months 6 – 9) – Architecture & Infrastructure

The third phase of the assurance program implementation within VirtualWare builds on the previous implementation phases and focuses on risk modeling, architecture, infrastructure, and operational enablement capabilities.

The key challenge in this phase was establishing tighter integration between the application platforms and the operational side of the organization. In the previous phase, VirtualWare teams were introduced to vulnerability management and the operational side of application security. During this phase, VirtualWare took these areas further and introduced clear incident response processes and detailed change control procedures.

VirtualWare chose to start two new areas in this phase. Although VirtualWare itself is not subject to regulatory compliance, a number of its customers had started to ask whether the platforms could assist them in meeting regulatory requirements. A small team was set up within VirtualWare to identify the relevant compliance drivers and create a checklist of them.

In the previous phase, VirtualWare introduced a number of new automated tools to assist with the review and identification of vulnerabilities. Although not a focus of this phase, the development teams adopted the new tools and reported that they were starting to benefit from using them within their groups.

Target Objectives

During this phase of the project, VirtualWare implemented the following SAMM Practices & Activities.

✦ TA2: A. Build and maintain abuse-case models per project; B. Adopt a weighting system for measurement of threats
✦ SA1: A. Maintain list of recommended software frameworks; B. Explicitly apply security principles to design
✦ VM2: A. Establish consistent incident response process; B. Adopt a security issue disclosure process


✦ PC1: A. Identify and monitor external compliance drivers; B. Build and maintain compliance guidelines
✦ SR2: A. Build an access control matrix for resources and capabilities; B. Specify security requirements based on known risks
✦ DR2: A. Inspect for complete provision of security mechanisms; B. Deploy design review service for project teams
✦ OE2: A. Create per-release change management procedures; B. Maintain formal operational security guides


To achieve these maturity levels, VirtualWare implemented a number of programs during this phase of the roll-out. The following initiatives were adopted:

✦ Define and publish technical guidance on security requirements and secure architecture for projects within the organization;
✦ Identify and document compliance and regulatory requirements;
✦ Identify and create guidelines for security of the application infrastructure;
✦ Create a defined list of approved development frameworks;
✦ Enhance the existing threat modeling process used within VirtualWare;
✦ Adopt an incident response plan and prepare a security disclosure process;
✦ Introduce change management procedures and formal guidelines for all projects.

To coincide with the introduction of automated tools for developers in the previous phase, formal technical guidance on secure coding was introduced into the organization. These were specific technical documents, organized by language and technology, providing guidance on secure coding techniques for each relevant language and application.
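As an illustration of the kind of language-specific content such guidance typically contains, the sketch below contrasts an unsafe and a safe way to build a database query in Python, one of the most common items in secure coding guidance (preventing SQL injection through parameterized queries). The code is a generic example, not taken from VirtualWare's documents; the table and column names are hypothetical.

```python
import sqlite3  # stand-in for whichever database driver a project uses


def find_user_insecure(conn: sqlite3.Connection, username: str):
    # Anti-pattern the guidance warns against: user input concatenated
    # directly into SQL, allowing injection via a crafted username.
    query = "SELECT id, email FROM users WHERE username = '" + username + "'"
    return conn.execute(query).fetchall()


def find_user_secure(conn: sqlite3.Connection, username: str):
    # Recommended pattern: a parameterized query; the driver handles
    # escaping, so the input is treated strictly as data.
    query = "SELECT id, email FROM users WHERE username = ?"
    return conn.execute(query, (username,)).fetchall()
```

Comparable do/don't examples for input validation, output encoding, and the other technologies in use would round out the per-language guidance documents.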

Through the combination of the education and awareness programs, the technical guidance, and the introduction of automation tools to help developers, VirtualWare started to see a visible difference in the code being delivered into production versions of its applications. Developers provided positive feedback on the tools and education made available to them under the program.

For the first time at VirtualWare, project teams became responsible for the security and design of their application platforms. During this phase, each team performed a formal review and validation against best practices. Some teams identified gaps relating to both security and business design that needed to be reviewed, and a formal plan was put in place to ensure these gaps were addressed.

A formal incident response plan and change management procedures were introduced during this phase of the project. These were difficult to implement, and VirtualWare teams initially struggled because the impact on the culture and the operational side of the business was significant. Over time, however, team members recognized the value of the new processes, and the changes were accepted over the implementation period.

Implementation Costs

A significant amount of internal resources and cost was invested in this phase of the project. There were two different types of costs associated with this phase.

Internal Resource Requirements
Internal resource effort used in the creation of content, workshops, and review of application security initiatives within this phase. Effort is shown in total days per role.

Outsourced Resources
Due to the lack of in-house knowledge within VirtualWare, external resources were used to assist with the creation of content, to create and deliver the processes and guidelines, and to assist the teams.

Internal Resource Requirements:
Developer: 5 days
Architect: 7 days
Manager: 9 days
Business Owner: 6 days
Security Auditor: 10 days
Support Operations: 3 days

Outsourced Resources:
Consultant (Security): 20 days



Phase 4 (Months 9 – 12) – Governance & Operational Security

The fourth phase of the assurance program implementation within VirtualWare continues from the previous phases by enhancing existing security functions within the organization. By now VirtualWare has implemented a number of critical application security processes and mechanisms to ensure that applications are developed and maintained securely.

A core focus in this phase is bolstering the Governance business function. Its three security practices (Strategy & Metrics, Policy & Compliance, and Education & Guidance) play a critical role in the foundation of an effective long-term application security strategy. The education program is completed, while at the same time a long-term strategic roadmap is put in place for VirtualWare.

The other key focus within this phase is the operational side of the implementation. VirtualWare management had previously identified that incident response plans and dedicated change management processes are critical to the long-term strategy.

VirtualWare saw this phase as a stepping stone to its long-term future. The organization implemented a number of final measures to cement the building blocks laid down in the previous phases. In the long term this will ensure that the processes, concepts, and controls put in place continue to work within the organization and deliver the most secure outcome for its application platforms.

VirtualWare also chose this phase to introduce customers to its new application security initiatives, providing a series of programs covering application security, deploying applications securely, and reporting vulnerabilities in VirtualWare applications. The key goal of these programs is to instill confidence in the customer base that VirtualWare applications are built with security in mind, and that VirtualWare can assist customers in ensuring that application environments built on its technology are secure.

Target Objectives

During this phase of the project, VirtualWare implemented the following SAMM Practices & Activities.

✦ SM3: A. Conduct periodic industry-wide cost comparisons; B. Collect metrics for historic security spend
✦ PC2: A. Build policies and standards for security and compliance; B. Establish project audit practice
✦ EG3: A. Create formal application security support portal; B. Establish role-based examination/certification
✦ SR3: A. Build security requirements into supplier agreements; B. Expand audit program for security requirements


✦ CR3: A. Customize code analysis for application-specific concerns; B. Establish release gates for code review
✦ VM3: A. Conduct root cause analysis for incidents; B. Collect per-incident metrics
✦ OE3: A. Expand audit program for operational information; B. Perform code signing for application components


To achieve these maturity levels, VirtualWare implemented a number of programs during this phase of the roll-out. The following initiatives were adopted:

✦ Create well-defined security requirements and a testing program for all projects;
✦ Create and implement an incident response plan;
✦ Review the existing alerts procedure for applications and document a process for capturing events;
✦ Create a customer security white-paper on deploying applications securely;
✦ Review existing security spend within projects and determine whether an appropriate budget has been allocated to each project for security;
✦ Implement the final education and awareness programs for application roles;
✦ Complete a long-term application security strategy roadmap for the organization.

In previous phases VirtualWare had released a formal process through which customers could submit vulnerabilities found in its code. During this phase, VirtualWare took the submitted vulnerabilities, conducted assessments of why and how each problem occurred, and produced a series of reports to determine whether any common themes could be identified amongst the reported vulnerabilities.

As part of the ongoing effort to ensure applications are deployed securely both internally and on customer networks, VirtualWare created a series of white-papers for customers, based on industry standards, recommending environment hardening measures. The purpose of these guidelines is to assist customers with the best approach to deploying their applications.
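To illustrate how such hardening guidance can be made actionable, the sketch below shows a minimal check script that compares a deployment's settings against a few recommended values. The setting names, expected values, and rationales are hypothetical examples, not items taken from VirtualWare's white-papers.

```python
"""Minimal sketch of an automatable deployment-hardening check of the kind
a customer-facing hardening white-paper might recommend. Setting names and
expected values are hypothetical examples."""

# Hypothetical deployment settings as they might be loaded from a config file.
deployment_config = {
    "debug_mode": False,
    "tls_min_version": "1.2",
    "default_admin_account_enabled": False,
    "error_detail_level": "minimal",
}

# Each check: (setting, expected value, rationale).
HARDENING_CHECKS = [
    ("debug_mode", False, "debug output can leak internals to attackers"),
    ("tls_min_version", "1.2", "older TLS versions have known weaknesses"),
    ("default_admin_account_enabled", False, "default accounts are a common attack path"),
    ("error_detail_level", "minimal", "verbose errors can disclose implementation details"),
]


def run_checks(config):
    """Return a list of human-readable findings for settings that deviate."""
    findings = []
    for setting, expected, rationale in HARDENING_CHECKS:
        actual = config.get(setting)
        if actual != expected:
            findings.append(f"{setting}: expected {expected!r}, found {actual!r} ({rationale})")
    return findings


if __name__ == "__main__":
    for line in run_checks(deployment_config) or ["All hardening checks passed."]:
        print(line)
```

Customers could run a script like this, or an equivalent manual checklist, as part of their own deployment process to verify that the recommended hardening measures are actually applied.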

During this phase, VirtualWare implemented a short computer-based training module so that existing and new developers could maintain their application security skills. It was also mandated that all application-associated roles undertake a one-day course per year. This was done to ensure that the skills given to developers were not lost and that new developers would be upskilled during their time with the company.

One of the final activities within VirtualWare was to complete an "as is" gap assessment and review to determine how effective the past twelve months had been. During this short program, questionnaires were sent to all team members involved and a baseline review against SAMM was performed. The weaknesses and strengths identified during this review were documented in the final strategic roadmap for the organization, and the strategy for the next twelve months was set for VirtualWare.

Implementation Costs

A significant amount of internal resources and cost was invested in this phase of the project. There were two different types of costs associated with this phase.

Internal Resource Requirements
Internal resource effort used in the creation of content, workshops, and review of application security initiatives within this phase. Effort is shown in total days per role.

Outsourced Resources
Due to the lack of in-house knowledge within VirtualWare, external resources were used to assist with the implementation of this phase, including documentation, processes, and workshops.

Internal Resource Requirements:
Developer: 4 days
Architect: 7 days
Manager: 9 days
Business Owner: 6 days
QA Tester: 1 day
Security Auditor: 11 days

Outsourced Resources:
Consultant (Security): 22 days


Ongoing (Months 12+)

Over the past twelve months VirtualWare has progressed from implementing a number of training and education programs to developing internal guidelines and policies. In the final phase of the assurance program implementation, VirtualWare began to publish externally and work with its customers to enhance the security of their application platforms.

VirtualWare management's original mandate was to ensure that software developed within the company was secure, to make the market aware of the security initiatives taken, and to assist customers in securing their application platforms.

To achieve these management goals, the first twelve months set the path for an effective strategy within VirtualWare, culminating in the company starting to assist customers in securing their application environments. Moving forward, VirtualWare has set a number of initiatives within the organization to ensure that the company does not fall back into its old habits. Some of these programs include:

✦ Business Owners and Team Leaders are aware of the risk associated with their applications and are required to sign off on applications before release;
✦ Team Leaders now require all applications to formally go through the security process, and code reviews are performed weekly by developers;
✦ Ongoing yearly training and education programs (including CBT) are provided to all project staff, and developers are required to attend a course at least once a year;
✦ A dedicated Team Leader for Application Security has been appointed and is now responsible for customer communications and for customer technical papers and guidelines.

Going forward, VirtualWare now has a culture in which security is part of its software development life-cycle, helping to ensure that applications developed and provided to customers are secure and robust. An effective process has been put in place through which vulnerabilities can be reported and handled by the organization when required.

During the final implementation phase, a project gap assessment was performed to identify any weaknesses that appeared during the implementation. In particular, due to high staff turnover, VirtualWare needed to constantly train new developers as they joined the organization. A key objective set to address this problem was an induction program introduced specifically for developers, so that they receive formal security training when they start with the organization. This will also help to create the mindset that security is important within the organization and its development team.



Maturity Scorecard

The maturity scorecard was completed as a self-assessment during the implementation of the software assurance program by VirtualWare. The final scorecard (shown below) represents the status of VirtualWare at the time it began and at the time it finished its four-phase improvement project.

Security Practice: before / after
Strategy & Metrics: 1 / 3
Policy & Compliance: 1 / 2
Education & Guidance: 0+ / 3
Threat Assessment: 0 / 3
Security Requirements: 0+ / 3
Secure Architecture: 0 / 1
Design Review: 0+ / 2
Code Review: 0+ / 3
Security Testing: 1 / 2
Vulnerability Management: 1 / 3
Environment Hardening: 0+ / 0+
Operational Enablement: 0+ / 3


Author & Project Lead

Pravir Chandra

Contributors / Reviewers

Fabio Arciniegas, Matt Bartoldus, Sebastien Deleersnyder, Jonathan Carter, Darren Challey, Brian Chess, Dinis Cruz, Justin Derry, Bart De Win, James McGovern, Matteo Meucci, Jeff Payne, Gunnar Peterson, Jeff Piper, Andy Steingruebl, John Steven, Chad Thunberg, Colin Watson, Jeff Williams


Supporters

Thanks to the following organizations for helping review and support the SAMM Project. Note: OWASP and the SAMM Project do not endorse any commercial products or services.

For the latest version and additional info, please see the project web site at http://www.opensamm.org

Sponsors

Thanks to the following organizations that have made significant contributions to the SAMM Project.

License

This work is licensed under the Creative Commons Attribution-Share Alike 3.0 License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.