A Security Practice Evaluation Framework

Science of Security Lablet: Security Metrics-Driven Evaluation, Design, Development, & Deployment

Patrick Morrison
Advisor: Dr. Laurie Williams, North Carolina State University


TRANSCRIPT

Page 1: A Security Practice Evaluation Framework

Science of Security Lablet

Security Metrics-Driven Evaluation, Design, Development, & Deployment

A Security Practice Evaluation Framework

Patrick Morrison

Advisor: Dr. Laurie Williams North Carolina State University

Page 2: A Security Practice Evaluation Framework

Page 3: A Security Practice Evaluation Framework

• Verification
• Threat Modeling
• Use Least Privilege
• Implement Sandboxing
• Minimize Use of Unsafe String and Buffer Functions
• Validate Input and Output to Mitigate Common Vulnerabilities
• Use Robust Integer Operations for Dynamic Memory
• Use Canonical Data Formats
• Avoid String Concatenation for Dynamic SQL Statements
• Eliminate Weak Cryptography
• Use Logging and Tracing
• Use Appropriate Testing Tools
• Perform Fuzz / Robustness Testing
• Perform Penetration Testing
• Team Training
• Public design discussions, bug reports
• Security Feature Review
• Risk Questionnaire
• Implementation
• Use Code Analysis Tools
• Customize tactics to current project
• Private vulnerability reporting
• project team develops and confirms fix
• Enforce Coding Standards
• Top N Bugs List
• Testing
• Use fuzzing tools
• recheck all previous security assertions
• Apply test suites, load testing
• run external marketing program
• host external software security events
• create top N bugs list (real data preferred) (T: training)
• have SSG perform ad hoc review
• use automated tools along with manual review
• use automated tools with tailored rules
• build capability for eradicating specific bugs from entire codebase
• This is a placeholder, just to see if you’re looking.
• make code review mandatory for all projects
• perform security feature review
• define/use AA process
• perform design review for high-risk applications
• have SSG lead review efforts
• standardize architectural descriptions (include data flow)
• Educate
• educate executives
• provide awareness training
• hold satellite training/events
• create security standards (T: sec features/design)
• create security portal
• promote executive awareness of compliance/privacy obligations
• offer role-specific advanced curriculum (tools, technology stacks, bug parade)
• create/use material specific to company history
• offer on-demand individual training
• provide training for vendors or outsource workers
• require annual refresher
• make SSG available as AA resource/mentor
• have software architects lead review efforts
• build internal forum to discuss attacks (T: standards/req)
• identify open source in apps
• identify software bugs found in ops monitoring and feed back to dev
• use application input monitoring
• use application behavior monitoring and diagnostics
• publish installation guides created by SSDL
• use code signing
• use code protection
• drive feedback from SSDL data back to policy (T: strategy/metrics)
• create secure coding standards
• create a standards review board
• implement/track controls for compliance
• enforce coding standards
• identify PII data in systems (inventory)
• publish data about software security internally
• identify gate locations, gather necessary artifacts
• identify metrics and drive initiative budgets with them
• require security sign-off
• use internal tracking application with portfolio view
• create standards for technology stacks
• translate compliance constraints to requirements
• paper all vendor contracts with SLAs compatible with policy
• impose policy on vendors
• communicate standards to vendors
• gain buy-in from legal department and standardize approach
• ensure QA supports edge/boundary value condition testing
• share security results with QA
• allow declarative security/security features to drive tests
• use external pen testers to find problems
• integrate black box security tools into the QA process (including protocol fuzzing)
• feed results to defect management/mitigation (T: config/vuln mgmt)
• use pen testing tools internally
• include security tests in QA automation
• have a science team that develops new attack methods
• arm testers and auditors
• create and use automation to do what the attackers will do
• to application APIs
• drive tests with risk analysis results
• leverage coverage analysis
• patterns
• control open source risk
• create/interface with incident response
• have emergency codebase response
• fix all occurrences of software bugs from ops in the codebase (T: code review)
• enhance dev processes (SSDL) to prevent cause of software bugs found in ops
• track software bugs found during ops through the fix process

Links: www.bsimm.com, www.opensamm.org
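One practice from the list above, "Avoid String Concatenation for Dynamic SQL Statements", is easy to demonstrate concretely. The sketch below uses Python's sqlite3 module with a made-up users table; the table, names, and values are illustrative, not from the poster:

```python
import sqlite3

# Hypothetical user-lookup table, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # String concatenation: input like "x' OR '1'='1" rewrites the query itself.
    return conn.execute(
        "SELECT role FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(name):
    # Parameterized query: the driver treats name strictly as data.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

attack = "x' OR '1'='1"
print(find_user_unsafe(attack))  # leaks every row
print(find_user_safe(attack))    # returns no rows
```

The concatenated version lets the input rewrite the WHERE clause; the parameterized version hands the same string to the driver as data, so it matches nothing.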

Page 4: A Security Practice Evaluation Framework

“Is Software Security a Waste of Time?”*

• “For most companies it’s going to be far cheaper and serve their customers a lot better if they don’t do anything about [security bugs] until something happens” – John Viega, SilverSky

• “An Exploit that works against Reader or Flash puts more than a billion computers at risk.” – Brad Arkin, Adobe

• * - RSA 2013 Panel Title

http://securitywatch.pcmag.com/none/308760-rsa-is-software-security-a-waste-of-time

Page 5: A Security Practice Evaluation Framework

Measurement Supports Management

• Goal: Empirically measure the relationship between secure development practices and security outcomes

• Uses:
  – Correlate practices and outcomes
  – Advise practice selection

• Alternate Goal Statement: Front Page Avoidance
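The goal of correlating practices and outcomes can be sketched as a toy computation. Everything below is hypothetical: the per-project adherence scores and vulnerability counts are invented solely to show the shape of the analysis, not real measurements from the framework:

```python
import math

# Hypothetical per-project data: practice-adherence score (0-10) and
# vulnerabilities reported the following year. Values are made up.
adherence = [2, 4, 5, 7, 8, 9]
vulns = [30, 22, 20, 11, 9, 6]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(adherence, vulns)
print(f"correlation between adherence and vulnerability count: {r:.2f}")
```

A strongly negative coefficient on real data would be the kind of evidence the framework is after: higher practice adherence associated with fewer reported vulnerabilities.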

Page 6: A Security Practice Evaluation Framework

”[…] became public […] because of a programming error that […] sent it around the world on the Internet. Computer security experts […] gave it a name: Stuxnet.”

Page 7: A Security Practice Evaluation Framework

Security Outcome: CVE-2008-4250

Page 8: A Security Practice Evaluation Framework

Software Release

• Netapi32.dll
• Developed over the '80s and '90s
• TCP/IP connection added in 2000

http://support.microsoft.com/kb/958644

Page 9: A Security Practice Evaluation Framework

Context Factors

Source: http://www.ohloh.net/p/samba

Page 10: A Security Practice Evaluation Framework

Tactics

• “Public Enemy Number 1: The Buffer Overrun” – “Writing Secure Code”, Howard and LeBlanc

• “Minimize Use of Unsafe String and Buffer Functions” – SAFECode practice recommendations

• “Create secure coding standards” – BSIMM
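A tactic like "create secure coding standards" is usually paired with automated enforcement. As an illustration, here is a toy Python checker that flags a few of the unsafe C string functions the SAFECode practice targets; the banned list and the sample source are assumptions made for this example, not part of any cited standard:

```python
import re

# Toy secure-coding-standard check: flag calls to a small, illustrative
# subset of the unsafe C string/buffer functions.
BANNED = re.compile(r"\b(strcpy|strcat|sprintf|gets)\s*\(")

def check(source):
    """Return (line_number, function_name) for each banned call found."""
    return [(i, m.group(1))
            for i, line in enumerate(source.splitlines(), 1)
            if (m := BANNED.search(line))]

code = ('char buf[16];\n'
        'strcpy(buf, user_input);\n'
        'snprintf(buf, sizeof buf, "%s", user_input);\n')
print(check(code))  # flags only the strcpy on line 2
```

Note that the bounded snprintf call passes, while the unbounded strcpy is flagged; a real standard would pair such a checker with manual review, as the BSIMM activities on the earlier slide suggest.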

Page 11: A Security Practice Evaluation Framework

Adherence Issue

• Originally, netapi32.dll’s buffer overflows were ‘just bugs’

• When connected to every computer in the world, they became security vulnerabilities

• Optional debugging practices became mandatory security practices

Page 12: A Security Practice Evaluation Framework

Example Summary

Page 13: A Security Practice Evaluation Framework

Strategy

Microsoft Security Development Lifecycle (SDL)

http://msdn.microsoft.com/en-us/library/cc307406

Page 14: A Security Practice Evaluation Framework

“Only amateurs attack machines; professionals target people.” – Bruce Schneier http://www.schneier.com/crypto-gram-0010.html

Page 15: A Security Practice Evaluation Framework

Context Factors

Source http://commons.wikimedia.org/wiki/File:T


http://www.learnersdictionary.com/search/fishing

Page 16: A Security Practice Evaluation Framework

Practice Adherence

http://www.syslog.com/~jwilson/pics-i-like/kurios119.jpg

Page 17: A Security Practice Evaluation Framework

Security Outcome

http://www.amazon.com/NEVER-MIND-BEWARE-OWNER-Plastic/dp/B0040BPJ60

Page 18: A Security Practice Evaluation Framework

Next Steps

• Complete first draft of Security Practices Evaluation Framework

• Develop survey for investigating practice adherence in development teams

• Work with industrial partner(s) to apply the survey and analyze the results
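As a sketch of what scoring such an adherence survey might look like (the practice names, Likert scale, and equal weighting are illustrative assumptions, not the actual SP-EF instrument):

```python
# Hypothetical survey scoring: map Likert responses to points and
# rescale to a 0-100 adherence score. All names/weights are invented.
LIKERT = {"never": 0, "rarely": 1, "sometimes": 2, "often": 3, "always": 4}

def adherence_score(responses):
    """Average the Likert responses and rescale to 0-100."""
    points = [LIKERT[answer] for answer in responses.values()]
    return 100 * sum(points) / (4 * len(points))

team = {
    "threat modeling": "sometimes",
    "code analysis tools": "always",
    "penetration testing": "rarely",
    "coding standards enforcement": "often",
}
print(f"adherence: {adherence_score(team):.1f}/100")
```

A single number like this could then be paired with outcome measures (e.g., vulnerability counts) per project, which is the correlation the framework's goal statement describes.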

Page 19: A Security Practice Evaluation Framework

We’d like to find…

• Organizations and teams willing to take our practice adherence survey

• Organizations willing to do case studies applying SP-EF to one or more projects

Page 20: A Security Practice Evaluation Framework

We can offer…

• Survey results
• Support for building a feedback loop around security practices and results for your organization

Page 21: A Security Practice Evaluation Framework

Come See The Poster