Criticising the Common Criteria Argument
Patrick Graydon
A Contribution to the Development of the Common Criteria
A long time ago . . .
• Assurance Based Development of a shadow implementation of Tokeneer
  – Tokeneer developed for Common Criteria evaluation at EAL 5
  – Work raised some criticisms of the standard
  – No point in criticising 2.1 when 3.1 is current
07/02/2012 3 Criticising the Common Criteria Argument
Evaluating Standards
• Question: Can you evaluate a standard by extracting and criticising its argument?
  – Requirements form evidence
  – Explanation of requirements explains evidence
• Case study target: The Common Criteria for Information Technology Security Evaluation
  – A major international security standard
The Standard
Criticising the Standard's Argument
Requirements (Elements)
Explanations (Objectives)
Is the evidence adequately explained?
Is the security evidence sufficient?
The Common Criteria Standard
• The standard comprises three parts:
  Part 1: Introduction and general model
  Part 2: Security functional components
  Part 3: Security assurance components
• This trilogy has a Part 4 (sort of): The Common Methodology for Information Technology Security Evaluation
Security Targets and Evaluation
[Diagram: a Security Target defines the Security Problem (assets, threats, etc.), Functional Requirements (from the Part 2 templates), and Assurance Requirements (from the Part 3 templates); the Security Target undergoes ST Evaluation and the Target of Evaluation (TOE) undergoes TOE Evaluation.]
Capturing the Argument
[Diagram: the argument is captured from Part 3's classes, families, and elements, with their objectives, and from Part 1's objectives and general model.]
Capturing the Elements
Evidence: development security documentation

Developer action element: ALC_DVS.2.1D The developer shall produce and provide development security documentation.

Content and presentation elements: ALC_DVS.2.1C The development security documentation shall describe all the physical, procedural, personnel, and other security measures that are necessary to protect the confidentiality and integrity of the TOE design and implementation in its development environment. . . .

Claim: the development security policies are complete (ALC_DVS.1.1C)

Definition: development security policies are complete if they describe all the physical, procedural, personnel, and other security measures that are necessary to protect the confidentiality and integrity of the TOE design and implementation in its development environment
Capturing the Elements
Evidence: the evaluators' report on site visit(s) (reflects application notes)

Evaluator action elements: ALC_DVS.2.1E The evaluator shall confirm that the information provided meets all requirements for content and presentation of evidence. ALC_DVS.2.2E The evaluator shall confirm that the security measures are being applied.

Claim: the development security procedures are being applied (ALC_DVS.1.2E)

Not explicitly modelled in the argument (corresponds to confirming that the argument reflects the real world)
Capturing the Objectives

Objectives: Development security is concerned with physical, procedural, personnel, and other security measures that may be used in the development environment to protect the TOE and its parts. It includes the physical security of the development location and any procedures used to select development staff.

Application notes: This family deals with measures to remove or reduce threats existing at the developer's site. . . . The evaluator should visit the site(s) in order to assess evidence for development security. . . . Any decision not to visit shall be agreed with the evaluation authority. . . . It is recognised that confidentiality may not always be an issue for the protection of the TOE in its development environment. The use of the word "necessary" allows for the selection of appropriate safeguards.
Not an objective!
Measures to remove or reduce threats existing at the developer's site are adequate (ALC_DVS, EALs 3–7)
Evaluation Assurance Levels
• Standard defines 7 EALs
  – Standard levels of assurance
  – Each a package of SARs
  – Each builds on the one below
• We modelled all 7 EALs
  – Modelled only families that appear in an EAL (did not model ALC_FLR ["flaw remediation"])
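The cumulative "package" structure described above can be sketched in a few lines. This is a hypothetical illustration: the family-to-component assignments below are invented, not the CC's actual EAL tables.

```python
# Sketch of EALs as cumulative packages of Security Assurance Requirements (SARs).
# The family/component numbers below are invented for illustration; they are NOT
# the actual assignments from CC Part 3.

def build_eals(increments):
    """Each EAL's package is the previous package plus its increments; a higher
    component (e.g. ALC_DVS.2) replaces a lower one from the same family."""
    eals, package = {}, {}
    for level in sorted(increments):
        package = {**package, **increments[level]}
        eals[level] = dict(package)
    return eals

# Hypothetical increments for three levels:
increments = {
    1: {"ADV_FSP": 1, "ATE_IND": 1},
    2: {"ADV_FSP": 2, "ATE_COV": 1, "ALC_DVS": 1},
    3: {"ALC_DVS": 2, "ATE_COV": 2},
}

eals = build_eals(increments)
# EAL 3 contains every family introduced below it, at the highest component level:
print(eals[3])
```

This mirrors the slide's point that each EAL is not a free-standing list but a strict extension of the one below it.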
The Captured Argument
• Not a security argument
  – No system-specific detail
  – Argues over (not through) threats and SFRs
• Not a full rationale
  – No evidence of technique adequacy
  – No mention of trade-offs made
Reviewing the Argument
• Phased process [1], i.e. one family's worth at a time
• Looked for:
  – Vagueness or potential misinterpretation
  – Unreasonable assumptions (implicit or explicit)
  – Logical fallacies (from taxonomy [2])
  – Omission of expected practice
  – Insufficient evidence
1. P. Graydon, J. Knight, and M. Green, "Certification and safety cases", Proc. of the 28th International Systems Safety Conference (ISSC), 2010
2. W. Greenwell, J. Knight, C. M. Holloway, and J. J. Pease, "A taxonomy of fallacies in system safety arguments", Proc. of the 24th International System Safety Conference (ISSC), 2006
Example Criticism
[Diagram: the claim "The security requirements are sufficient" is supported by the ST's security requirements rationale via three sub-claims:
• "Each SFR is traced back to the security objectives for the TOE" (criticism: shows reliance, not sufficiency)
• "The SFRs meet all security objectives for the TOE" (criticism: 'adequately' meet?)
• "The reason why the SARs were chosen has been explained" (criticism: does the explanation show adequacy?)]
Sentencing the Issues
• First, check the standard: did we introduce the issue in the process of argument capture?
  – 52 issues were created during argument capture
  – 121 issues are traceable to the standard
Further Classification
Classification of the 121 issues by impact (L = low, M = medium, H = high):

                                        L    M    H   Total
Missing or inadequate evidence         21   34    5     60
Misleading or inadequate explanation   18    7    3     28
Other (mostly vagueness)               22    7    4     33
Total                                  61   48   12    121
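The row and column totals in the classification table above are internally consistent, which is easy to check mechanically:

```python
# The issue counts from the classification table above (Low/Medium/High impact).
rows = {
    "Missing or inadequate evidence":       [21, 34, 5],
    "Misleading or inadequate explanation": [18, 7, 3],
    "Other (mostly vagueness)":             [22, 7, 4],
}

# Row totals per issue type, column totals per impact level, and the grand total.
row_totals = {name: sum(cells) for name, cells in rows.items()}
column_totals = [sum(col) for col in zip(*rows.values())]
grand_total = sum(row_totals.values())

print(row_totals)      # 60, 28, 33
print(column_totals)   # [61, 48, 12]
print(grand_total)     # 121
```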
Missing or Inadequate Evidence
• Either:
  – The standard's argument omits a premise, or
  – Evidence that could practicably increase confidence is not demanded
• We use the As Confident As Reasonably Practicable (ACARP) principle
System Level Testing Against SFRs

[Diagram: the chain Security Problem → Security Objectives → Functional Requirements → Functional Specification → Design Description → Implementation Representation (e.g. source code) → Implementation, with Functional Testing applied.]

"An analysis of test coverage demonstrates the correspondence between the tests . . . and the TSFIs"

"An analysis of test depth shows all subsystems have been tested"

Criticism: ADV_FSP requires a tracing (not proof of refinement)
Adequacy of Structural Coverage

[Diagram: Functional Requirements → Functional Specification → Design Description → Implementation Representation → Implementation, with Functional Testing applied.]

"An analysis of test coverage demonstrates the correspondence between the tests . . . and the TSFIs"

"An analysis of test depth shows all subsystems have been tested" (and, at EAL 7, that "the TSF operates in accordance with its implementation representation")

Criticism: where is the evidence of structural (code) coverage?
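The gap the slide points at, interface-level coverage without structural coverage, can be shown with a toy example. All names and logic here are invented for illustration; this is not code from the CC or from any evaluated TOE.

```python
# Sketch: a test suite can exercise every interface (TSFI) without exercising
# every code path. check_pin is a toy interface that records its branches.

branches_hit = set()

def check_pin(pin, attempts):
    """Toy TSFI; records which branch each call exercises."""
    if attempts >= 3:
        branches_hit.add("lockout")
        return "locked"
    if pin == "1234":
        branches_hit.add("grant")
        return "granted"
    branches_hit.add("deny")
    return "denied"

# A suite that calls the interface with valid and invalid PINs -- enough to
# show "correspondence between the tests and the TSFIs" -- but that never
# reaches the lockout path:
for pin, attempts in [("1234", 0), ("0000", 0)]:
    check_pin(pin, attempts)

missed = {"lockout", "grant", "deny"} - branches_hit
print(missed)  # the lockout branch is untested despite full interface coverage
```

Structural (code) coverage measurement would flag the missed branch; interface-level coverage analysis alone does not.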
Adequacy of Proof Process

[Diagram: Functional Requirements → Functional Specification → Design Description → Implementation Representation; refinement must be proved at EAL 7.]

"The proof of correspondence between the formal specifications of the TSF subsystems [given in the design documentation] and of [sic] the functional specification shall demonstrate that all behaviour described in the TOE design is a correct and complete refinement of the TSFI that invoked it." (EAL 7)

• The proof process determines the resulting confidence:
  – Machine checked?
  – Translated? By hand?
  – Quality of tools?
• The CC does not specify
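As a toy illustration of the "machine checked?" question: a proof assistant can discharge even a trivial refinement claim mechanically, leaving no room for hand-translation errors. This is a minimal Lean 4 sketch, not CC notation; `spec` and `impl` are invented stand-ins for a formal functional specification and a design-level description.

```lean
-- Toy "functional specification" and "design-level implementation".
def spec (n : Nat) : Nat := 2 * n
def impl (n : Nat) : Nat := n + n

-- A machine-checked correspondence proof: impl computes exactly what spec demands.
theorem impl_refines_spec (n : Nat) : impl n = spec n := by
  unfold impl spec
  omega
```

A hand-written or hand-translated proof of the same claim would carry exactly the confidence questions the slide lists; a mechanically checked one does not.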
Correctness of Security Policy Model

[Diagram: Security Objectives → Functional Requirements → Functional Specification → Design Description, alongside a (formal) Security Policy Model; a formal proof of correspondence links the model and any formal functional specification.]

Criticism: where is the evidence that the model's definition of "secure" is appropriate? A dashed line is shown in Fig. 10 of Part 3, but . . .

"Throughout the design, implementation, and review processes, the modelled security requirements may be used as precise design and implementation guidance . . ."
Sufficiency of SARs

[Diagram: the Security Target's Security Assurance Requirements are instantiated from the Part 3 templates and checked during ST Evaluation.]

• We modelled the EALs' SARs, but . . .
• The security requirements rationale must "explain why the SARs were chosen"
• CEM: "Any explanation is correct so long as it is coherent and neither the SARs nor the explanation have obvious inconsistencies with the remainder of the ST"

Criticism: no evidence that the SARs are sufficient
Clarity of the Security Problem

[Diagram: the Security Target's Security Problem Definition and Security Objectives are checked during ST Evaluation.]

ADV_SPD objective: "Evaluation of the security problem definition is required to demonstrate that the security problem intended to be addressed by the TOE and its operational environment, [sic] is clearly defined."

ADV_SPD elements: "The security problem definition shall describe the assumptions about the operational environment of the TOE." . . .

Criticism: no evidence that the description is clear
Appropriateness of Architecture
Security Architecture

[Diagram: the security architecture description is examined during TOE Evaluation.]

ADV_ARC objective: "The security architecture descriptions supports the implicit claim that security analysis of the TOE can be achieved by examining the TSF; without a sound architecture, the entire TOE functionality would have to be examined."

ADV_ARC elements: the evaluator must confirm that the architecture description "demonstrates" that the TSF:
• "protects itself from tampering"
• "prevents bypass of the SFR-enforcing functionality"

Criticisms: better evidence could be provided. How good is good enough? How does the evaluator decide? Must we follow "best practice"?
TOE Generated From Given Source

[Diagram: Implementation Representation (e.g. source code) → Implementation, with Functional Testing applied.]
[Diagram: the CM plan, CM system, and production support procedures link the implementation representation to the test executable(s).]

• At EALs 4–7, must "define the TSF to a level of detail such that the TSF can be generated without further design decisions"
• At EALs 4–7, must "support the production of the TOE by automated means"
• At EALs 3–7, must "describe how the CM system is used for the development of the system"
• At EALs 6–7, the evaluator must "determine that the application of the production support procedures results in a TOE as provided by the developer for testing activities"

Criticism: is this good enough?
Details of Code Obfuscation
[Diagram: Implementation Representation → Implementation; an obfuscated ("shrouded") implementation representation may be provided.]

At EALs 4–7, the implementation representation must "be in the form used by the development personnel".

ADV_IMP application notes: "The components [of ADV_IMP] require details on the shrouding tools/algorithms . . . to gain confidence that the shrouding process does not compromise any security functionality."

Criticism: no element of ADV_IMP explicitly requires this detail
Implementation Standards
[Diagram: Implementation Representation, Tools, and Implementation Standards.]

At EALs 5–7:
• Developer must "describe and provide the implementation standards that are being applied"
• Evaluator must "confirm that the implementation standards have been applied"

ALC_TAT application notes: "Implementation guidelines may be accepted as an implementation standard if they have been approved by some group of experts (e.g. academic experts, standards bodies). Implementation standards are normally public, well accepted and common practise in a specific industry, but developer-specific implementation guidelines may also be accepted as a standard; the emphasis is on expertise."

Criticism: no element of ALC_TAT requires evidence of adequacy
Tool Correctness
[Diagram: Implementation Representation, Tools, and Implementation Standards.]

At EALs 4–7, tools must be "well-defined": "These are tools that are clearly and completely described. For example, programming languages and computer aided design (CAD) system[s] that are based on a standard published by standards bodies are considered to be well-defined."

• Impact of tool failure varies:
  – Might miss a defect or vulnerability
  – Might introduce a defect or vulnerability
• In some cases, tool failure might have severe consequences

Criticism: no element of ALC_TAT requires evidence of correctness
Misleading or Inadequate Explanation
• Either:
  – The objectives are too narrow (i.e. they don't support the higher levels of the argument), or
  – The objectives are too wide (i.e. they aren't supported by the evidence)
• Explanation is important
  – Defines the "spirit" of the requirements
  – Basis for making judgments of degree
Objective of Exhaustive Testing
ATE_COV objective at EALs 6 & 7: "In this component, the objective is to confirm that the developer performed exhaustive tests on all interfaces in the functional specification. The objective of this component is to confirm that all parameters of all of the TSFIs have been tested."
ATE_COV application notes: "The developer is required to demonstrate that the tests exercise all of the parameters of all TSFIs. This additional requirement includes bounds testing . . . and negative testing . . . . This kind of testing is not, strictly speaking, exhaustive, because not every possible value of the parameters is expected to be checked."
ATE_COV elements at EALs 6 & 7: the TSFIs must be "completely tested".

Criticism: the objective of exhaustive testing is deliberately not satisfied (and "completely" is undefined)
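The distinction the application notes draw, bounds plus negative testing versus genuinely exhaustive testing, can be made concrete. The interface name and its 1..3600 range below are invented for illustration.

```python
# Toy TSFI with one parameter; the name and the valid range 1..3600 are invented.
def set_session_timeout(seconds):
    """Accepts an integer number of seconds in 1..3600; rejects everything else."""
    if not isinstance(seconds, int) or not (1 <= seconds <= 3600):
        raise ValueError("rejected")
    return seconds

def outcome(value):
    try:
        set_session_timeout(value)
        return "accepted"
    except (ValueError, TypeError):
        return "rejected"

# "Complete" testing in the application-notes sense: bounds plus negative cases.
bounds_and_negative = {v: outcome(v) for v in [0, 1, 3600, 3601, -1, "abc"]}

# Exhaustive testing would enumerate every representable input value; even
# restricted to the valid integers alone, that is 3600 cases rather than 6.
print(bounds_and_negative)
print(len(bounds_and_negative), "cases vs 3600+ for exhaustive testing")
```

The handful of bounds and negative cases exercises every parameter of the interface, which is what the elements demand, yet it is clearly not exhaustive over the parameter's value space, which is what the objective claims.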
Other
• Issues that did not fit either of the other two categories
• Nearly all were issues of vagueness
“Focused” and “Methodical”
• At EALs 2 and 3: "The evaluator shall perform an independent vulnerability analysis of the TOE using the guidance documentation, functional specification, TOE design and security architecture description to identify potential vulnerabilities in the TOE."
• At EAL 4, this analysis must be "independent"
• At EAL 5, it must also be "focused"
• At EALs 6 and 7, it must instead be "methodical"
• Neither "focused" nor "methodical" is defined
Findings
• We applied our evaluation method to the CC
  – It was feasible
    • Took one analyst several weeks to capture and criticise
  – It found issues of interest
Threats to Validity of Findings
• Overlooking expected practice
  – Inexperienced analyst might not find all issues
• Might fail to find remote definitions
  – We searched the CC and normative references
  – Should be cross-referenced anyway
• Might not know common security parlance
  – Definitions vary; should be captured anyway
Threats to Validity of Findings
• Estimates of cost or feasibility might be wrong
  – Would lead to a false report of missing evidence
  – We considered only forms of evidence with established techniques and tool support
  – Suggest writing standards to keep with ACARP
Threats to Generalisability
• Some standards might lack a complete argument
  – Writers might consider it 'complete' without explanation at all levels
    • E.g., DO-178B lacks a top-level argument
  – Might lead to false-positive 'insufficient evidence' findings
    • Analyst must assume the highest importance
  – There really ought to be an argument
    • Might be presented separately
    • How else would we know the standard is sufficient?