SRS Phase II PI Meeting 18 December 2007
Strengthen, Prepare, Detect, React (SPDR)
to Mitigate the Insider Threat
PI Meeting, 18 December 2007
Tom Haigh, Dick O’Brien
Adventium Labs
Outline
• Overview
  – Problem and objective
  – SPDR solution
• Progress
  – Stable online system
  – Off-line components
  – Red team ROE
• Attribution
  – Mission tracking
  – Attribution Module
Problem & Objectives
Problem
• Insiders have detailed knowledge and legitimate access
• Possible to mount particularly dangerous attacks
• No good methods for identifying malicious insiders

Objectives
• Severely limit the malicious insider’s potential to do harm
  – Thwart 50% of attacks, with (incorrect thwart : correct thwart) < 1
  – Detect and attribute 90% of attacks, with (false positives : correct D&A) < 10
• Enable individuals to accomplish their authorized tasks
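The objective ratios above can be checked mechanically from evaluation counts. A minimal scoring sketch (the count names are illustrative, not the program's actual scoring schema):

```python
# Check the SPDR objectives against evaluation counts.
# Thwart >= 50% of attacks with incorrect:correct thwarts < 1,
# detect & attribute >= 90% with false positives : correct D&A < 10.
def meets_objectives(counts):
    thwart_rate = counts["thwarted"] / counts["attacks"]
    incorrect_ratio = counts["incorrect_thwarts"] / max(counts["thwarted"], 1)
    da_rate = counts["detected_attributed"] / counts["attacks"]
    fp_ratio = counts["false_positives"] / max(counts["detected_attributed"], 1)
    return (thwart_rate >= 0.5 and incorrect_ratio < 1
            and da_rate >= 0.9 and fp_ratio < 10)
```

For example, a run with 10 attacks, 6 thwarted (2 incorrect thwarts), 9 detected and attributed, and 5 false positives satisfies all four criteria.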
SPDR Approach
Scenario and Testbed
Scenario
• Carrier-based air mission planning
• Uses GD tactical SOA
• Embedded in larger carrier network
• Appears consistent with Navy direction: single network, SOA-based applications

Testbed: Winter 07
• DSM and Web server are DRED-less
• No non-mission workstation
• Smartcard readers on each DRED
[Testbed diagram: IOS, CDC, SOS, and AOS workstations, each behind a DRED (Snort, proxy, iptables) with a card reader; a DSM hosting the Provenance Monitor, Mission Monitor, Plan Recognition, Response Module, and Policy Management; and a Web server.]
Progress and Plans
Timeline: Now → January → February → April

Completed
• Proxy and filters
• Stabilized smart card support
• DSM – Plan Recognition interface
• Initial attack tree
• Set of responses & response selection process

In Progress
• Finalize ROE
• Separate testbed from Adventium network
• Populate network & adversary models
• Attribution Module

Fine-Tuning
• Refine the plan library
• Field additional sensors
• Adjust Plan Recognition probability model

Followed by Internal Red Team and External Red Team evaluations
Online Components Are Complete
DRED migration to Linux completed
● More robust smart card support
● Better proxy support
● SELinux policy enforcement
● Do not expect to implement honeynet
● Using Red Hat; would minimize for an operational system
Ready to test with scenario
Maintain stable target for evaluation January to June
[Data-flow diagram: Linux-based DREDs send sensed events to Plan Recognition and proxy-generated metadata to the Provenance & Mission Monitors, which report mission progress; Plan Recognition sends plan likelihoods to Response, which issues response commands back to the DREDs; a Policy Database supplies per-host and per-user policy.]
Response Module
[Diagram: Plan Recognition feeds the Response module via syslog; Response reads a configuration DB, tracks state, displays info, receives events from the DREDs, and can push a pop-up warning.]
Plan Recognition informs the Response Module of possible active attack plans and their likelihood. Based on the likelihood of attack, the Response Module initiates a predetermined response.

Possible responses:
● Pop up a warning
● Stop the mission
● Check a host
● Review before proceeding
● Block a host
● Block a user (on a host)
● Block a service (for a user) (on a host)
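The likelihood-driven selection of a predetermined response can be sketched as a threshold table. The thresholds and the response ordering below are illustrative assumptions; the actual SPDR mapping is configuration data not given in the slides:

```python
# Sketch of likelihood-driven response selection: pick the strongest
# predetermined response whose threshold the reported attack-plan
# likelihood meets or exceeds. Thresholds here are assumed, not SPDR's.
RESPONSES = [
    (0.3, "pop_up_warning"),      # low likelihood: warn the user
    (0.6, "check_host"),          # moderate: check the suspect host
    (0.8, "block_user_on_host"),  # high: block the user on that host
    (0.95, "stop_mission"),       # near-certain: stop the mission
]

def select_response(plan_likelihood: float) -> str:
    chosen = "log_only"  # below every threshold: just record the report
    for threshold, response in RESPONSES:
        if plan_likelihood >= threshold:
            chosen = response
    return chosen
```

For example, a reported likelihood of 0.7 selects `check_host`, while 0.97 selects `stop_mission`.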
Response Display
Displays current status including:
• Current blocks
• Current users logged in to clients and authenticated to DREDs
• All messages sent by the PRM to the Response Module
• All responses taken by the Response Module
• Status of current response
RM also records this information in a database for use by the Attribution Module and for scoring purposes
Attack Plan & Sensor Generation

Threat Model
• Adversary goals
• Adversary capabilities

Network Model
• Topology
• Hosts, services, protocols
• Critical assets
• Detectable events

Reasoning Engine

Outputs
• Attack plan library
• Sensors and locations (use host-based sensors)

[Diagram legend: components marked "under construction" or "complete".]
Initial Attack Trees
Goals: disrupt air mission; induce incorrect mission
– Exfiltrate/modify
  • Known threats
  • Available assets
  • Flight plans or objectives
– Avoid attribution
– Delay
  • Receipt or processing of threat information
  • Preparation of mission plan
  • Cancellation of incorrectly planned mission

Attack Mechanisms
– Insider-specific exploits
  • Physical presence
  • Authorized access
  • Subtle semantic attacks
– More general exploits
  • Introduction of malware
  • Chaff to overwhelm SPDR components
Confidentiality Attack Tree (CA)
Confidentiality Attack (CA): steal information AND exfiltrate information, optionally utilizing chaff

Steal information
1. Access storage
   1. Use legitimate privileges
   2. Escalate to sufficient privilege on host
   3. Bypass operating system
   4. Authenticate as user with sufficient privileges
   5. Exploit a vulnerability in an application with sufficient privileges
   6. Trick user with sufficient privilege into installing spyware that steals information
2. Intercept communications
3. Visual acquisition
4. Trojan an application

Exfiltrate information
1. Conceal in channel (iterate if desirable): select channel & select encoding & encode and transmit
2. Use physical medium

Utilize chaff (option)
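An attack tree of this shape can be represented as nested AND/OR nodes and evaluated against a set of observed leaf events. This is a minimal sketch; the node names are drawn from the slide, but the grouping into AND/OR is a simplified reading of the tree:

```python
# Minimal AND/OR attack-tree sketch: the confidentiality attack succeeds
# when the attacker steals the information (OR over sub-steps) AND
# exfiltrates it (AND of channel selection, encoding, and transmit).
def OR(*children):
    return ("OR", children)

def AND(*children):
    return ("AND", children)

def leaf(name):
    return ("LEAF", name)

def achieved(node, events):
    """True if the goal is achieved given the set of observed leaf events."""
    kind, body = node
    if kind == "LEAF":
        return body in events
    results = [achieved(c, events) for c in body]
    return all(results) if kind == "AND" else any(results)

CA = AND(
    OR(leaf("use_legitimate_privileges"), leaf("escalate_privilege"),
       leaf("steal_credential"), leaf("install_spyware")),
    AND(leaf("select_channel"), leaf("select_encoding"),
        leaf("encode_and_xmit")),
)
```

With this structure, observing a credential theft plus all three exfiltration steps satisfies the root goal, while a credential theft alone does not.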
CA-1: Access Storage
MI learns information in file f on host H using program P. Blue indicates an event that could be sensed.

Subtrees:
1-1: Use legitimate privileges {MI can run P}
1-2: Local escalate privilege {U can run P}
1-3: MI bypasses operating system
1-4: Steal credential {U can run P}
1-5: Escalate privilege {service S on H is vulnerable & U can run P on H}
1-6: Trick U into installing spyware {U can run P}

[Attack-tree diagram: leaf events include "MI reads f", "MI opens f", "U installs spyware", "MI steals U's credential", "MI sends U (link to) spyware", "MI boots H with new OS", "MI halts H", "Authenticate as U / as MI / on H but not as U", "Run P as U", and "mount/run exploit to escalate to privileges of U".]
SPDR Red Team Evaluation
• Draft ROE complete
  – Currently refining definitions and metrics; White Team providing good input
  – Identifying non-metric-related objectives
  – Establishing principles for attack distribution
• Issue
  – Obtaining proprietary GD SOA code for Red Team
  – Need object code to run the scenario code
  – Alerted GD of need to establish NDA with SRA/RABA
Attribution
Two components
• Mission independent sensors and attribution
– DRED network attack sensors
– Host-based sensors
– Authentication anomalies
• Mission dependent sensors and attribution
– Mission tracking
[Architecture diagram: IOS, CDC, SOS, and AOS workstations, each behind a DRED (Snort, proxy, iptables) with a card reader; a DSM hosting the Provenance Monitor, Mission Monitor, Plan Recognition, Response Module, and Policy Management; and a Web server.]
Mission Tracking
• Mission model captures mission workflow
– Temporal dependencies
– Functional dependencies
– Message invariants, pre and post conditions
• XML message schemas capture message format
• Mission model and message schemas allow monitoring of ongoing missions to
  – Identify possible attempts to corrupt the mission by an insider
  – Attribute the failure of a mission to the persons most likely responsible
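The temporal-dependency part of the mission model can be sketched as a prerequisite map: a monitor flags any message that arrives before its prerequisites. The message names are taken from the flight-planning sequence elsewhere in the deck; the dependency map itself is an illustrative assumption:

```python
# Sketch of mission-model order checking: each message type lists the
# messages that must precede it; a message arriving early is an anomaly.
# The dependency map below is assumed, not SPDR's actual mission model.
PREREQS = {
    "QueryAssets": ["CreateFlightPlan"],
    "GetFlightPlanResp": ["CreateFlightPlan"],
    "BriefFlightPlanResp": ["GetFlightPlanResp"],
    "ReleaseAssets": ["BriefFlightPlanResp"],
}

def check_sequence(messages):
    """Return the messages that violated a temporal dependency."""
    seen, anomalies = set(), []
    for msg in messages:
        if any(pre not in seen for pre in PREREQS.get(msg, [])):
            anomalies.append(msg)
        seen.add(msg)
    return anomalies
```

A well-ordered mission produces no anomalies; a `ReleaseAssets` message seen before any flight plan exists would be reported.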
Mission Service Proxy
• Mission software based on Service-Oriented Architecture
• SOA services are proxied in the DRED
  – Messages from the local service are invisibly redirected to the internal proxy, and from there to the remote peer
• XML Schemas impose strong constraints on legal messages
• DRED SOA proxy validates each mission message
  – Alerts on defective messages
  – Can drop or scrub
• Proxy hardened via SELinux and vulnerability analysis
[Diagram: the local service's messages pass through the DRED's HTTP proxy and XML validator, driven by the schema from the DSM, on their way to the remote peer.]
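The proxy's per-message validation can be illustrated with a simplified stand-in: a real deployment validates against the mission's XML Schemas, while this sketch only checks that a message parses and contains an assumed set of required elements (the message type and element names are hypothetical):

```python
# Simplified stand-in for the DRED proxy's schema validation: check that a
# message parses and contains required elements. Real validation uses XSD;
# the required-element table here is an illustrative assumption.
import xml.etree.ElementTree as ET

REQUIRED = {"CreateFlightPlan": ["MissionId", "Aircraft", "Target"]}

def validate_message(xml_text: str):
    """Return (ok, reason); defective messages can be alerted on or dropped."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as exc:
        return False, f"malformed XML: {exc}"
    missing = [tag for tag in REQUIRED.get(root.tag, [])
               if root.find(tag) is None]
    if missing:
        return False, f"missing elements: {missing}"
    return True, "ok"
```

A message missing a required element is rejected with a reason the proxy could log or alert on.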
Mission & Provenance Monitors
• MM tracks the progress of each mission with respect to its plan
  – Uses functional dependency and timing model of the mission
  – Receives mission message notifications from DRED proxy
  – Reports mission step anomalies or delays to the PRM
• PM preserves message data & metadata that can be used, if a mission fails, to help identify the cause of the failure and the user responsible
  – Receives message information from DREDs
  – Stores info in database for use in attribution analysis
  – Checks message data and notifies PRM of consistency errors
[Sequence diagram across GIG, IOS, CDC, SOS, AOS: ThreatUpdated, CarrierThreatUpdated, CreateFlightPlan, QueryAssets, GetFlightPlanResp, BriefFlightPlanResp, ReleaseAssets.]
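The PM's record-keeping can be sketched with an embedded database. The slides say only that message data and metadata are stored for attribution analysis; the table schema and queries below are illustrative assumptions:

```python
# Sketch of the Provenance Monitor's message store using sqlite3.
# The schema (timestamp, DRED, user, message type, payload) is assumed.
import sqlite3

def make_store():
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE provenance (
        ts REAL, dred TEXT, user TEXT, msg_type TEXT, payload TEXT)""")
    return db

def record(db, ts, dred, user, msg_type, payload):
    db.execute("INSERT INTO provenance VALUES (?, ?, ?, ?, ?)",
               (ts, dred, user, msg_type, payload))

def messages_from(db, user):
    """Post-mortem query: all message types a given user originated, in order."""
    return db.execute(
        "SELECT msg_type FROM provenance WHERE user = ? ORDER BY ts",
        (user,)).fetchall()
```

After a mission failure, queries like `messages_from` let the Attribution Module trace which user touched which mission messages.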
Attribution Module
[Diagram: the AM receives observables from the DREDs, PRM, MM, PM, Attack Effects, and hosts, and emits accusations & likelihoods.]

Performs on-line and post-mortem attribution of attacks; attributes an attack to a source host and probable insider based on observed events.

Observables
• DRED: Snort alerts on network attacks; proxy alerts on bad messages; heartbeat alerts; authentications
• PRM: likely attacks in progress
• PM: inconsistent/bad message content
• MM: mission status, abnormal states
• Attack Effects: success/failure of mission
• Host: logins, other TBD
Attribution Reasoning
• A Bayesian belief network is used to assess the likelihood of possible attribution hypotheses; leaf nodes are potential observables, and the attribution assessment is updated on the arrival of new evidence.
• Mission functional dependencies are built into the network to support attribution of "semantic attacks" by the insider that are not directly observable but that can be detected post mortem.
• Much of the network is mission independent.
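The evidence-driven updating can be illustrated with a single Bayes step over competing source hypotheses. This is a toy: the hypotheses and likelihood numbers are invented, and the real module reasons over a full belief network of the observables listed above, not a flat table:

```python
# Toy Bayesian update for attribution: each new observable multiplies each
# hypothesis's prior by the likelihood of that evidence under the
# hypothesis, then renormalizes. All numbers here are illustrative.
def update(beliefs, likelihoods):
    """One Bayes step; beliefs and likelihoods map hypothesis -> probability."""
    posterior = {h: beliefs[h] * likelihoods.get(h, 1e-9) for h in beliefs}
    total = sum(posterior.values())
    return {h: p / total for h, p in posterior.items()}

# Uniform prior over which station originated a corruption.
beliefs = {"IOS": 1 / 3, "CDC": 1 / 3, "SOS": 1 / 3}
# A bad-message alert from the IOS DRED is far more likely if IOS is the source.
beliefs = update(beliefs, {"IOS": 0.8, "CDC": 0.1, "SOS": 0.1})
```

After this single observation the belief in IOS rises to 0.8, and further observables would sharpen or shift the assessment as they arrive.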
Mission Derived Elements
Portions of the attribution graph can be automatically derived from the mission model; e.g., corruption of the mission threat's target position (nodes IosCorruptTargetPos, CdcCorruptTargetPos, FailureBadTargetPos) can only occur in a few steps and can only be influenced by two stations.

This is offline analysis compiled into the operational attribution engine.
SPDR Plans and Timeline (Dec FY-07 through Jun FY-08)

Tasks
1: Requirements, Architecture, Design, Evaluation Plan (CDRL A003)
2: Spiral 1 (Component Development)
3: Spiral 2 (Integrated System)
4: Spiral 3 (System Hardening)
5: Test and Evaluation
6: Program Management & Travel

Milestones
1. CDRL A003 delivered (Feb.); revisions in July 07, January 08, and June 08
2. Initial DRED: user-based IP filtering and sensing (Mar.)
3. Initial planner: strategic and tactical levels (Apr.); NetBase ontology (network model) shared with BBN
4. Scenario defined: carrier flight planning (Mar.)
5. Most components operational (Jul.)
6. Functional model of scenario (Jun.)
7. Testbed complete (Oct.)
8. Initial integration (Oct.)
9. E2E demonstration (Jan.)
10. Final software build complete (Mar.)
11. Red Team evaluation complete (May)

[Gantt chart legend: Program Review, Deliverable (update), Final Report]
Strengthen, Prepare, Detect, React (SPDR)

STATUS QUO
• Audit and IDS tools are human intensive, error-prone, post facto.
• Perpetrators can operate for years before being detected.

SPDR ACHIEVEMENT
MAIN ACHIEVEMENT (12/07): Stable online components integrated and working with evaluation testbed.
[Insert new figure with DREDs and intelligent core.]

HOW IT WORKS:
• AI planning creates a library of insider attack plans and generates sensors to detect the plans.
• Unbypassable, tamper-resistant sensor-effector platform (DRED).
• Automated plan recognition drives effector responses.
• Provenance monitoring drives attribution.

ASSUMPTIONS AND LIMITATIONS:
• Network plays an essential role in attacks; limited ability to thwart effects on the attacker's host.
• Applications and missions are well-structured, amenable to automated modeling and analysis.

NEW INSIGHTS
• Scalable, model-driven analysis tools provide leverage for traditional IDS/IPS.
• Automate formerly human-intensive processes for speed & completeness.
• Affordable, easily deployed hardware for transparent, highly assured monitoring and control of insiders.
• Detect and thwart malicious insiders using automated, predictive reasoning.

QUANTITATIVE IMPACT
• Thwart 50% of attacks (false thwart : correct thwart < 1:1).
• Detect/attribute 90% of attacks (false positive : correct < 10:1).
• Less than 10% loss in productivity for honest users.

END-OF-PHASE GOAL
• Satisfy metrics above for carrier-based mission planning testbed.
• Identify technology path for extending SPDR capabilities to more general DoD networks.