Efficient Detection of Split Personalities in Malware
Davide Balzarotti, Marco Cova, Christoph Karlberger, Christopher Kruegel, Engin Kirda, and Giovanni Vigna
NDSS, February 2011
OUTLINE
Introduction and Related Work
Our Approach
Implementation
Evaluation
Conclusion
Introduction
Malware detection
◦ Static
◦ Dynamic
Sandboxes (Anubis, CWSandbox, Joebox, Norman Sandbox)
Counterattacks
◦ Attacks on virtual machine emulators
◦ CPU semantics, timing attacks
◦ Environment attacks (processes, drivers, or registry values)
Solution 1: Transparent malware analysis
Cobra
◦ Code blocks
◦ Replaces instructions with a safe version
Ether
◦ Hardware virtualization
Harder for malicious code to detect, but slow.
Solution 2: Detect behavioral differences
"Emulating Emulation-Resistant Malware", 2009
◦ Reference system vs. emulated environment
◦ Compare execution paths
◦ Uses Ether to produce the reference trace
Problem: executing the same program twice can lead to different execution runs.
OUTLINE
Introduction and Related Work
Our Approach
Implementation
Evaluation
Conclusion
Our approach
Recording and replaying
◦ Reference system vs. emulated environment
◦ Compare the system call trace: types and arguments
If there is a behavioral difference
◦ Rerun the sample in a transparent framework (Ether)
Goal: detect malware reliably and efficiently
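The comparison step above can be sketched roughly as follows. This is an illustrative sketch, not the authors' implementation: the trace format (a list of (name, arguments) pairs) and the function name are assumptions.

```python
# Hypothetical sketch: a trace is a list of (syscall_name, arguments) pairs
# recorded on the reference system and on the analysis system.

def first_divergence(reference_trace, analysis_trace):
    """Return the index of the first differing system call,
    or None if the two traces match."""
    for i, (ref, ana) in enumerate(zip(reference_trace, analysis_trace)):
        if ref != ana:  # name or arguments differ -> candidate split personality
            return i
    if len(reference_trace) != len(analysis_trace):
        # one trace is a strict prefix of the other
        return min(len(reference_trace), len(analysis_trace))
    return None
```

A non-None result flags the sample for re-analysis in a transparent framework such as Ether.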
Reliability
Two systems are execution-equivalent if, for every program that
◦ Starts from the same initial state
◦ Receives the same inputs on both systems
=> Same runtime behavior => Same sequence of system calls
Assumption: no race conditions
Reliability (cont.)
If our reference system and the analysis system are execution-equivalent, any difference in the observed behavior => split personality
Moreover, such a discrepancy must be the result of CPU semantics or timing attacks
Making Systems Execution-Equivalent
Same OS environment
Same address space layout of the process at load time
Same inputs to the program
◦ Run the program on the reference system in log mode
◦ Run the program on the analysis system in replay mode
◦ System call matching
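The log/replay pair above can be sketched in miniature. This is a conceptual sketch under assumed names and data structures; the real system is a Windows kernel driver, not Python.

```python
# Hypothetical sketch: log mode records each system call's result;
# replay mode serves recorded results back in order, so the analysis
# system sees exactly the inputs the reference system saw.

class SyscallRecorder:
    def __init__(self):
        self.log = []

    def on_syscall(self, name, args, execute):
        result = execute(name, args)  # run the real system call
        self.log.append({"name": name, "args": list(args), "result": result})
        return result

class SyscallReplayer:
    def __init__(self, log):
        self.log = list(log)
        self.pos = 0

    def on_syscall(self, name, args, execute):
        entry = self.log[self.pos]
        assert entry["name"] == name  # matching: call sequence must line up
        self.pos += 1
        return entry["result"]        # feed the recorded result back
```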
Replay Problem
A number of system calls are not safe to replay
◦ Allocating memory, spawning threads
Only replay those system calls that read data from the environment
◦ Other system calls are passed directly to the underlying OS
Delays cause additional system calls
◦ WaitForSingleObject()
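The replay-or-pass-through decision can be sketched as a simple classification. The set of call names below is illustrative only, not the paper's actual list.

```python
# Hypothetical classification: only system calls that read data from the
# environment are answered from the recorded log; state-changing calls
# (memory allocation, thread creation) go to the underlying OS.

ENVIRONMENT_READS = {
    "NtQuerySystemInformation",
    "NtQueryValueKey",
    "NtReadFile",
}

def should_replay(syscall_name):
    """True if the call is safe to answer from the recorded log."""
    return syscall_name in ENVIRONMENT_READS
```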
System Call Matching
OUTLINE
Introduction and Related Work
Our Approach
Implementation
Evaluation
Conclusion
Implementation
A kernel driver
◦ Traps all system calls by hooking the System Service Descriptor Table (SSDT)
◦ For each system call, two handlers: log and replay
A user-space application
◦ Starts and controls the driver
◦ Starts the process to be analyzed
◦ Stores the data generated during the logging phase
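The two-handler arrangement can be sketched as a dispatch table. This is a conceptual illustration; the actual system hooks the SSDT from a Windows kernel driver in C, and all names here are assumptions.

```python
# Hypothetical sketch: each hooked system call has a log handler and a
# replay handler; the active mode selects which one runs.

HOOKS = {}

def hook(name, log_handler, replay_handler):
    HOOKS[name] = {"log": log_handler, "replay": replay_handler}

def dispatch(mode, name, *args):
    # mode is "log" on the reference system, "replay" on the analysis system
    return HOOKS[name][mode](*args)

# Example registration for one call:
hook("NtOpenFile",
     log_handler=lambda path: f"logged open of {path}",
     replay_handler=lambda path: f"replayed open of {path}")
```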
Practical aspects
Handle consistency
◦ Live handles vs. replayed handles
◦ Check against a list of all replayed handles
Networking
◦ NtDeviceIoControlFile()
◦ Device-dependent parameters
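The handle-consistency bookkeeping can be sketched as follows; the class and method names are hypothetical, chosen only to illustrate the idea of tracking which handles came from the log.

```python
# Hypothetical sketch: handles returned during replay come from the log,
# so they must be tracked and distinguished from live handles that the
# underlying OS actually created.

class HandleTable:
    def __init__(self):
        self.replayed = set()

    def record_replayed(self, handle):
        self.replayed.add(handle)

    def is_replayed(self, handle):
        # A call that consumes a replayed handle must itself be answered
        # from the log; a live handle can go to the real OS.
        return handle in self.replayed
```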
Practical aspects (cont.)
Deferred results
◦ STATUS_PENDING
◦ NtWaitForSingleObject()
Thread management
◦ NtCreateThread()
◦ Each thread gets its own log
Limitations
Memory-mapped files
◦ DLLs
◦ Files created with memory mapping
◦ Remove these system calls
Multiple processes
Random numbers
◦ KsecDD
Inter-process communication and asynchronous calls
◦ Postponing checks
OUTLINE
Introduction and Related Work
Our Approach
Implementation
Evaluation
Conclusion
Evaluation
Microsoft Windows XP Service Pack 3
VMware virtual machine
Anubis system (QEMU-based)
1. Log and replay six programs (success)
2. SDBot (fail)
◦ Spawns new processes, e.g., NtCreateProcess
3. Six different versions of VMware-detection code (success)
◦ Red Pill, Scoopy, VMDetect, and SourPill
Evaluation (cont.)
3. Real malware with no VM checks
Evaluation (cont.)
4. Real malware with VM checks
Performance
Depends on the type of operation
◦ Average 1% overhead
Test: compress a 1 KB random file
CMD: 7za.exe a test.zip 1KB_rand_file
Anubis: 4.267 sec
Ether: 77.325 sec
Our VMware reference system: 1.640 sec
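As a quick back-of-the-envelope check, the timings on this slide imply roughly the following slowdown factors relative to the VMware reference system:

```python
# Relative slowdown of each analysis system versus the VMware reference,
# computed from the timings reported on the slide.

vmware = 1.640   # seconds
anubis = 4.267
ether = 77.325

anubis_slowdown = anubis / vmware   # roughly 2.6x
ether_slowdown = ether / vmware     # roughly 47x
```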
OUTLINE
Introduction and Related Work
Our Approach
Implementation
Evaluation
Conclusion
Conclusion
A prototype that records system calls and replays them
A fully transparent analysis system is still needed for further examination