Log Analysis with Nathan Hunstad

Posted on 02-Feb-2022


Log Analysis with

Presenter: Nathan Hunstad
May 2015

Obligatory Disclaimer

● This talk represents my own work: I am not representing any employer, organization, or affiliated group, past, present, or future

● This talk is based on my experiences in my home lab network and not in an enterprise setting

● This is an overview only and is provided without warranty: do not rely on what you learn here for compliance or legal obligations!

What is Log Analysis?

Not this:

What is Log Analysis?

Or this:

What is Log Analysis, really?

● Forensics: Reconstructing events that have already happened

● Incident Response: Acting on logs in real-time to identify, contain, and remediate security incidents

● Troubleshooting: Evaluating systems for faults or unintended behavior and fixing as necessary

Handling Logs

● Help!

Splunk

● Splunk captures all kinds of machine data – app log files, syslog, text files, configuration files... basically any text data can be ingested

● Splunk provides a powerful search engine based on MapReduce for fast searching [1]

● Splunk has add-ins that allow for quickly setting up dashboards and reports for common log sources

● No, I do not work for Splunk

[1] https://www.splunk.com/content/dam/splunk2/pdfs/technical-briefs/splunk-and-mapreduce.pdf

Splunk Licensing

● Splunk Enterprise: based on log volume

● Splunk Free: fewer features, 500MB/day

– Go over? You will lose search access!

– But good enough for home use

Splunk Licensing

● Average Logging Volume

Adding Data to Splunk

Getting Data Into Splunk

● Splunk Forwarder

– Install on any system to read log files locally and forward to Splunk Indexer

– Versions available for Windows, MacOS, Linux, Solaris, HPUX, AIX, and FreeBSD

– Configure using GUI or edit .conf files
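As a sketch of the .conf approach, a forwarder's outputs.conf might point at the indexer like this (the address and group name here are placeholders; 9997 is Splunk's conventional receiving port):

```ini
# outputs.conf on the forwarder: where to send collected events
[tcpout]
defaultGroup = home_indexer

[tcpout:home_indexer]
server = 192.168.1.10:9997
```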

Getting Data Into Splunk

● Listen on port

– Splunk daemon binds to a port to listen for traffic (TCP or UDP)

– Typically used with syslog data
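A hedged inputs.conf sketch for a syslog listener (the index name matches the ones described later in this deck):

```ini
# inputs.conf on the indexer: listen for syslog on UDP 514
[udp://514]
sourcetype = syslog
index = linux
```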

Getting Data Into Splunk

● Monitor Files/Directories

– Splunk daemon monitors individual files or an entire directory for new files/changes to files

– Computes CRC and bytes read on files to detect changes

– Can automatically decompress common formats like zip files
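A monitor input stanza might look like this (path, sourcetype, and index are illustrative):

```ini
# inputs.conf: watch a whole directory for new files or changes
[monitor:///var/log/httpd]
sourcetype = access_combined
index = website
```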

Getting Data Into Splunk

● Remote Hosts

– What if you can't install a forwarder on a remote host (for example, your shared web host)?

– My Solution: cron job + monitoring files
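One way to sketch that solution (remote host and paths are placeholders): pull the logs down on a schedule, then point a monitor input at the local copy.

```
# crontab entry: copy web host logs every 15 minutes,
# then let a [monitor://] input pick them up locally
*/15 * * * * rsync -az webhost:~/logs/ /opt/remote-logs/
```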

Splunk Basics

Indexes

● Indexes are the logical buckets into which data is stored

● By default, all data gets stored in the main index, but other indexes can be defined

● A number of internal indexes exist for tracking Splunk functionality and start with _, such as _internal and _audit

● Data retention and access control is done on a per-index basis
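Defining your own index is an indexes.conf stanza; a minimal sketch (paths are the conventional defaults):

```ini
# indexes.conf: define a custom index for firewall events
[firewall]
homePath   = $SPLUNK_DB/firewall/db
coldPath   = $SPLUNK_DB/firewall/colddb
thawedPath = $SPLUNK_DB/firewall/thaweddb
```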

Buckets

● Buckets are collections of index data and metadata

● Buckets age through several stages: Hot, Warm, Cold, and Frozen

● Not terribly important for home use, but managing retention becomes important for large data sets
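Retention is also set per index in indexes.conf; the values below are illustrative, not recommendations:

```ini
# indexes.conf: control when buckets age out
[firewall]
frozenTimePeriodInSecs = 7776000   # roll to frozen (deleted by default) after ~90 days
maxTotalDataSizeMB = 10000         # cap the total size of this index
```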

My Environment

● Splunk server: runs on a server running CentOS

● Feeds from VMs, Windows desktops, EdgeOS router, managed switch, application logs, external website

My Environment

● Data is split up into multiple indexes for logical grouping

● Indexes for firewall, switch, Linux, Windows, website, and BOINC events, plus a throwaway index for testing

Windows Events

● Events from Security, Application, and System logs

Windows Events

● PerfMon performance monitoring events

Linux Events

● Syslog events

Website Logs

● Multiple access logs

Website Logs

● Apache access_combined

Firewall Logs

● Dropped and specific accepted connections

Switch Logs

● Connected devices

Application Logs

● BOINC (Berkeley Open Infrastructure for Network Computing) events

Basic Search Syntax

Search Syntax

● Basic search: just type in what you want to see
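For instance (the index and keywords here are illustrative):

```spl
index=linux sshd "Failed password"
```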

Search Syntax

● Limiting by fields
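A sketch of field-based filtering (4624 is the Windows successful-logon event; the user value is a placeholder):

```spl
index=windows EventCode=4624 user=nathan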

Search Syntax

● Counting events: stats count
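For example, counting web requests by HTTP status (field names assume the standard access_combined extractions):

```spl
index=website sourcetype=access_combined | stats count by status
```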

Search Syntax

● Top events: top
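For example, the ten busiest client IPs (again assuming access_combined field extractions):

```spl
index=website sourcetype=access_combined | top limit=10 clientip
```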

Search Syntax

● Bucketing events and charting: timechart
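A sketch of a timechart, bucketing firewall events hourly per rule (RuleName matches the field used later in this deck):

```spl
index=firewall | timechart span=1h count by RuleName
```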

Security Events

Brute Force Windows

● Using ncrack against RDP
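A hedged search for spotting this kind of attack (4625 is the Windows failed-logon event; the source-IP field name and threshold depend on your extractions and environment):

```spl
index=windows EventCode=4625
| stats count by src_ip, user
| where count > 10
```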

Brute Force Windows

● Success!

Brute Force Linux

● Using the Metasploit ssh_login module and the default root_userpass.txt wordlist
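A sketch of the matching detection on the Linux side (the rex pattern and threshold are illustrative):

```spl
index=linux sourcetype=syslog "Failed password"
| rex "from (?<src>\d+\.\d+\.\d+\.\d+)"
| stats count by src
| where count > 10
```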

Port Scanning (External)

● Port Scanning: Same source IP, multiple destination ports

Port Scanning (Internal)

● Port Scanning: Same source IP, multiple destination ports

SQL Injection

● sqlmap against DVWA

● Apache logs sent to Splunk

Blind SQL Injection

● sqlmap/DVWA

XSS

● Persistent XSS on DVWA

Mimikatz

● Running mimikatz to dump hashes

● Nothing shows up in the logs

Correlation

● Transactions: group events together that match a pattern

● Successful login following failed logins

Correlation

● Show attackers in a table

● index=linux | transaction host,rhost startswith="eventtype=sshd-login-failure" endswith="eventtype=ssh_open" | bucket _time span=30m | table _time, rhost

More Splunking

Field Extraction

● Splunk can handle some log types automatically pretty well, but adding rules for field extraction can help with searching and indexing

● A number of extractions come with Splunk ready for use, or you can add your own

● Uses regex for extraction
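A custom extraction is a props.conf rule; a minimal sketch (sourcetype name and regex are illustrative):

```ini
# props.conf: pull a source IP field out of a custom log format
[my_custom_log]
EXTRACT-srcip = from host (?<src_ip>\d{1,3}(?:\.\d{1,3}){3})
```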

Lookups

● Uploading CSV files for extracting or expanding on data in logs
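As a sketch, an uploaded CSV can be joined into a search like this (devices.csv and its ip/name columns are hypothetical):

```spl
index=firewall
| lookup devices.csv ip AS SRC OUTPUT name
| timechart count by name
```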

Lookups

● Previous timechart, now with names

Data Models

● Data Models are a powerful way of structuring data to generate specialized searches and visualizations

● Can be used to generate pivot tables and other complex objects

Pivot Tables

● Based on defined data models

● Display data in tabular format

Dashboards

● Bringing all your data to one spot, with user-selectable attributes

Visualizations

● Looking closer at the “Website Attacks” dashboard:

– Logarithmic Y-axis

– Daily Buckets

Visualizations

● Grouping Events

– Attacks: index=website joomlafailure sourcetype="php_error" | transaction IP maxpause=1h maxevents=5000 | where eventcount>1 | table _time, IP, eventcount

– Port Scanning: index=firewall RuleName=WAN-*default-D | bucket _time span=30m | eventstats dc(DPT) AS PortsScanned by SRC, _time | where PortsScanned > 5 | dedup SRC, PortsScanned | table SRC, PortsScanned, _time

Visualizations

● Firewall Drops

Geolocation

● Splunk can geolocate IP addresses

Geolocation

● Search: index=website joomlafailure sourcetype="php_error" | transaction IP maxpause=1h maxevents=5000 | where eventcount>1 | iplocation IP | geostats latfield=lat longfield=lon sum(eventcount)

Splunk TAs

● Splunk comes with TAs (Technology Add-ons) with pre-defined field extractions, transformations, and dashboards

Happy Logging!
