log_analytics_public
TRANSCRIPT
AGENDA
• Parsing logs on the host - the “good old times”
• Logs collection: error logs and request logs
• Error logs parsing with Splunk
• Request logs parsing with ETL and MySQL
• Centralized logging with rsyslog
• Error logs parsing with Sumo Logic
• ELK for request logs
LOG4J CONFIGURATION
Send a batch of log entries via e-mail when either of two criteria is met:
✴ total size of buffered WARN-level messages reaches 10MB
✴ first ERROR-level message is encountered
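Both triggers map naturally onto Log4j 1.x's SMTPAppender, which buffers events and mails the batch when a triggering event arrives. A minimal sketch, assuming hypothetical host and address names; by default the appender fires on the first ERROR, while the size-based trigger needs a custom TriggeringEventEvaluator (the evaluator class name here is an assumption):

```properties
# Hypothetical log4j 1.x SMTPAppender sketch; hosts, addresses and the
# evaluator class are assumptions, not taken from the talk.
log4j.appender.mail=org.apache.log4j.net.SMTPAppender
log4j.appender.mail.SMTPHost=smtp.example.com
log4j.appender.mail.To=ops@example.com
log4j.appender.mail.From=app@example.com
log4j.appender.mail.Subject=Application log batch
log4j.appender.mail.BufferSize=512
# Custom evaluator: trigger on the first ERROR event, or once the buffered
# WARN entries reach 10MB in total size.
log4j.appender.mail.EvaluatorClass=com.example.SizeOrErrorEvaluator
log4j.appender.mail.layout=org.apache.log4j.PatternLayout
log4j.appender.mail.layout.ConversionPattern=%d %-5p %c - %m%n
```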
LOG COLLECTION
Request logs requirements:
✴ high log volume to process (30GB/day)
✴ long-term data store
✴ statistics as the main metric
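The ETL step from the agenda - parsing request logs into rows for MySQL - can be sketched as below. The Apache-style log format and field names are assumptions, not the actual format from the talk:

```python
# Hypothetical request-log parser for the ETL step; the log line format
# (common Apache-style request log) is an assumption.
import re

LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def parse(line):
    """Parse one request-log line into a dict ready for a MySQL INSERT,
    or return None for lines that do not match."""
    m = LINE.match(line)
    if not m:
        return None
    row = m.groupdict()
    row["status"] = int(row["status"])
    row["bytes"] = 0 if row["bytes"] == "-" else int(row["bytes"])
    return row
```

At 30GB/day the statistics matter more than individual lines, so a real pipeline would aggregate these rows (e.g. counts per status and path) before loading them into MySQL.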
CENTRALIZED LOGGING
Requirements:
✴ dynamic environment ready
✴ live log streaming
✴ guaranteed log transfer
✴ override and tag support
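The guaranteed-transfer requirement is what rsyslog's disk-assisted action queues provide. A minimal forwarding sketch in legacy rsyslog syntax; the central host name is an assumption:

```conf
# Hypothetical rsyslog client config; logs.example.com is an assumption.
# A disk-assisted queue plus infinite retries gives guaranteed transfer
# even when the central receiver is temporarily down.
$ActionQueueType LinkedList          # in-memory queue...
$ActionQueueFileName fwd_queue       # ...that spills to disk when needed
$ActionQueueMaxDiskSpace 1g
$ActionQueueSaveOnShutdown on
$ActionResumeRetryCount -1           # retry forever
*.* @@logs.example.com:514           # @@ = TCP (reliable), @ = UDP
```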
SUMO LOGIC
Why is it better than Splunk?
✴ External SaaS
✴ Pricing policy
✴ S3 support out of the box
✴ Scheduled queries
✴ PagerDuty support
AGGREGATED METRICS
✴ Retention policy - 1 month
✴ Aggregation into separate ES index
✴ Custom Python script using ES facets
✴ Aggregated dashboard
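The aggregation script can be sketched in Python; the field names and the facet name are assumptions, and the query shape follows the pre-2.0 Elasticsearch "facets" API that predates aggregations:

```python
# Hypothetical sketch of the aggregation step. POSTing facet_query() to
# /<index>/_search and feeding the JSON reply to flatten_facet() yields
# {term: count} pairs ready to index into the separate aggregated ES index.
def facet_query(field, size=10):
    """Build a terms-facet search body for the given field."""
    return {
        "size": 0,  # only the facet counts are needed, not the hits
        "facets": {"agg": {"terms": {"field": field, "size": size}}},
    }

def flatten_facet(response):
    """Turn a terms-facet response into a {term: count} dict."""
    return {t["term"]: t["count"] for t in response["facets"]["agg"]["terms"]}
```

Running this on a schedule against the raw index lets the aggregated index outlive the 1-month retention of the raw data.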
CLIENT ID TO CLIENT NAME
• URL contains client ID
• Logstash supports many community plugins
• Translate plugin
• YAML file with mappings
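The lookup itself is the community translate filter; a minimal sketch, assuming hypothetical field names and dictionary path (older plugin versions use field/destination, newer ones source/target):

```conf
filter {
  # Replace the client ID extracted from the URL with a human-readable
  # name looked up in a YAML dictionary file.
  translate {
    field           => "client_id"               # assumed field name
    destination     => "client_name"
    dictionary_path => "/etc/logstash/clients.yml"
    fallback        => "unknown"
  }
}
# /etc/logstash/clients.yml (hypothetical mappings):
#   "42": "Acme Corp"
#   "43": "Globex"
```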
MULTIPLE DATACENTERS
• Splitting an Elasticsearch cluster between regions - not a recommended scenario
• Elasticsearch tribe node feature
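A tribe node joins several independent clusters and serves federated searches across them, so neither cluster has to span regions. A minimal elasticsearch.yml sketch; the tribe and cluster names are assumptions:

```yaml
# Hypothetical tribe-node config; cluster names are assumptions.
# Each region keeps its own local cluster; the tribe node reads from both.
tribe:
  us_east:
    cluster.name: logs-us-east
  eu_west:
    cluster.name: logs-eu-west
```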