Event Logging with Kafka and ELK

If you don't already have an infrastructure to collect and deliver data to DarkLight, this tutorial will guide you through deploying compatible server middleware and host agents. We will be recreating the infrastructure shown in the image below; however, you can break away from the tutorial at any time to implement your own solutions. We chose this specific infrastructure because its use of Kafka allows other ELK instances (for testing configurations) or other infrastructure tools to run alongside the production ELK instance.

This page will help you install and configure two Kafka servers, plus a separate Elasticsearch, Logstash, Kibana (ELK) Linux server. The first Kafka server is your data pipe: a single topic (beats.raw) carrying a constant flow of data from all of your sources. The ELK server reads that mass of data and sorts it into indexes in Elasticsearch. Simultaneously, it sends the data to the second Kafka server, split into topics by the event type Logstash assigns (e.g. winlogbeat, osquery). From there you will create data feeds in DarkLight that subscribe to the different topics. This gives you a lot of flexibility with your data: you can select only the topics of interest, or even split your data between DarkLight instances by splitting your data feeds.
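To make that flow concrete, here is a minimal sketch of the kind of Logstash pipeline this architecture implies, written as a shell heredoc that drops a config file into /etc/logstash/conf.d/. It is illustrative only: the file name, IP placeholders, index pattern, and topic naming are our assumptions, and the Chef script in the ELK section below installs its own configuration files that you should adapt instead.

cat > /etc/logstash/conf.d/kafka-pipeline.conf <<'EOF'
# Read everything the shippers publish to the first Kafka server
input {
  kafka {
    bootstrap_servers => "KAFKA_IN_IP:9092"
    topics            => ["beats.raw"]
    codec             => "json"
  }
}

# Index into Elasticsearch and, in parallel, republish to the second
# Kafka server with one topic per event type (e.g. logstash.wineventlog)
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
  kafka {
    bootstrap_servers => "KAFKA_OUT_IP:9092"
    topic_id          => "logstash.%{type}"
    codec             => "json"
  }
}
EOF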

The following "Server Setup" section will walk you through setting up a KafkaELKKafka environment:

Prerequisites:

We find it easier to configure servers as virtual machines. If you are looking for a good virtualization manager, we would highly suggest looking into Proxmox: https://www.proxmox.com/en/

  • 3 CentOS 7 Linux systems (minimal install is OK; no graphical interface required)
  • A static IP address on each system, with internet access to download install files

Kafka

Official Documentation: https://kafka.apache.org/quickstart

Follow these instructions on two of the three CentOS servers you have set up. These will be the first and second Kafka servers.

Quick:

Kafka is very easy to install and run. The following commands can be copied and pasted into your terminal to quickly install and start your Kafka server. Below the code block, we will walk through each command step by step so you know what you're running.

yum install wget java -y
wget -O /opt/kafka.tgz http://mirrors.koehn.com/apache/kafka/2.0.0/kafka_2.11-2.0.0.tgz
mkdir /opt/kafka
tar -xzf /opt/kafka.tgz -C /opt/kafka --strip-components=1 && rm -rf /opt/kafka.tgz
sudo firewall-cmd --permanent --zone=public --add-port=2181/tcp
sudo firewall-cmd --permanent --zone=public --add-port=9092/tcp
sudo systemctl restart network && sudo systemctl restart firewalld
/opt/kafka/bin/zookeeper-server-start.sh -daemon /opt/kafka/config/zookeeper.properties
/opt/kafka/bin/kafka-server-start.sh -daemon /opt/kafka/config/server.properties
Explained:

Download and install wget and Java. Wget will be used to download the Kafka archive; Kafka runs on the JVM, so we also install the default Java (OpenJDK) package.

yum install wget java -y

Download the Kafka 2.0.0 package to /opt/kafka.tgz. Check https://kafka.apache.org/quickstart (under "Step 1: Download the code") for a newer version.

wget -O /opt/kafka.tgz http://mirrors.koehn.com/apache/kafka/2.0.0/kafka_2.11-2.0.0.tgz

Create the /opt/kafka directory:

mkdir /opt/kafka

Extract the downloaded .tgz file into the /opt/kafka folder (--strip-components=1 drops the versioned top-level directory) and remove the archive:

tar -xzf /opt/kafka.tgz -C /opt/kafka --strip-components=1 && rm -rf /opt/kafka.tgz

Open TCP port 2181 (ZooKeeper) and TCP port 9092 (the Kafka broker itself, which your shippers, Logstash, and DarkLight connect to):

sudo firewall-cmd --permanent --zone=public --add-port=2181/tcp
sudo firewall-cmd --permanent --zone=public --add-port=9092/tcp

Restart the network and firewalld services so the permanent firewall rules take effect:

sudo systemctl restart network && sudo systemctl restart firewalld

Start the ZooKeeper service with the default zookeeper.properties file:

/opt/kafka/bin/zookeeper-server-start.sh -daemon /opt/kafka/config/zookeeper.properties

Start the Kafka service with the default server.properties file:

/opt/kafka/bin/kafka-server-start.sh -daemon /opt/kafka/config/server.properties
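With both daemons running, the broker should be listening on port 9092. As an optional sanity check, you can create a test topic, publish a message, and read it back with the console tools that ship with Kafka (the topic name here is arbitrary). If remote hosts later have trouble connecting, you may also need to set advertised.listeners in /opt/kafka/config/server.properties to the server's IP address.

# Create a single-partition test topic (the topic tool talks to ZooKeeper in Kafka 2.0)
/opt/kafka/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
# Publish one message to it
echo "hello" | /opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
# Read it back (Ctrl-C to exit)
/opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning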

ELK

As root (su root), run the following commands to install Elasticsearch, Logstash, Kibana (ELK), and Osquery. This is performed with a Chef script that downloads, installs, and configures everything you need to create a data collector and streamer. Osquery will collect the logs from this server itself and send them to Logstash (localhost) automatically.

Paste these commands into a root shell to download and execute the script (tip: double-click a line to highlight it):

yum install curl -y
curl -L https://raw.githubusercontent.com/champtc/Infrastructure/master/chef/install-elk-amq-osquery.bash | bash

You can review the contents of the script at the URL above before piping it to bash.

Note this process will take several minutes, and may pause a few times. It will be finished when you see "Chef Client finished" and you see a new prompt. You may also see messages about "no config file found" - these are normal.

When the script is complete, reboot the system. The services you just installed will be running and ready for computers to send events to (see the next section). You will need to change the Kafka input and output configuration files located in /etc/logstash/conf.d/. Once you have edited those files to fit your network, look through the other files in that directory to see if there are other changes you want to make. See the Data Feed section of this page to start using your data in DarkLight.
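After the reboot, and any time you edit the files in /etc/logstash/conf.d/, the following checks can save you a debugging session. They assume the package defaults (Elasticsearch answering on port 9200, Logstash installed under /usr/share/logstash); adjust if your layout differs.

# Confirm the stack came up
systemctl status elasticsearch logstash kibana
# Elasticsearch should answer with a JSON banner on its default port
curl http://localhost:9200
# Validate the pipeline configuration, then restart Logstash to apply your edits
/usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit
systemctl restart logstash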

Running ELK on AWS

ELK is also optimized for clusters, and can be run in a cloud-based system. Here's a great tutorial for getting set up on Amazon Web Services (AWS):

http://chrissimpson.co.uk/using-elasticsearch-on-amazon-ec2.html

Managing Your Server

Defaults:

Kibana:

  • User: kibanaadmin
  • PW: password

Change Password:

Kibana: htpasswd /etc/nginx/htpasswd.users kibanaadmin

  • NOTE: Reboot required.

Create New User:

Kibana: htpasswd /etc/nginx/htpasswd.users NEW_USERNAME_HERE (replace "NEW_USERNAME_HERE" with your new username; see the example below)

  • NOTE: Reboot required.
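As an example, here is the full sequence for adding a hypothetical user named analyst. Note that htpasswd is provided by the httpd-tools package, and while the notes above call for a reboot, restarting nginx is usually enough in our experience.

# htpasswd comes from httpd-tools
yum install httpd-tools -y
# Add (or update) the user in the file nginx uses for basic auth
htpasswd /etc/nginx/htpasswd.users analyst
# Apply the change (or reboot, per the notes above)
systemctl restart nginx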

Winlogbeat (Windows)

Winlogbeat is a "shipper" that can be installed on each Windows host in an enterprise. It forwards data from the Windows event logs into the pipeline (via the first Kafka server) for the Logstash service on the ELK server to process.

Prerequisites:

  • Windows OS
  • Internet access
  • PowerShell
  • Network connectivity from the Windows host to the ELK/AMQ server

Installation:

  1. Download the following PowerShell script (installWinlogbeat.ps1, listed below).
  2. On each Windows host, open a PowerShell console as administrator (Start → "powershell" → right-click → Run as Administrator).
  3. Replace xxx.xxx.xxx.xxx with the IP address of your first (data pipe) Kafka server and execute this command: powershell -ExecutionPolicy UnRestricted -File C:\path\to\your\edited\script\installWinlogbeat.ps1 -ip xxx.xxx.xxx.xxx
installWinlogbeat.ps1
param (
   [string]$ip
)
 
# stop and delete service if it exists
if (Get-Service winlogbeat -ErrorAction SilentlyContinue) {
    $service = Get-WmiObject -Class Win32_Service -Filter "name='winlogbeat'"
    Stop-Service winlogbeat
    $service.delete()
}
 
#wait a few seconds to let the service release the folders
Start-Sleep -s 5
 
#download zip file
$zip = $env:TEMP+"\winlogbeat-6.2.3-windows-x86_64.zip"
(New-Object System.Net.WebClient).DownloadFile("https://artifacts.elastic.co/downloads/beats/winlogbeat/winlogbeat-6.2.3-windows-x86_64.zip", $zip)
 
#unzip it
$tmpzip = $env:TEMP+"\winlogbeat"
Add-Type -assembly "system.io.compression.filesystem"
[io.compression.zipfile]::ExtractToDirectory($zip, $tmpzip)
 
if (!(Test-Path -Path 'C:\Program Files\winlogbeat\')) {
    New-Item -ItemType directory -Path 'C:\Program Files\winlogbeat\'
} else {
    # keep a copy of the previous install in winlogbeat_old
    if (Test-Path -Path 'C:\Program Files\winlogbeat_old\') {
        Remove-Item -Path 'C:\Program Files\winlogbeat_old\' -recurse
    }
    New-Item -ItemType directory -Path 'C:\Program Files\winlogbeat_old\'
    Move-Item -Path 'C:\Program Files\winlogbeat\*.*' -Destination 'C:\Program Files\winlogbeat_old\'
}
 
#copy to program files
Copy-Item -Path $tmpzip\winlogbeat-6.2.3-windows-x86_64\*.* -Destination 'C:\Program Files\winlogbeat\'
 
#download the yml file:
$ymlFile = "C:\Program Files\winlogbeat\winlogbeat.yml"
$TEMPymlFile = $env:TEMP+"\winlogbeat.yml"
$ymlDownload = "https://raw.githubusercontent.com/champtc/Infrastructure/master/winlogbeat/winlogbeat.yml"
(New-Object System.Net.WebClient).DownloadFile($ymlDownload, $TEMPymlFile)
 
(Get-Content $TEMPymlFile).replace('XXX.XXX.XXX.XXX', $ip) | Set-Content $ymlFile
 
#cleanup
Remove-Item -Path $tmpzip -recurse
Remove-Item -Path $TEMPymlFile
Remove-Item -Path $zip
 
 
#Install and start the service
PowerShell.exe -ExecutionPolicy UnRestricted -File 'C:\Program Files\winlogbeat\install-service-winlogbeat.ps1'
start-service winlogbeat

Osquery and Filebeat (Linux)

For Linux hosts, we use Osquery, syslog, and Filebeat. Filebeat is the "shipper" that can be installed on each Linux host in an enterprise. It forwards data from the Linux event logs to the Logstash service on the ELK server.

To install, replace "xxx.xxx.xxx.xxx" with the IP address of your ELK box and run the following command on your Linux host:

curl -L https://raw.githubusercontent.com/champtc/Infrastructure/master/osquery/linux-etc/install-osquery.bash | bash -s xxx.xxx.xxx.xxx

You can review the contents of the script at the URL above before piping it to bash.
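Once the script finishes, you can confirm the agents are up on the host. The service names below are the standard ones from the osquery and Filebeat packages; the journal check is just one convenient way to spot connection errors.

# Both agents should be active
systemctl status osqueryd filebeat
# Look for recent Filebeat errors (e.g. failure to reach the ELK server)
journalctl -u filebeat --no-pager -n 50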

Once your servers are up and running, you can use them as the source of a Data Feed in DarkLight (see: Data Feeds: Getting Data into DarkLight):

  • Connection Type: Kafka
  • Servers: xxx.xxx.xxx.xxx:9092 (use the IP address of your second, "Kafka out" server)
  • Subscription: e.g. logstash.wineventlog (use a topic from your Kafka out server; to list topics, run the following on that server: /opt/kafka/bin/kafka-topics.sh --list --zookeeper localhost:2181). A quick way to preview a topic is shown after this list.
  • GroupID: a unique set of characters
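Before wiring a topic into DarkLight, you can preview what it carries directly on the Kafka out server. The topic name here is just an example; substitute one from the list command above.

# Print the first five messages from a topic and exit
/opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic logstash.wineventlog --from-beginning --max-messages 5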

If you want to use a secure SSL connection to Kafka from DarkLight, see the note on the Certificate Manager page.
