Installing ELK and Beats on Linux and Windows

A Cyber Security Research Dumpsite


In this tutorial I aim to clarify how to install ELK on Linux (Ubuntu 18.04) and its Beats on Windows. I will also provide a configuration for each of the installations we make.

Notes prior to starting:

Commands will look like this:

  • Example

Configuration files will be within a grey box in orange, for example:

  • Hello
  • This is an example

When installing ELK, I will provide two methods of installation: through the Elastic PGP key and APT repository, and manually from the .deb packages. Personally, only the manual install worked for me; however, to avoid any issues, I will provide both ways of installing. I will also link the official documentation in case there are any questions.

Warning: Filebeat

Be careful when running Filebeat: it will ship all the logs stored under the winevt\Logs directory and can easily fill up all of the disk space. For example, I had 40 GB of free space; Filebeat loaded 8 million logs and Logstash loaded another 12 million. I ran out of space and had to stop Filebeat, delete all the logs and increase the disk space. When loading Filebeat, consider shipping only a certain log or period of time.

To do this, the Filebeat configuration file later in this tutorial specifies a path that loads all logs under winevt\Logs by using a * wildcard:

  • - C:\Windows\System32\winevt\Logs\*

Instead of using *, use just the name of the log you would like shipped:

  • - C:\Windows\System32\winevt\Logs\Security

Or

  • - C:\Windows\System32\winevt\Logs\Application


I am assuming you are using either Debian or Ubuntu. If you are on Ubuntu (or any Ubuntu flavour), download the Debian packages, as those are the ones that work on it. In this report I have used Lubuntu, a lighter version of Ubuntu.

Before installing anything, open up PuTTY, connect to YOUR_IP_HERE on port 22 and log in (in my lab the credentials were root1:root@1234).

Once you’re logged in on PuTTY, enter:

sudo -s

Type the password, and you should now be root.

Installing Java

Elasticsearch requires at least Java 8. Specifically, as of this writing, it is recommended that you use the Oracle JDK version 1.8.0_131.

This is the command to install Java 8:

sudo apt install openjdk-8-jre

Check that you have the right version by running:

java -version

It should display the installed Java version (for OpenJDK 8, a line beginning with openjdk version "1.8.0_...").

Opening Ports:

It is important to open the ports we will need, so we don’t run into any problems later.

On Linux run the following command:

ufw status

That command verifies whether the firewall is active; if it’s not, execute the following command:

ufw enable

Running ufw status again should now list the firewall state and the open ports.


Initially you won’t have these ports open. To open one, run the following command:

ufw allow PORT

The ports you need open are:

OpenSSH (22), 80, 443, 5400, 5601 (Kibana), 9200 (Elasticsearch), 5044 (Beats input), 5000 (syslog input)

So, the command would look like this:

ufw allow 5601
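Rather than typing one allow rule at a time, the whole list above can be opened in a single loop. This is a sketch: it only prints the commands, so nothing changes until you drop the `echo` and run it as root.

```shell
# Print (dry-run) one `ufw allow` rule per required port; drop the
# `echo` to actually apply the rules as root.
open_ports() {
  for port in OpenSSH 80 443 5400 5601 9200 5044 5000; do
    echo ufw allow "$port"
  done
}
open_ports
```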

Installing Elasticsearch

Link: https://www.elastic.co/guide/en/elasticsearch/reference/current/deb.html

Installation by importing the Elasticsearch PGP key:

Download and install the public signing key:

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

If that returns:

gpg: no valid OpenPGP data found.

Try downloading the key to a file first:

wget https://artifacts.elastic.co/GPG-KEY-elasticsearch

then adding it:

sudo apt-key add GPG-KEY-elasticsearch

If this works, you don’t have to add the key again, because it will already be on the system; once the repository definition below is in place, you can install the rest of the stack with:

apt-get install kibana

apt-get install logstash

You may need to install the apt-transport-https package on Debian before proceeding:

sudo apt-get install apt-transport-https

Save the repository definition to /etc/apt/sources.list.d/elastic-6.x.list:

echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list

After that, you can install the Elasticsearch Debian package with:

sudo apt-get update && sudo apt-get install elasticsearch

Downloading & Installing Elasticsearch package manually

The Debian package for Elasticsearch v6.3.1 can be downloaded from the website and installed as follows:

wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.3.1.deb

Then install it with dpkg:

sudo dpkg -i elasticsearch-6.3.1.deb

Running Elasticsearch with systemd

To configure Elasticsearch to start automatically when the system boots up, run the following commands:

sudo /bin/systemctl daemon-reload

sudo /bin/systemctl enable elasticsearch.service

Elasticsearch can be started and stopped as follows:

sudo systemctl start elasticsearch.service

sudo systemctl stop elasticsearch.service

These commands will provide no feedback as to whether Elasticsearch was started successfully or not.

Run this command to check if Elasticsearch is running:

curl -X GET "YOUR_IP_HERE:9200/"

and it should display this:

{
  "name" : "Cp8oag6",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "AT69_T_DTp-1qgIJlatQqA",
  "version" : {
    "number" : "6.3.1",
    "build_flavor" : "default",
    "build_type" : "zip",
    "build_hash" : "f27399d",
    "build_date" : "2016-03-30T09:51:41.449Z",
    "build_snapshot" : false,
    "lucene_version" : "7.3.1",
    "minimum_wire_compatibility_version" : "1.2.3",
    "minimum_index_compatibility_version" : "1.2.3"
  },
  "tagline" : "You Know, for Search"
}

That should be it. Elasticsearch should be installed now.
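Elasticsearch can take a little while to answer after being started, so a check straight after systemctl start may fail even though everything is fine. A small polling helper (a sketch, not part of the Elastic tooling) avoids racing it; replace `true` in the last line with the real curl check shown in the comment.

```shell
# Poll a command until it succeeds, up to 30 tries one second apart.
# Replace `true` in the call below with the real check, e.g.:
#   curl -s "http://localhost:9200/" >/dev/null
wait_for() {
  tries=0
  until "$@"; do
    tries=$((tries + 1))
    [ "$tries" -ge 30 ] && return 1
    sleep 1
  done
}
wait_for true && echo "service is up"
```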

Configuring the Elasticsearch File

Firstly, navigate to the Elasticsearch directory:

cd /etc/elasticsearch

The configuration file is called elasticsearch.yml. Open elasticsearch.yml with:

nano elasticsearch.yml

Make sure that you have these configurations:

# ---------------------------------- Paths ------------------------------------

path.data: /var/lib/elasticsearch

path.logs: /var/log/elasticsearch

# ---------------------------------- Network -----------------------------------

network.host: 0.0.0.0

http.port: 9200

To save the file press CTRL + O, then exit nano with CTRL + X.

Installing Kibana

Link: https://www.elastic.co/guide/en/kibana/current/install.html

Download and install the public signing key:

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

Again, if that doesn’t work, download the key to a file:

wget https://artifacts.elastic.co/GPG-KEY-elasticsearch

then add it:

sudo apt-key add GPG-KEY-elasticsearch

Installing from the APT repository

You may need to install the apt-transport-https package on Debian before proceeding:

sudo apt-get install apt-transport-https

Save the repository definition to /etc/apt/sources.list.d/elastic-6.x.list:

echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list

You can install the Kibana Debian package with:

sudo apt-get update && sudo apt-get install kibana

Download and install the Debian package manually

The Debian package for Kibana v6.3.1 can be downloaded from the website and installed as follows:

wget https://artifacts.elastic.co/downloads/kibana/kibana-6.3.1-amd64.deb

then install it:

sudo dpkg -i kibana-6.3.1-amd64.deb

Running Kibana with systemd

To configure Kibana to start automatically when the system boots up, run the following commands:

sudo /bin/systemctl daemon-reload

sudo /bin/systemctl enable kibana.service

Kibana can be started and stopped as follows:

sudo systemctl start kibana.service

sudo systemctl stop kibana.service

These commands provide no feedback as to whether Kibana was started successfully or not.

Configuring Kibana

Firstly, navigate to the Kibana directory:

cd /etc/kibana

The configuration file is called kibana.yml. Open kibana.yml with:

nano kibana.yml

Make sure that you have these configurations:

# Kibana is served by a back-end server. This setting specifies the port to use.

server.port: 5601

server.host: 0.0.0.0

# The URL of the Elasticsearch instance to use for all your queries.

elasticsearch.url: "http://localhost:9200"

To save the file press CTRL + O, then exit nano with CTRL + X.

Check if Kibana is working

Open a web browser such as Google Chrome and navigate to Kibana:

YOUR_IP_HERE:5601

Kibana should load.

Now we need to get logs into it.

Installing Logstash

Link: https://www.elastic.co/guide/en/logstash/current/installing-logstash.html

Installing from the APT repository

Download and install the Public Signing Key:

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

As always, if that doesn’t work, download the key to a file:

wget https://artifacts.elastic.co/GPG-KEY-elasticsearch

then add it:

sudo apt-key add GPG-KEY-elasticsearch

sudo apt-get install apt-transport-https

Save the repository definition to /etc/apt/sources.list.d/elastic-6.x.list:

echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list

Run sudo apt-get update so the repository definition is picked up, then install Logstash with:

sudo apt-get update && sudo apt-get install logstash

Downloading & Installing Logstash Manually

The Debian package for Logstash v6.3.1 can be downloaded from the website and installed as follows:

wget https://artifacts.elastic.co/downloads/logstash/logstash-6.3.1.deb

then install it:

sudo dpkg -i logstash-6.3.1.deb

Configuring Logstash

Firstly, navigate to the Logstash directory:

cd /etc/logstash

The configuration file is called logstash.yml. Open logstash.yml with:

nano logstash.yml

Make sure that you have these configurations:

# ------------ Data path ------------------

path.data: /var/lib/logstash

# ------------ Debugging Settings --------------

path.logs: /var/log/logstash

To save the file press CTRL + O, then exit nano with CTRL + X.

Now, in the same directory (/etc/logstash), open pipelines.yml and make sure it has these configurations:

- pipeline.id: mypipeline_1

  path.config: "/etc/logstash/conf.d/*.conf"

To save the file press CTRL + O, then exit nano with CTRL + X.

Now navigate to the conf.d directory within /etc/logstash.

cd conf.d

 


Here we will create several configuration files; each works together with the others to process logs.

  • logstash-beats.conf is needed to receive input from the Beats.
  • logstash-apache.conf is needed if you want to process Apache logs.
  • logstash-syslog.conf is needed if you want to process syslog.
  • demo-metrics-pipeline.conf is needed to process Metricbeat data.

So, create a file to get Beats input:

nano logstash-beats.conf

And input these configurations into the file:

input {

  beats {

    port => 5044

  }

}

output {

  elasticsearch {

    hosts => "localhost:9200"

    manage_template => false

    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"

  }

}

To save the file press CTRL + O, then exit nano with CTRL + X.
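The index line above is built from event metadata, so each Beat and each day gets its own index. As an illustration (the beat name and version values here are just examples), this is how the pattern resolves:

```shell
# How %{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}
# resolves; beat and version are illustrative values.
beat=filebeat
version=6.3.1
day=$(date -u +%Y.%m.%d)
echo "${beat}-${version}-${day}"
```

For a Filebeat 6.3.1 event this yields an index name like filebeat-6.3.1-2018.07.18, which is what the Kibana index patterns later match against.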

Now, to process Apache logs, create an empty file:

nano logstash-apache.conf

And input these configurations into the file:

input {

  file {

    path => "/var/log/logstash/*_log"

  }

}

filter {

  if [path] =~ "access" {

    mutate { replace => { type => "apache_access" } }

    grok {

      match => { "message" => "%{COMBINEDAPACHELOG}" }

    }

    date {

      match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]

    }

  } else if [path] =~ "error" {

    mutate { replace => { type => "apache_error" } }

  } else {

    mutate { replace => { type => "random_logs" } }

  }

}

output {

  elasticsearch { hosts => ["localhost:9200"] }

  stdout { codec => rubydebug }

}

To save the file press CTRL + O, then exit nano with CTRL + X.

To process syslog, create an empty file:

nano logstash-syslog.conf

Input this configuration into the file:

input {

  tcp {

    port => 5000

    type => syslog

  }

}

filter {

  if [type] == "syslog" {

    grok {

      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }

      add_field => [ "received_at", "%{@timestamp}" ]

      add_field => [ "received_from", "%{host}" ]

    }

    date {

      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]

    }

  }

}

output {

  elasticsearch { hosts => ["localhost:9200"] }

  stdout { codec => rubydebug }

}

To save the file press CTRL + O, then exit nano with CTRL + X.

Now create a configuration file for Metrics (we will use this later on):

nano demo-metrics-pipeline.conf

Input these configurations:

input {

  beats {

    port => 5044

  }

}

output {

  elasticsearch {

    hosts => "localhost:9200"

    manage_template => false

    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"

  }

}

To save the file press CTRL + O, then exit nano with CTRL + X.

Now create a simple configuration file:

nano logstash-simple.conf

Input these configurations:

input { stdin { } }

filter {

  grok {

    match => { "message" => "%{COMBINEDAPACHELOG}" }

  }

  date {

    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]

  }

}

output {

  elasticsearch { hosts => ["localhost:9200"] }

  stdout { codec => rubydebug }

}

To save the file press CTRL + O, then exit nano with CTRL + X.
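You can sanity-check the grok filter in logstash-simple.conf by pasting a sample Apache combined-log line into the stdin pipeline. The line below is synthetic test data; the commented command is the real run and needs Logstash installed.

```shell
# A synthetic Apache combined-format log line for testing the grok filter.
# Real run (requires Logstash):
#   echo "$line" | /usr/share/logstash/bin/logstash -f logstash-simple.conf
line='127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326 "-" "Mozilla/4.0"'
echo "$line"
```

If the filter works, the rubydebug output shows the line broken into fields such as clientip, verb and response.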

You can run a configuration test to see if everything is working correctly:

/usr/share/logstash/bin/logstash -f INPUT_CONFIG_FILE_HERE.conf --config.test_and_exit

Replace INPUT_CONFIG_FILE_HERE with the configuration file name; the test should output Configuration OK.

Note: These filters in the configuration can be edited and enhanced to suit the business needs.
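With several files in conf.d it is easy to miss one, so the check can be looped over the whole directory. This is a sketch: the loop only prints what it would check, and the commented line is the real validation (it assumes the standard /usr/share/logstash install path used above).

```shell
# Dry-run a config check over every pipeline file in a directory.
# Uncomment the logstash line to perform the real validation.
check_all() {
  for f in "$1"/*.conf; do
    [ -e "$f" ] || { echo "no .conf files in $1"; return 1; }
    echo "would check: $f"
    # /usr/share/logstash/bin/logstash -f "$f" --config.test_and_exit
  done
}
mkdir -p /tmp/conf-demo && : > /tmp/conf-demo/example.conf
check_all /tmp/conf-demo
```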

You can start logstash by running this command:

service logstash start

Checking if they are running:

Run these commands to check if the services are up and running:

  • Check Elasticsearch:

service elasticsearch status

The output should show the service as active (running).

  • Check Kibana:

service kibana status

The output should show the service as active (running).

  • Check Logstash:

service logstash status

The output should show the service as active (running).

What are Beats?

In short: Beats is the platform for single-purpose data shippers.

More information on: https://www.elastic.co/products/beats

 


For now, we will be installing:

  • Filebeat
  • Metricbeat
  • Winlogbeat

Beats can be installed based on the organisation’s needs, and they are easy to deploy.

Beats are great for gathering data. They sit on your servers and centralize data in Elasticsearch. And if you want more processing muscle, Beats can also ship to Logstash for transformation and parsing.

Therefore, we will be connecting our Beats with Logstash.

Installing Filebeat

Link: https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-installation.html

Filebeat is used to forward and centralize log data. Filebeat monitors the log files or locations that you specify, collects log events, and forwards them to either Elasticsearch or Logstash for indexing. We will be forwarding our logs to Logstash.

We will be using Filebeat to collect the logs under the winevt directory; therefore, we are installing Filebeat on Windows.

Head over to the following link:

https://www.elastic.co/downloads/beats/filebeat

Download the Windows 64-bit version and extract all the files wherever you wish to have Filebeat installed.

Configuring Filebeat

Now, navigate to the Filebeat directory. Right-click filebeat.yml and open it with Notepad or Notepad++.

Copy and paste the following configuration into the file:

#=========================== Filebeat inputs 

filebeat.inputs:

- type: log

  enabled: true

  paths:

    - C:\Windows\System32\winevt\Logs\*

#============================= Filebeat modules 

filebeat.config.modules:

  path: ${path.config}/modules.d/*.yml

  reload.enabled: false

#==================== Elasticsearch template setting 

setup.template.settings:

  index.number_of_shards: 3

#============================== Dashboards 

setup.dashboards.enabled: true

#============================== Kibana 

setup.kibana:

  host: "YOUR_IP_HERE:5601"

#----------------------------- Logstash output ---------------

output.logstash:

  hosts: ["YOUR_IP_HERE:5044"]     

Save and quit the file

Installing:

Open PowerShell as Admin:

PowerShell.exe -ExecutionPolicy UnRestricted -File .\install-service-filebeat.ps1

Loading the Index Template for Elasticsearch (Manually)

In Elasticsearch, index templates are used to define settings and mappings that determine how fields should be analysed.

Make sure you are in the directory where you installed Filebeat.

Run this command to load the index template into Elasticsearch:

PowerShell .\filebeat setup --template -E output.logstash.enabled=false -E 'output.elasticsearch.hosts=["YOUR_IP_HERE:9200"]'

If that doesn’t work, try this other method:

Download the template:

PowerShell .\filebeat.exe export template --es.version 6.3.1 | Out-File -Encoding UTF8 filebeat.template.json

Then install the template:

PowerShell Invoke-RestMethod -Method Put -ContentType "application/json" -InFile filebeat.template.json -Uri http://YOUR_IP_HERE:9200/_template/filebeat-6.3.1
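You can verify that the template actually reached Elasticsearch by requesting it back from the Linux host. This sketch only prints the request; the commented curl line performs the real check (YOUR_IP_HERE stays a placeholder for the ELK host).

```shell
# Build the template URL and print the check; uncomment the curl line
# to run it against a live cluster.
url="http://YOUR_IP_HERE:9200/_template/filebeat-6.3.1"
# curl -s "$url" | grep -q filebeat && echo "template present"
echo "GET $url"
```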

That should be it; now let’s start Filebeat.

Starting Filebeat:

To start Filebeat run:

Start-Service filebeat

Installing Metricbeat: *Not Necessary

Link: https://www.elastic.co/guide/en/beats/metricbeat/current/metricbeat-getting-started.html

MetricBeat is a lightweight shipper that you can install on your servers to periodically collect metrics from the operating system and from services running on the server. MetricBeat takes the metrics and statistics that it collects and ships them to the output that you specify, such as Elasticsearch or Logstash.

To begin with, the following command will download MetricBeat from elastic:

curl -L -O https://artifacts.elastic.co/downloads/beats/metricbeat/metricbeat-6.3.1-amd64.deb

Then install the package:

sudo dpkg -i metricbeat-6.3.1-amd64.deb

Configuring MetricBeat

 


Now, navigate to the MetricBeat configuration directory (/etc/metricbeat for the Debian package):

cd /etc/metricbeat

Run the following command to edit the file:

nano metricbeat.yml

Copy and paste the following configuration into the file:

#========================== Modules configuration 

metricbeat.config.modules:

  path: ${path.config}/modules.d/*.yml

  reload.enabled: false

#==================== Elasticsearch template setting 

setup.template.settings:

  index.number_of_shards: 1

  index.codec: best_compression

#================================ General

setup.dashboards.enabled: true

#============================== Kibana 

setup.kibana:

  host: "YOUR_IP_HERE:5601"

#----------------------------- Logstash output ---------------

output.logstash:

  hosts: ["YOUR_IP_HERE:5044"]

To save the file press CTRL + O, then exit nano with CTRL + X.

Installing Winlogbeat:

Link: https://www.elastic.co/guide/en/beats/winlogbeat/current/index.html

What is Winlogbeat?

Winlogbeat ships Windows event logs to Elasticsearch or Logstash. You can install it as a Windows service.

Winlogbeat reads from one or more event logs using Windows APIs, filters the events based on user-configured criteria, then sends the event data to the configured outputs (Elasticsearch or Logstash). Winlogbeat watches the event logs so that new event data is sent in a timely manner. The read position for each event log is persisted to disk to allow Winlogbeat to resume after restarts.

Winlogbeat can capture event data from any event logs running on your system. For example, you can capture events such as:

  • application events
  • security events
  • system events

Therefore, this Beat has to be downloaded on a Windows OS. Go to the environment you would like to collect event logs from and download Winlogbeat from the link below:

https://www.elastic.co/downloads/beats/winlogbeat

Download the Windows 64-bit version and unzip the file.

After unzipping, rename the extracted folder to winlogbeat.

Open a PowerShell prompt as an Administrator (right-click the PowerShell icon and select Run As Administrator). Then navigate to the Winlogbeat directory; for me it was under C:\Software, so I ran this command:

cd C:\Software\winlogbeat\

 


If script execution is disabled on your system, you need to set the execution policy for the current session to allow the script to run. From the PowerShell prompt, run the following command to install the service:

PowerShell.exe -ExecutionPolicy UnRestricted -File .\install-service-winlogbeat.ps1

Note: Don’t start the service yet, we need to edit the configuration file first.

Configuring Winlogbeat:

Go to the winlogbeat directory, look for winlogbeat.yml and open it with notepad or notepad++.

Make sure the configuration file looks like this:

#======================= Winlogbeat specific options 

winlogbeat.event_logs:

  - name: Application

    ignore_older: 72h

  - name: Security

  - name: System

  - name: ForwardedEvents

#==================== Elasticsearch template setting 

setup.template.settings:

  index.number_of_shards: 3

#============================== Dashboards 

setup.dashboards.enabled: true

#============================== Kibana 

setup.kibana:

  host: "YOUR_IP_HERE:5601"

#----------------------------- Logstash output ---------------

output.logstash:

  hosts: ["YOUR_IP_HERE:5044"]

#----------------------------- Logging -----------------------

logging:

  to_files: true

  files:

    path: C:/Software/winlogbeat/LogsWin

  level: info

Save and exit the file.

This is a good link for more information to adding events: https://www.elastic.co/guide/en/beats/winlogbeat/current/configuration-winlogbeat-options.html

Test the config file on PowerShell:

.\winlogbeat.exe test config -c .\winlogbeat.yml -e

Now we have to load the index template into Elasticsearch.

In Elasticsearch, index templates are used to define settings and mappings that determine how fields should be analysed.

On PowerShell, make sure you are on the Winlogbeat directory. Then, run this command:

PowerShell.exe .\winlogbeat setup --template -E output.logstash.enabled=false -E 'output.elasticsearch.hosts=["YOUR_IP_HERE:9200"]'

If successful, you should see “Loaded index template”.

For this command to run successfully, a connection to Elasticsearch is required.

If you don’t have a connection, then try exporting the index template:

PowerShell.exe .\winlogbeat.exe export template --es.version 6.3.1 | Out-File -Encoding UTF8 winlogbeat.template.json

Then run this command to install the template:

PowerShell.exe Invoke-RestMethod -Method Put -ContentType "application/json" -InFile winlogbeat.template.json -Uri http://YOUR_IP_HERE:9200/_template/winlogbeat-6.3.1

You should now be able to successfully run winlogbeat.

Run PowerShell, and make sure that you are on the Winlogbeat directory. Then run:

Start-Service winlogbeat

Deleting Logs:

I just ran into an issue:

2018-07-18T09:32:27.192+0100   ERROR      instance/beat.go:691     Exiting: Error importing Kibana dashboards: fail to import the dashboards in Kibana: Error importing directory C:\software\winlogbeat\kibana: Failed to import index-pattern: Failed to load directory C:\software\winlogbeat\kibana/6/index-pattern:

  error loading C:\software\winlogbeat\kibana\6\index-pattern\winlogbeat.json: blocked by: [FORBIDDEN/12/index read-only / allow delete (api)];. Response: {"objects":[{"id":"winlogbeat-*","type":"index-pattern","error":{"message":"blocked by: [FORBIDDEN/12/index read-only / allow delete (api)];"}}]}

You can get this issue for a number of reasons, but in my case it was because I ran out of disk space, and Elasticsearch had therefore switched the index to read-only.

To avoid running into this issue often, either increase the disk space or change the configuration so that it doesn’t ship logs older than a certain period. For example, in the Winlogbeat configuration I chose not to ship Application logs older than 72 hours.

To free disk space so that you can start logging again, you can delete the indices for winlogbeat, logstash, metricbeat or filebeat by running the following command:

PowerShell.exe Invoke-RestMethod -Method Delete "http://YOUR_IP_HERE:9200/winlogbeat-*"

If you want to delete another index besides winlogbeat, just change the name, as I’ve done below:

PowerShell.exe Invoke-RestMethod -Method Delete http://YOUR_IP_HERE:9200/logstash-*

After deleting the logs, go to Kibana, open Dev Tools, and run these commands:

PUT _settings
{
  "index": {
    "blocks": {
      "read_only_allow_delete": "false"
    }
  }
}

PUT YOUR_INDEX_NAME/_settings
{
  "index": {
    "blocks": {
      "read_only_allow_delete": "false"
    }
  }
}

The issue should now be resolved.
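The same unlock can also be issued from the Linux host with curl instead of Dev Tools. This is a sketch: only the JSON body is printed here, and the commented curl line is the real call (it assumes Elasticsearch is reachable on localhost:9200).

```shell
# JSON body that clears the read-only block; the commented curl line
# applies it to all indices (assumes Elasticsearch on localhost:9200).
body='{"index": {"blocks": {"read_only_allow_delete": "false"}}}'
# curl -X PUT "http://localhost:9200/_settings" \
#   -H 'Content-Type: application/json' -d "$body"
echo "$body"
```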


Liked this tutorial on Installing ELK + Beats on Linux + Windows?

Leave a comment with your thoughts!

After installing ELK, you then have to learn how to operate the Kibana web interface. While you are here, why not check my posts on how to do so? The links below go into depth on Discover, Visualization and Dashboard.

Want quick definitions of each section of Kibana? Click below:

Helpful Links: