Understand the Elastic Stack in under 5 minutes

source: www.elastic.co/

*At the end of this blog, you will be able to spin up a basic Elastic Stack node.

What problem does the Elastic Stack solve?

Suppose you have a complex microservices architecture and you need to collect logs from different nodes and correlate them for monitoring and analysis.

In this case, you would need to write an entirely separate application that collects logs from the different nodes, transforms them into a single understandable format, and pushes them to a frontend for visualization.

That’s where the Elastic Stack comes to save your life.

In the early days, Elasticsearch was used only to store, search, and retrieve data, but it has evolved continuously and can now be used in many roles, such as:

1. As a multitenant-capable full-text search engine

2. To analyze metrics and logs from multiple nodes

3. As an open-source SIEM solution

4. To monitor application performance

5. To store and index huge volumes of data

Ingredients of the Elastic Stack:

Elasticsearch, Logstash, and Kibana make up the ELK Stack, but with the addition of Beats it is now called the Elastic Stack.

Let’s dissect each component.

Elasticsearch:

“Elasticsearch is a distributed, RESTful search and analytics engine that centrally stores your data. Data inside Elasticsearch is stored as documents, represented in JSON format. Every document has a unique ID.”

In simple words, Elasticsearch is a search engine that stores and indexes data and makes it available for near real-time retrieval.

The reason for its fast performance is the underlying architecture: an index is split into shards, and each shard is a self-contained Apache Lucene index.
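
To make the document model concrete, here is a minimal sketch of indexing and fetching a document over Elasticsearch’s REST API. It assumes Elasticsearch is listening locally on port 9200; the index name app-logs and the field values are placeholders for illustration only.

# index a JSON document with ID 1 into the app-logs index
curl -X PUT "http://localhost:9200/app-logs/_doc/1" -H "Content-Type: application/json" -d '{"service": "checkout", "level": "ERROR", "message": "payment gateway timeout"}'

# fetch the same document back by its ID
curl -X GET "http://localhost:9200/app-logs/_doc/1"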

Logstash:

Logs generated by a firewall have a different format than logs generated by an Active Directory (AD) server. The problem is that we need to transform data from multiple sources into a single format. Here, Logstash is the problem solver.

Simply put, it’s a stash for logs. Technically, it refines data coming from different sources into a single format for ingestion into Elasticsearch. Predefined matching patterns (grok patterns, for example) recognize data in multiple formats and transform it into a common key-value structure.
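
As an illustration, here is a minimal, hypothetical Logstash pipeline that accepts events from Beats, parses Apache-style access log lines with a predefined grok pattern, and ships the result to Elasticsearch. The port and host are placeholders; adjust them to your environment.

input {
  # listen for events shipped by Beats agents
  beats {
    port => 5044
  }
}

filter {
  # parse each raw log line into structured key-value fields
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  # send the structured events to Elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}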

Kibana:

All of this sounds great, but we humans understand data better in visual form. That’s where Kibana comes into the game.

Kibana is an open-source data visualization application for Elasticsearch. It sends queries to Elasticsearch, fetches the desired data, and presents it in an interactive form. In Kibana you can write queries to fetch exactly the data you need and create customized dashboards to visualize it.
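
For example, once Windows logs are flowing in (as in the walkthrough below), a query like the following in Kibana’s search bar would narrow the view to failed logon attempts. The field names assume Winlogbeat’s default schema and event ID 4625 (failed logon); adjust them to match your own data.

winlog.channel : "Security" and winlog.event_id : 4625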

Beats:

You might be wondering how you’re going to gather all of this data. Relax, Beats have got your back.

Beats are lightweight agents that collect specific kinds of data from your servers and send it to Elasticsearch or Logstash. There are multiple agents, each built to gather a particular type of data. Commonly used agents include:

Winlogbeat: used for collecting Windows event logs

Auditbeat: used for collecting Linux audit logs

APM agent: used for collecting application performance metrics

Packetbeat: used for collecting network traffic data

How to set up a basic Elastic Stack node:

Our goal here is to collect logs from one of our critical Windows servers and monitor its activity.

Step 1:
Download and install Elasticsearch and Kibana on your system:

*The default installation will work just fine

https://www.elastic.co/downloads/elasticsearch

https://www.elastic.co/downloads/kibana

Step 2: Install Winlogbeat on the server:

1>> Download the agent from here: https://www.elastic.co/downloads/beats/winlogbeat

2>> Extract the contents into C:\Program Files and rename the folder to “Winlogbeat”

3>> Open PowerShell as Administrator (installing a service requires elevated privileges)

4>> cd into the Winlogbeat directory and run the install script:

PS C:\Program Files\Winlogbeat> .\install-service-winlogbeat.ps1
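
If script execution is disabled on the machine, you may need to relax the execution policy for this single invocation, for example:

PS C:\Program Files\Winlogbeat> PowerShell.exe -ExecutionPolicy Unrestricted -File .\install-service-winlogbeat.ps1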

5>> Open the winlogbeat.yml file in your favorite text editor and set the Elasticsearch hostname and port, like this:

output.elasticsearch:
  hosts: ["myEShost:9200"]
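
Optionally, you can also tell Winlogbeat where Kibana lives in the same file, so the setup step further below can load its prebuilt dashboards. The host shown here is Kibana’s default and is just a placeholder:

setup.kibana:
  host: "localhost:5601"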

6>> Configure which event logs to send to Elasticsearch:

winlogbeat.event_logs:
  - name: Application
  - name: Security
  - name: System
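
If you only care about specific events, each entry can be narrowed down further. For example, the illustrative snippet below restricts the Security channel to successful and failed logons (event IDs 4624 and 4625):

winlogbeat.event_logs:
  - name: Security
    event_id: 4624, 4625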

7>> To test your configuration, run this command:

PS C:\Program Files\Winlogbeat> .\winlogbeat.exe test config -c .\winlogbeat.yml -e
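
You can also verify that Winlogbeat can actually reach the Elasticsearch host configured above:

PS C:\Program Files\Winlogbeat> .\winlogbeat.exe test output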

8>> Run the setup command, which loads the index template and the sample Kibana dashboards:

PS > .\winlogbeat.exe setup -e

End>> Finally, start the Winlogbeat service:

PS C:\Program Files\Winlogbeat> Start-Service winlogbeat
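
To confirm the service is actually running, you can check its status:

PS C:\Program Files\Winlogbeat> Get-Service winlogbeat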

Now, go to your Kibana dashboard; there you will see an index pattern named winlogbeat-*.
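
If you prefer to verify from the Elasticsearch side, a quick count of the Winlogbeat indices (shown here as a curl sketch against a local node) confirms that documents are arriving:

curl -X GET "http://localhost:9200/winlogbeat-*/_count"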

Congratulations! You have just set up a basic Elastic Stack node.