ELK Stack

ELK logo

ELK Stack stands for Elasticsearch, Logstash and Kibana.

Need for Log Analysis:

Each and every application generates logs, which give us an idea of how the application is performing: whether it is behaving the way it is expected to, or whether some issue is going to surface in the near future. All of this we can get to know with the help of logs.

Log analysis can be centralized, or it can be de-centralized. De-centralized means the logs are generated on each and every web server, and one has to log in to each web server to drill down and troubleshoot log issues. This is not an ideal approach, and it is also a time-consuming process.

Therefore, it is recommended to store the logs in a central place for analysis.

How Does the ELK Stack Help with Log Analysis?

Log analysis is the process of analyzing computer/machine-generated data, which includes collecting the log data, cleaning it, converting it into a structured form, analyzing it, and obtaining the result.

Logs are usually an unstructured form of data. One has to collect the logs in one place, then extract, convert, and analyze the data, and finally obtain the result.

Log analysis is needed for issue debugging, predictive analysis, security analysis, performance analysis, and the Internet of Things (IoT).
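The collect/clean/convert/analyze steps described above can be sketched in a few lines of Python. The Apache-style log line, the regex, and the field names here are illustrative assumptions for the sketch, not part of any ELK component:

```python
import re

# Illustrative Apache "common log format" line (an assumption for this sketch).
RAW_LOG = '127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326'

# Convert the unstructured line into a structured record (conceptually what
# a log-parsing stage such as Logstash's grok filter does at scale).
PATTERN = re.compile(
    r'(?P<client>\S+) \S+ (?P<user>\S+) \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<size>\d+)'
)

def parse_line(line):
    match = PATTERN.match(line)
    if match is None:
        return None  # "cleaning": drop lines that do not parse
    record = match.groupdict()
    record["status"] = int(record["status"])  # type conversion step
    record["size"] = int(record["size"])
    return record

record = parse_line(RAW_LOG)
print(record["method"], record["status"])  # analysis now works on structured fields
```

Once every line is a structured record like this, the "analysis" step becomes simple filtering and aggregation over fields rather than ad hoc text searching.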

The diagram below shows the flow of log analysis and how it works.

Fig: Process of Log Analysis

Problems with Log Analysis:

  1. Non-consistent log format:

We have different web applications, say Tomcat, an IIS server, an Apache server, etc. Each of these applications has its own syntax for writing logs; this is what we call a non-consistent log format. One has to be familiar with the particular log syntax of each application.

  2. Non-consistent time format:

Each and every application uses a different time format. For example, some log in UTC, while others use Central or Eastern time.
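The time-format problem is usually solved by normalizing every timestamp to UTC during parsing. A minimal sketch with Python's standard library; the two timestamp formats below are assumptions chosen to represent two different applications:

```python
from datetime import datetime, timezone

# The same instant, written the way two different applications might log it
# (both formats are assumptions for this sketch).
utc_log = "2024-03-01T18:30:00Z"             # an app logging in UTC
eastern_log = "01/Mar/2024:13:30:00 -0500"   # an app logging in US Eastern time

def to_utc(text, fmt):
    """Parse a timestamp in the given format and normalize it to UTC."""
    return datetime.strptime(text, fmt).astimezone(timezone.utc)

a = to_utc(utc_log, "%Y-%m-%dT%H:%M:%S%z")
b = to_utc(eastern_log, "%d/%b/%Y:%H:%M:%S %z")
print(a == b)  # True: once normalized, the two events line up on one timeline
```

In the ELK Stack this normalization typically happens in the parsing stage, so that all events share one timeline before they are indexed.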

  3. De-centralized logs:

Each and every server has its own log directories, so the logs are located in a de-centralized way. We have to log in to each and every server to troubleshoot.

  4. Expert knowledge requirement:

Not every member of the team has access to the log directories to view the logs, so reading them ends up requiring specialist knowledge.

What is ELK Stack?

The ELK Stack is a combination of three open-source tools (Elasticsearch, Logstash, and Kibana) that together form a log management platform which helps in deep searching, analyzing, and visualizing the logs generated by different machines.

Elasticsearch:

It is the tool that plays the major role of storing the logs in JSON format, indexing them, and allowing them to be searched.

Features:

  1. Search engine/search server.
  2. NoSQL database, i.e., it cannot be queried with SQL.
  3. Based on Apache Lucene and provides a RESTful API.
  4. Provides horizontal scalability, reliability, and multitenant capability for real-time search.
  5. Uses indexes to search, which makes it fast.
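The last point is the key to Elasticsearch's speed: under the hood, Apache Lucene builds an inverted index mapping each term to the documents that contain it, so a search is a lookup rather than a scan of every log. A toy in-memory version, purely illustrative and not Elasticsearch's actual implementation:

```python
from collections import defaultdict

# Toy documents standing in for indexed log entries (invented for this sketch).
docs = {
    1: "error connecting to database",
    2: "user login successful",
    3: "database timeout error",
}

# Build an inverted index: term -> set of document ids containing that term.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

# A term query is now a single dictionary lookup, not a scan of every document.
print(sorted(index["error"]))     # [1, 3]
print(sorted(index["database"]))  # [1, 3]
```

This is why searching millions of indexed log lines can return in milliseconds: the cost of a term query depends on the number of matches, not on the total volume of logs.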

Logstash:

It is an open-source tool used to collect, parse, and filter logs, such as syslog, as input.

Features:

  1. Data pipeline tool.
  2. Centralizes the data processing.
  3. Collects, parses, and analyzes a large variety of structured and unstructured data and events.
  4. Provides plugins to connect to various types of input sources and platforms.
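The plugin model is easiest to see in a minimal Logstash pipeline configuration: one input plugin, one filter plugin, one output plugin. The file path, port, and host below are assumptions for the sketch, not required values:

```conf
# logstash.conf (illustrative; the path and host are assumptions)
input {
  file {
    path => "/var/log/apache2/access.log"   # read log lines from a file
    start_position => "beginning"
  }
}
filter {
  grok {
    # Parse Apache access-log lines into structured fields.
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]      # ship parsed events to Elasticsearch
  }
}
```

Swapping the `file` input for `beats`, `syslog`, or `kafka` is a one-block change, which is what makes Logstash a general-purpose data pipeline rather than a single-source log shipper.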

Kibana:

It is a web interface that allows us to search, display, and compile the data. It is responsible for presenting the data in visual form in the user interface, and it helps in designing charts, bar graphs, reports, etc. It is a graphical tool.

Features:

  1. Visualization tool
  2. Provides real-time analysis, summarization, charting and debugging capabilities.
  3. Provides an intuitive and user-friendly interface.
  4. Allows sharing snapshots of the logs that were searched through.
  5. Permits saving the dashboard and managing multiple dashboards.

How Does the ELK Stack Work?

There are servers maintaining their own logs in their own directories. The ELK Stack collects these logs into a central place, pulling them out of the servers with the help of Logstash. Elasticsearch then works on the data in the pipeline that Logstash has collected, using search and analysis to index the data into useful information.

Later, Kibana presents that same indexed data in the form of charts and graphs.
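Kibana (or any HTTP client) retrieves that indexed data through Elasticsearch's RESTful API. A request body like the following, sent to an index's `_search` endpoint, is roughly what a Kibana dashboard filter translates into; the field names (`status`, `@timestamp`) are assumptions for the sketch:

```json
{
  "query": {
    "bool": {
      "must": [
        { "match": { "status": 500 } }
      ],
      "filter": [
        { "range": { "@timestamp": { "gte": "now-1h" } } }
      ]
    }
  },
  "size": 10
}
```

This query asks for up to ten events from the last hour whose `status` field matches 500, which is the kind of question that would require logging in to every server in a de-centralized setup.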

The diagram below shows the workflow of the ELK Stack.

Fig: Workflow of ELK Stack

Summary!

The ELK Stack is a very useful open-source tool used by many companies, such as LinkedIn, OpenStack, Medium, etc., to help with log analysis. This blog gives you a complete overview of the ELK Stack, log analysis, and how the ELK Stack works.
