Komlan Florient dogbe

Nov 7, 2021

5 min read

Elasticsearch - Logstash - Kibana: Understanding and Implementation Using Docker.

In this ELK topic, I will take you through some applications and an understanding of all the concepts. The road map will take you through some theory and practice. We will cover the installation of Elasticsearch, Logstash and Kibana from scratch.

Consider a web server in your infrastructure, built and running on some base operating system. A client in some part of the world tries to access a resource on your webpage. Some operations (input/output) are done on your webpage, such as filling in a form. The architecture is based on a 3-tier application with one database behind it. Data stored in the database is recorded in log files kept in some location.

❓ 😞 Questions come to mind:

  • How do I monitor and observe a service or activity?
  • How do I make decisions based on statistics?
  • Which tools can I use to operate on files or logs?

Elasticsearch is one of the best tools nowadays for search and log analysis across different types of data.

Let's understand: what is Elasticsearch?

Elasticsearch is a distributed, open source search and analytics engine for all types of data. It is based on the Lucene library and developed in Java.

For more details, refer to the official Elasticsearch webpage: https://www.elastic.co/fr/what-is/elasticsearch

Before going into too much detail, let me explain some key terms.

ELK 😥 What does this mean?

“ELK stands for Elasticsearch, Logstash, Kibana.”

What is Logstash?
Consider that we have log files on many components (component here means an operating system, a web server, a cloud instance and so on) and want to centralize them in one storage space.

  • Who is responsible for sending logs to the centralized system?
  • How will it be done?

The answer is given by the power of Logstash.

Logstash is a powerful concept in ELK. Logstash is a free and open server-side data processing pipeline that ingests data from a multitude of sources, transforms it, and then sends it to your favorite “stash.”

To fully understand the concept, read the documentation for a better explanation: https://www.elastic.co/logstash/

Another important term that needs attention is Kibana.

Overview of Kibana.

Kibana is a free and open user interface that lets you visualize your Elasticsearch data and navigate the Elastic Stack. Do anything from tracking query load to understanding the way requests flow through your apps.

Reference: https://www.elastic.co/kibana/

Summary: Logstash is the process that ingests data into a centralized location. Elasticsearch is the database in which all the data is stored. Kibana is an interface for visualization.

Now let’s go for implementation part! 😀

  • Elasticsearch — Logstash — Kibana: Installation process

Step-by-step installation:
Goal:

At the end of this, you will be able to install Kibana, Elasticsearch and Logstash, each in its own environment.
Requirements:
- Three physical or virtual machines (Linux distribution)
- Docker or Kubernetes installed
- A minimum of 4 GB of RAM
- 2 CPUs

The installation is done using Docker images.
Why are we using Docker for the installation?
Answer: installation with Docker is the easiest and fastest.

— Step 1: Create a Docker network

Create a network over which all three components will communicate.
Prerequisite: install Docker Engine on your base operating system or virtual machine (Linux distribution).

Use this command to list all available networks:

#docker network ls

Use the docker network command with the create option to set up a new network:

#docker network create ELK_Network

For confirmation, list the available networks again; ELK_Network should appear in the output.
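As a small sketch (assuming the network name ELK_Network used above), you can also filter the listing so only the new network is shown:

```shell
# List Docker networks, filtered by the name we chose earlier.
docker network ls --filter name=ELK_Network

# Alternatively, grep the full listing for the name:
docker network ls | grep ELK_Network
```

Either command printing a line containing ELK_Network confirms the network exists.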

— Step 2: Run the Elasticsearch image

Elasticsearch can be installed from the Docker repository. Go to https://hub.docker.com/r/elastic/elasticsearch . In your terminal, run:

#docker pull elastic/elasticsearch:7.15.1

7.15.1 is the latest version at the date this article was written.

Reference: Install Elasticsearch with Docker | Elasticsearch Guide [7.15] | Elastic

Confirm that the installation succeeded by using the docker images command:

#docker images

Launch an Elasticsearch container:

#docker run -dit --name elasticSearch --net ELK_Network -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" elastic/elasticsearch:7.15.1

Check that the container is running successfully:

#docker ps

Output:

CONTAINER ID   IMAGE                          COMMAND                  CREATED         STATUS         PORTS                                            NAMES
6366d112a2e1   elastic/elasticsearch:7.15.1   "/bin/tini -- /usr/l…"   7 minutes ago   Up 7 minutes   0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp   elasticSearch

Test connectivity:

Syntax:

#curl IpAddress:9200

#curl localhost:9200

Output:

{
  "name" : "6366d112a2e1",
  "cluster_name" : "docker-cluster",
  "cluster_uuid" : "fYCgXbhXSIKV_CzRw3doeQ",
  "version" : {
    "number" : "7.15.1",
    "build_flavor" : "default",
    "build_type" : "docker",
    "build_hash" : "83c34f456ae29d60e94d886e455e6a3409bba9ed",
    "build_date" : "2021-10-07T21:56:19.031608185Z",
    "build_snapshot" : false,
    "lucene_version" : "8.9.0",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}
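If you want to script against this response, a small sketch (assuming the JSON layout shown above) can pull out just the version number with grep and sed; jq would be cleaner if it is installed:

```shell
# Extract the "number" field (the Elasticsearch version) from the
# cluster-info JSON served on port 9200.
curl -s localhost:9200 \
  | grep '"number"' \
  | sed 's/.*"number" *: *"\([^"]*\)".*/\1/'
```

With the container started above, this should print 7.15.1.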

⚠️ To find the IP address of your container:

#docker inspect <container ID> | grep IPAddress

#docker inspect 6366d112a2e1 | grep IPAddress
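A sketch of two tidier alternatives (assuming the container name elasticSearch used earlier): docker inspect with a Go-template format string prints only the address, and grep -o can trim the matched line down to the dotted address itself:

```shell
# Print only the container's IP address via an inspect format string.
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' elasticSearch

# Or trim the grep output down to just the address:
docker inspect elasticSearch | grep '"IPAddress"' | grep -o '[0-9][0-9.]*' | head -1
```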

😍 Elasticsearch was installed successfully.

⚠️ To optimize your RAM usage, run (as root):

#echo 3 > /proc/sys/vm/drop_caches

— Step 3: Install the Kibana Docker image

Installing Kibana is very simple using Docker images. Go to the Docker repository https://hub.docker.com/_/kibana?tab=description and pull the image:

#docker pull kibana:7.14.1

Run the Kibana image:

#docker run -dit --name kibana --net ELK_Network -p 5601:5601 kibana:7.14.1

Check connectivity:

#curl localhost:5601

With your browser: http://ipAddress:5601
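You can also check Kibana's health from the command line with its /api/status endpoint, a standard Kibana API that reports the instance's state. A minimal sketch (the first `"state"` match in the JSON is typically the overall state):

```shell
# Query Kibana's status API; a healthy instance reports state "green".
curl -s localhost:5601/api/status | grep -o '"state":"[a-z]*"' | head -1
```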

— Step 4: Connectivity between Kibana and Elasticsearch

  • Cat indices API

It returns high-level information about indices in a cluster, including backing indices for data streams. It is also used to get information about created indices.

Command used to check the indices:

#curl localhost:9200/_cat/indices

The output will list the indices known to the cluster.
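To see the API in action, a quick sketch (test-index is an arbitrary throwaway name chosen here): create an empty index, then list the indices again with the ?v flag to get column headers; the index name sits in the third column:

```shell
# Create a throwaway index (the name is arbitrary).
curl -X PUT localhost:9200/test-index

# List indices with column headers.
curl -s 'localhost:9200/_cat/indices?v'

# Print just the index names from the plain listing.
curl -s localhost:9200/_cat/indices | awk '{print $3}'
```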

— Step 5: Install the Logstash Docker image

Go to the Docker Hub repository https://hub.docker.com/_/logstash and pull the image:

#docker pull logstash:7.14.2

Launch the Logstash image:

#docker run -dit --name logstash --net ELK_Network logstash:7.14.2
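To confirm Logstash started cleanly, check its container logs. As a further hedged sketch, Logstash's -e flag accepts an inline pipeline, so a one-shot container can read stdin and ship it to the Elasticsearch container, which is reachable by its name elasticSearch through Docker's DNS on ELK_Network:

```shell
# Watch the startup logs; look for a "Successfully started" message.
docker logs logstash 2>&1 | grep -i "started"

# One-shot pipeline sketch: stdin input, output to the Elasticsearch
# container addressed by its container name on the shared network.
echo "hello elk" | docker run --rm -i --net ELK_Network logstash:7.14.2 \
  -e 'input { stdin { } } output { elasticsearch { hosts => ["elasticSearch:9200"] } }'
```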

Our infrastructure is successfully set up. 😍 😍

Stay tuned for my upcoming topic, which will take you through the functionality of Elasticsearch, Kibana and Logstash.

Enjoy !!!!!!!!!!!!!!!