Centralized Logging Using Beats and Logstash with Elasticsearch
Posted By : Yash Arora | 18-Sep-2018
In this blog, I am going to explain how you can send the logs of multiple microservices to different Elasticsearch indices, partitioned by log creation date and microservice name.
Suppose I have a microservice named authService; then a new Elasticsearch index such as authservicelogs-2018.09.18 will be created on a daily basis (Elasticsearch index names must be lowercase).
Let's see how we can achieve this.
Prerequisites:-
1. Beats
2. Logstash
3. Elasticsearch
Beats:-Beats is the platform for single-purpose data shippers. They are installed as lightweight agents and send data from thousands of machines to Logstash or Elasticsearch.
Logstash:-Logstash is an open source, server-side data processing pipeline that ingests data from a multitude of sources simultaneously, transforms it, and then sends it to your favorite “stash” (like Elasticsearch).
Elasticsearch:-Elasticsearch is a distributed search and analytics engine capable of addressing a growing number of use cases. It centrally stores our data so we can discover the expected and uncover the unexpected.
Here we are using the Filebeat component of Beats.
Step 1:-
Install Filebeat from the official Elastic site and make changes in the filebeat.yml file.
In this example, I am using the logs of two microservices, so here are the changes you have to make in filebeat.yml:
filebeat.inputs:
- type: log
  # Change this value to true to enable this input configuration.
  enabled: true
  # Paths from where it will read the log file.
  paths:
    - ${HOME}/logs/AuthLogFile.log
  fields:
    app_id: authlogs
- type: log
  # Change this value to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - ${HOME}/logs/AccountLogFile.log
  fields:
    app_id: accountlogs
In this file the Elasticsearch output is enabled by default; comment it out, uncomment the Logstash output configuration in filebeat.yml, and save it.
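For reference, the output section of filebeat.yml should look roughly like this after the change (the Logstash host below assumes the default setup from Step 2, where Logstash listens on port 5044):

```yaml
# Comment out the default Elasticsearch output:
#output.elasticsearch:
#  hosts: ["localhost:9200"]

# Uncomment the Logstash output instead:
output.logstash:
  hosts: ["localhost:5044"]
```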
Here I am using the file AuthLogFile.log. This file is generated daily by the microservice via the line:
logging.file=${HOME}/logs/AuthLogFile.log
in its application.properties file.
It will create the log file at the above location, and Filebeat picks up new entries whenever the file is updated.
Note the extra field app_id in the configuration above. Based on this field, I will separate the logs of the different microservices and store them in different Elasticsearch indices.
To run Filebeat, use the command:
./filebeat -e -v -c filebeat.yml
Step 2:
Install Logstash from the official Elastic site and create a new file, logstash-beat.conf,
in the folder where you have downloaded Logstash.
logstash-beat.conf
input {
  beats {
    port => 5044
    type => "beats"
  }
}
output {
  if [type] == "beats" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "%{[fields][app_id]}-%{+YYYY.MM.dd}"
    }
  }
}
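To make the index pattern concrete, here is a minimal Python sketch (purely illustrative, not part of the pipeline) of how "%{[fields][app_id]}-%{+YYYY.MM.dd}" resolves for an event that Filebeat shipped with our extra app_id field:

```python
from datetime import date

def index_for(event: dict, day: date) -> str:
    """Mimic Logstash's "%{[fields][app_id]}-%{+YYYY.MM.dd}" index pattern."""
    return f"{event['fields']['app_id']}-{day:%Y.%m.%d}"

# An event shipped by Filebeat carries the extra field we configured:
auth_event = {"message": "user logged in", "fields": {"app_id": "authlogs"}}
print(index_for(auth_event, date(2018, 9, 18)))  # authlogs-2018.09.18
```

Each microservice's events therefore land in their own daily index, which is exactly why the app_id field was attached in filebeat.yml.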
Before this, make sure Elasticsearch is running in the background. To run Elasticsearch,
use the command bin/elasticsearch
To run Logstash, use the command:
bin/logstash -f logstash-beat.conf
This will start listening for Beats events and send the logs to Elasticsearch. If you want to see the logs in Elasticsearch, open this URL in Chrome:
localhost:9200/authlogs-2018.09.18/_search?size=8000&pretty
You will see in Elasticsearch all the logs that you were getting in your console.
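The same query can also be issued from code; here is a small Python sketch (assuming Elasticsearch is reachable on localhost:9200, as in this setup) that builds the search URL used above and fetches the stored documents:

```python
import json
from urllib.request import urlopen

def search_url(index: str, size: int = 8000) -> str:
    """Build the same _search URL as the browser query above."""
    return f"http://localhost:9200/{index}/_search?size={size}&pretty"

def fetch_logs(index: str) -> list:
    """Fetch all documents from the given index (requires Elasticsearch running)."""
    with urlopen(search_url(index)) as resp:
        return json.load(resp)["hits"]["hits"]

# With Elasticsearch running:
# for hit in fetch_logs("authlogs-2018.09.18"):
#     print(hit["_source"]["message"])
```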
Hope this will be helpful.