Shipping Docker Container Logs Using Filebeat

Posted By : Amarnath Arora | 24-Jan-2020

In this blog post, we will discuss the minimum configuration required to ship Docker container logs. Before starting with the Filebeat log-shipping configuration, we should know about Filebeat and Logstash.

 

Filebeat:

Filebeat is a log data shipper for local files. The Filebeat agent is installed on the server that has to be monitored; it watches all the logs in the log directory and forwards them to Logstash. Filebeat is built on two components: inputs (formerly called prospectors) and harvesters.

  1. An input is responsible for managing the harvesters and finding all sources to read from. In this section we define values such as type, tags, paths, include_lines, exclude_lines, etc.
  2. A harvester is responsible for reading the content of a single file. The harvester reads each file, line by line, and sends the content to the output. One harvester is started for each file. The harvester is responsible for opening and closing the file, which means that the file descriptor remains open while the harvester is running. If a file is removed or renamed while it is being harvested, Filebeat continues to read the file. This has the side effect that the space on your disk is reserved until the harvester closes.
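This harvester behaviour can be tuned per input. Below is an illustrative sketch (the option names close_inactive, include_lines, and exclude_lines come from Filebeat's log input; the paths and values here are made up):

```yaml
filebeat.inputs:
- type: log
  paths:
    - /var/log/app/*.log          # hypothetical application log path
  # Stop harvesting a file after 5 minutes without new lines, releasing
  # its file descriptor (and the disk space held by a deleted file).
  close_inactive: 5m
  # Ship only lines matching these patterns...
  include_lines: ['^ERR', '^WARN']
  # ...and drop lines matching these.
  exclude_lines: ['^DEBUG']
```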

Logstash:

Logstash is a lightweight, open-source, server-side data processing tool that allows you to gather data from a variety of sources, transform it on the fly, and send it to your desired destination, such as Elasticsearch. It collects data from many types of sources, such as Filebeat, Metricbeat, etc.
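These three stages map directly onto the structure of a Logstash config file. A minimal sketch (the filter stage is optional; stdin/stdout are used here just for illustration):

```
input  { stdin { } }
# filters (grok, mutate, ...) would go here
output { stdout { codec => rubydebug } }
```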

 

Install and Configure Filebeat:

1. Download Filebeat from the following link with curl:

curl -LO https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.0.1-linux-x86_64.tar.gz

2. Extract the tar.gz file using the following command:

tar -xf filebeat-7.0.1-linux-x86_64.tar.gz

3. In this directory (filebeat-7.0.1-linux-x86_64) you will find a filebeat.yml file that we need to configure.

4. To ship the Docker container logs, we need to set the path of the Docker logs in filebeat.yml.

5. Also, we need to modify modules.d/logstash.yml (here we add the logs path).
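As a sketch, modules.d/logstash.yml might look like this after editing; the var.paths value is a placeholder for your own log location, and the log/slowlog filesets are the ones provided by Filebeat's logstash module:

```yaml
- module: logstash
  log:
    enabled: true
    var.paths: ["/path/to/logstash-plain.log"]   # placeholder path
  slowlog:
    enabled: false
```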

6. To check the configuration, the command is "./filebeat test config".

7. To check the connection to the output, the command is "./filebeat test output".

8. Run Filebeat with "./filebeat run" or "./filebeat -e".


filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/lib/docker/containers/*/*-json.log
  tags: ["docker-logs"]
setup.kibana:
  host: "Server_IP:5601"
output.logstash:
  hosts: ["Server_IP:5000"]

Here, the input type may be log, udp, tcp, syslog, etc., depending on the requirement.
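The *-json.log files written by Docker's default json-file logging driver contain one JSON object per line, with log, stream, and time fields. A quick way to inspect the message field of such a line (the sample line below is fabricated for illustration, and the sed expression assumes the message contains no escaped quotes):

```shell
# One line in the shape written by Docker's json-file driver (sample, not real output)
line='{"log":"hello from container\n","stream":"stdout","time":"2020-01-24T10:00:00.000000000Z"}'

# Pull out the "log" field with sed (a jq-free sketch)
printf '%s\n' "$line" | sed -n 's/.*"log":"\([^"]*\)".*/\1/p'
# prints: hello from container\n
```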

On the server side, create a pipeline configuration for Logstash, i.e., logstash.conf:

 

input {
    beats {
        port => 5000
        host => "0.0.0.0"
    }
}

output {
    if "docker-logs" in [tags] {
        elasticsearch {
            hosts => ["elasticsearch:9200"]
            user => "elastic"
            password => "changeme"
            index => "docker-logs"
        }
    } else {
        elasticsearch {
            hosts => ["elasticsearch:9200"]
            user => "elastic"
            password => "changeme"
            index => "docker"
        }
    }

    stdout {
        codec => rubydebug
    }
}

Then we need to restart the Logstash service; in my case I'm using docker-compose for the ELK cluster, so the command would be "docker-compose restart logstash". On the basis of tags, we can identify the logs of different systems. If you have Kibana set up, you will get the logs directly in Kibana; if you have only Logstash set up, you can check the Logstash logs, where you will see the Docker logs.


About Author

Amarnath Arora

Amarnath has a keen interest in cloud technologies and automation. He is very eager to learn and implement new technologies.
