
Using Docker to build an ELK+Filebeat log centralized management platform

Current environment

1. System: CentOS 7

2. Docker 1.12.1

Introduction

Elasticsearch

Elasticsearch is a real-time distributed search and analytics engine that can be used for full-text search, structured search, and analytics. It is built on top of the Apache Lucene full-text search library and written in Java.

Logstash

Logstash is a data collection engine with real-time pipelining capabilities. It is mainly used to collect and parse logs and then store them in Elasticsearch.

Kibana

Kibana is a JavaScript-based web platform, released under the Apache license, that provides analysis and visualization for Elasticsearch. It can search Elasticsearch indices, interact with the data, and generate tables and charts across various dimensions.

Filebeat

Filebeat is introduced as the log collector mainly to address Logstash's high resource overhead. Compared with Logstash, Filebeat's CPU and memory footprint is almost negligible.

Architecture

Without Filebeat

[Architecture diagram: ELK without Filebeat]

With Filebeat

[Architecture diagram: ELK with Filebeat]

Deployment

Start Elasticsearch

# Start an Elasticsearch container and publish port 9200 to the host
docker run -d -p 9200:9200 --name elasticsearch elasticsearch
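
Once the container is up, you can optionally confirm that Elasticsearch is reachable from the host. This is a minimal check; replace {$ELASTIC_IP} with your host's actual IP:

# Optional: Elasticsearch should answer with its cluster and version information
curl http://{$ELASTIC_IP}:9200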

Start Logstash

# 1. Create the configuration file logstash.conf
input {
  beats {
    port => 5044
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    # Fill in the actual Elasticsearch IP. Since this is cross-container access,
    # use an internal or public IP; do not use 127.0.0.1 or localhost.
    hosts => ["{$ELASTIC_IP}:9200"]
  }
}

# 2. Start the container, expose and map the port, and mount the configuration file
docker run -d --expose 5044 -p 5044:5044 --name logstash -v "$PWD":/config-dir logstash -f /config-dir/logstash.conf
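
You can optionally follow the container's output to confirm that the Beats input is listening on port 5044; a simple check:

# Optional: watch Logstash's startup output (Ctrl+C to stop following)
docker logs -f logstash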

Start Filebeat

Download address: https://www.elastic.co/downloads/beats/filebeat

# 1. Download the Filebeat package
wget https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-5.2.2-linux-x86_64.tar.gz

# 2. Extract the archive
tar -xvf filebeat-5.2.2-linux-x86_64.tar.gz

# 3. Create the configuration file filebeat.yml
filebeat:
  prospectors:
    - paths:
        - /tmp/test.log    # path of the log file to collect
      input_type: log      # read from a file
      tail_files: true     # start reading from the end of the file
output:
  logstash:
    hosts: ["{$LOGSTASH_IP}:5044"]    # fill in the Logstash IP

# 4. Run Filebeat
./filebeat-5.2.2-linux-x86_64/filebeat -e -c filebeat.yml
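
If you want Filebeat to keep running after the terminal is closed, one simple option is to run it in the background; in production a process manager such as systemd would be the usual choice. A minimal sketch (the output file name is arbitrary):

# Optional: run Filebeat in the background and capture its output
nohup ./filebeat-5.2.2-linux-x86_64/filebeat -e -c filebeat.yml > filebeat.out 2>&1 &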

Start Kibana

# Start Kibana, point it at Elasticsearch, and publish port 5601 to the host
docker run -d --name kibana -e ELASTICSEARCH_URL=http://{$ELASTIC_IP}:9200 -p 5601:5601 kibana

Test

Simulate log data

# 1. Create the log file
touch /tmp/test.log

# 2. Append a sample Nginx access log line to the file
echo '127.0.0.1 - - [13/Mar/2017:22:57:14 +0800] "GET / HTTP/1.1" 200 3700 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.86 Safari/537.36" "-"' >> /tmp/test.log
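
Before opening Kibana, you can optionally confirm that the entry reached Elasticsearch by listing its indices; by default Logstash writes to daily logstash-* indices:

# Optional: a logstash-* index should appear once the log line has been shipped
curl http://{$ELASTIC_IP}:9200/_cat/indices?v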

Visit http://{$KIBANA_IP}:5601

[Screenshots: Kibana web interface showing the collected log data]

Summary

This article described how to build an ELK platform step by step with Docker and the role Filebeat plays in it.

This is only a demonstration. When deploying in a production environment, you need to use data volumes for persistence. Container memory also needs to be considered: Elasticsearch and Logstash are relatively memory intensive, and if they are not limited they may bring down the entire server.
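
As an illustration only (the mount path and memory limit below are assumptions to adapt to your own setup), persisting data and capping memory for the Elasticsearch container might look like this:

# Sketch: mount a host directory for data and limit the container to 2 GB of memory
docker run -d -p 9200:9200 --name elasticsearch \
  -v /data/elasticsearch:/usr/share/elasticsearch/data \
  -m 2g \
  elasticsearch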

Security also cannot be ignored, for example transport security, minimizing the ports that are exposed, firewall settings, and so on.
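
For instance, on CentOS 7 you might open only the Kibana port through firewalld and keep Elasticsearch and Logstash unreachable from outside (an illustrative sketch; adjust to your own network layout):

# Sketch: allow only the Kibana port through the firewall
firewall-cmd --permanent --add-port=5601/tcp
firewall-cmd --reload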

Follow-up

Parsing log formats with Logstash, such as Java, Nginx, and Node.js logs;

Common Elasticsearch search syntax;

Creating visual charts with Kibana.


Statement: This article is reproduced from linuxprobe.com.