
Kibana Logstash Elasticsearch log query system, kibanalogstash

WBOY (original), 2016-07-06

This platform gives operations and development teams a convenient way to query logs. Kibana is a free web front end; Logstash integrates a wide range of log-collection plug-ins and is an excellent tool for parsing logs with regular expressions; Elasticsearch is an open-source search engine framework with cluster support.

1 Installation requirements

1.1 Theoretical Topology

1.2 Installation environment

1.2.1 Hardware environment

192.168.50.62 (HP DL 385 G7, RAM: 12G, CPU: AMD 6128, DISK: SAS 146*4)

192.168.50.98 (HP DL 385 G7, RAM: 12G, CPU: AMD 6128, DISK: SAS 146*6)

192.168.10.42 (Xen virtual machine, RAM: 8G, CPU: ×4, DISK: 100G)

1.2.2 Operating System

CentOS 5.6 X64

1.2.3 Web-server basic environment

Nginx + PHP (installation steps omitted)

1.2.4 Software List

JDK 1.6.0_25

logstash-1.1.0-monolithic.jar

elasticsearch-0.18.7.zip

redis-2.4.14.tar.gz

kibana

1.3 How to obtain

1.3.1 JDK acquisition path

http://www.oracle.com/technetwork/java/javase/downloads/jdk-6u25-download-346242.html

1.3.2 Logstash acquisition path

http://semicomplete.com/files/logstash/logstash-1.1.0-monolithic.jar

1.3.3 Elasticsearch acquisition path

https://github.com/downloads/elasticsearch/elasticsearch/elasticsearch-0.18.7.zip

1.3.4 Kibana acquisition path

http://github.com/rashidkpc/Kibana/tarball/master

2 Installation steps

2.1 JDK download and installation

Basic installation

wget http://download.oracle.com/otn-pub/java/jdk/6u25-b06/jdk-6u25-linux-x64.bin

sh jdk-6u25-linux-x64.bin

mkdir -p /usr/java

mv ./jdk1.6.0_25 /usr/java

ln -s /usr/java/jdk1.6.0_25 /usr/java/default

Edit the /etc/profile file and add the following lines

export JAVA_HOME=/usr/java/default

export PATH=$JAVA_HOME/bin:$PATH

export CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar:$CLASSPATH

Refresh environment variables

source /etc/profile
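A quick sanity check (assuming the paths above) confirms that the JDK is now on the PATH:

```shell
# Confirm the JDK is visible after sourcing /etc/profile.
# Falls back to a hint if java is not on PATH.
if command -v java >/dev/null 2>&1; then
    java -version 2>&1
else
    echo "java not found - re-check JAVA_HOME in /etc/profile"
fi
```

The expected output mentions version 1.6.0_25 if the installation above succeeded.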

2.2 Redis download and installation

wget http://redis.googlecode.com/files/redis-2.4.14.tar.gz

tar zxvf redis-2.4.14.tar.gz && cd redis-2.4.14

make -j24

make install

mkdir -p /data/redis

cd /data/redis/

mkdir {db,log,etc}

2.3 Elasticsearch download and installation

cd /data/

mkdir -p elasticsearch && cd elasticsearch

wget --no-check-certificate https://github.com/downloads/elasticsearch/elasticsearch/elasticsearch-0.18.7.zip

unzip elasticsearch-0.18.7.zip

2.4 Logstash download and installation

mkdir -p /data/logstash/ && cd /data/logstash

wget http://semicomplete.com/files/logstash/logstash-1.1.0-monolithic.jar

2.5 Kibana download and installation

wget http://github.com/rashidkpc/Kibana/tarball/master --no-check-certificate

tar zxvf master

3 Related configuration and startup

3.1 Redis configuration and startup

3.1.1 Configuration File

vim /data/redis/etc/redis.conf

#---------------------------------------------------

#this is the config file for redis

pidfile /var/run/redis.pid

port 6379

timeout 0

loglevel verbose

logfile /data/redis/log/redis.log

databases 16

save 900 1

save 300 10

save 60 10000

rdbcompression yes

dbfilename dump.rdb

dir /data/redis/db/

slave-serve-stale-data yes

appendonly no

appendfsync everysec

no-appendfsync-on-rewrite no

auto-aof-rewrite-percentage 100

auto-aof-rewrite-min-size 64mb

slowlog-log-slower-than 10000

slowlog-max-len 128

vm-enabled no

vm-swap-file /tmp/redis.swap

vm-max-memory 0

vm-page-size 32

vm-pages 134217728

vm-max-threads 4

hash-max-zipmap-entries 512

hash-max-zipmap-value 64

list-max-ziplist-entries 512

list-max-ziplist-value 64

set-max-intset-entries 512

zset-max-ziplist-entries 128

zset-max-ziplist-value 64

activerehashing yes

3.1.2 Redis startup

[logstash@Logstash_2 redis]# redis-server /data/redis/etc/redis.conf &
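To verify that Redis is up, a PING should return PONG (assuming redis-cli is on the PATH and the default port 6379 from redis.conf):

```shell
# Sanity check: redis-cli ping returns PONG once the server accepts connections.
if command -v redis-cli >/dev/null 2>&1; then
    redis-cli -p 6379 ping
else
    echo "redis-cli not found - check the redis installation"
fi
```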

3.2 Elasticsearch configuration and startup

3.2.1 Elasticsearch startup

[logstash@Logstash_2 redis]# /data/elasticsearch/elasticsearch-0.18.7/bin/elasticsearch -p ../esearch.pid &

3.2.2 Elasticsearch cluster configuration

Verify that the node has joined the cluster:

curl 127.0.0.1:9200/_cluster/nodes/192.168.50.62
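The curl above only queries node state; the cluster itself is formed through elasticsearch.yml. On 0.18.x, nodes sharing the same cluster.name discover each other via multicast by default. A minimal sketch (the cluster and node names here are illustrative assumptions, not from the original setup):

```yaml
# /data/elasticsearch/elasticsearch-0.18.7/config/elasticsearch.yml
cluster.name: logstash-cluster   # must match on every node in the cluster
node.name: "es-node-62"          # hypothetical per-node name
```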

3.3 Logstash configuration and startup

3.3.1 Logstash configuration file

input {

redis {

host => "192.168.50.98"

data_type =>"list"

key => "logstash:redis"

type => "redis-input"

}

}

filter {

grok {

type => "linux-syslog"

pattern => "%{SYSLOGLINE}"

}

grok {

type => "nginx-access"

pattern => "%{NGINXACCESSLOG}"

}

}

output {

elasticsearch {

host =>"192.168.50.62"

}

}
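Note that %{SYSLOGLINE} ships with Logstash's stock grok patterns, but %{NGINXACCESSLOG} does not; it must be defined in a custom pattern file referenced via grok's patterns_dir option. A hedged sketch, built from stock grok primitives (the file path and the pattern body are illustrative assumptions — adjust to your actual nginx log_format):

```
# /data/logstash/patterns/nginx  (hypothetical path; add
# patterns_dir => "/data/logstash/patterns" to the grok block)
NGINXACCESSLOG %{IPORHOST:clientip} - %{USER:auth} \[%{HTTPDATE:timestamp}\] "%{WORD:verb} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:response} %{NUMBER:bytes}
```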

3.3.2 Logstash starts as indexer

java -jar logstash-1.1.0-monolithic.jar agent -f my.conf &

3.3.3 Logstash starts as agent

Configuration file

input {

file{

type => "linux-syslog"

path => [ "/var/log/*.log", "/var/log/messages", "/var/log/syslog" ]

}

file {

type => "nginx-access"

path => "/usr/local/nginx/logs/access.log"

}

file {

type => "nginx-error"

path => "/usr/local/nginx/logs/error.log"

}

}

output {

redis {

host => "192.168.50.98"

data_type =>"list"

key => "logstash:redis"

}

}

Agent starts

java -jar logstash-1.1.0-monolithic.jar agent -f shipper.conf &

3.3.4 kibana configuration

First, add a site configuration to nginx:

server {

listen 80;

server_name logstash.test.com;

index index.php;

root /usr/local/nginx/html;

#charset koi8-r;

#access_log logs/host.access.log main;

location ~ .*\.(php|php5)$

{

#fastcgi_pass unix:/tmp/php-cgi.sock;

fastcgi_pass 127.0.0.1:9000;

fastcgi_index index.php;

include fastcgi.conf;

}

}

4 Performance Tuning

4.1 Elasticsearch Tuning

4.1.1 JVM Tuning

Edit the bin/elasticsearch.in.sh file

ES_CLASSPATH=$ES_CLASSPATH:$ES_HOME/lib/*:$ES_HOME/lib/sigar/*

if [ "x$ES_MIN_MEM" = "x" ]; then

ES_MIN_MEM=4g

fi

if [ "x$ES_MAX_MEM" = "x" ]; then

ES_MAX_MEM=4g

fi

4.1.2 Elasticsearch Index Compression

vim index_elastic.sh

#!/bin/bash

# compress the _source field of today's elasticsearch indices

date=`date +%Y.%m.%d`

# enable compression on the new daily indices

/usr/bin/curl -XPUT http://localhost:9200/logstash-$date/nginx-access/_mapping -d '{"nginx-access" : {"_source" : { "compress" : true }}}'

echo ""

/usr/bin/curl -XPUT http://localhost:9200/logstash-$date/nginx-error/_mapping -d '{"nginx-error" : {"_source" : { "compress" : true }}}'

echo ""

/usr/bin/curl -XPUT http://localhost:9200/logstash-$date/linux-syslog/_mapping -d '{"linux-syslog" : {"_source" : { "compress" : true }}}'

echo ""

Save the script and execute it

sh index_elastic.sh
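Since Logstash creates a new index each day, the script is best run from cron shortly after midnight so every fresh index gets compressed. A sketch of the crontab entry (the script path is an assumption):

```
# crontab: compress the newly created daily indices at 00:10 every day
10 0 * * * /bin/sh /data/elasticsearch/index_elastic.sh >/dev/null 2>&1
```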

5 Usage

5.1 Logstash query page

Use Firefox or Google Chrome to visit http://logstash.test.com
