Using ELK for log collection and analysis in Beego

ELK is a popular log collection and analysis stack made up of three open-source tools: Elasticsearch, Logstash, and Kibana. Together they can be used to search, analyze, and visualize large volumes of log data in real time.

In web application development, logs are a valuable source of information for tracking application behavior, debugging, and performance optimization. Beego is an open-source web framework written in Go that integrates easily with ELK for log collection and analysis.

This article will introduce how to configure and use ELK in Beego for log collection and analysis, and how to use Kibana for data visualization.

1. Install and configure ELK

First you need to install and configure ELK. The latest versions of the ELK packages can be downloaded from the official website: https://www.elastic.co/downloads/

Installing ELK requires first installing Java and setting the JAVA_HOME environment variable. Then extract the ELK packages and start Elasticsearch and Kibana:

cd elasticsearch-7.6.0/bin
./elasticsearch

cd kibana-7.6.0/bin
./kibana

Then edit the Logstash configuration file logstash.conf to define the input, filter, and output sections:

input {
    tcp {
        port => 5000
        codec => json
        # Tag incoming events so the filter and index name below can reference [type]
        type => "beego"
    }
}

filter {
    if [type] == "beego" {
        grok {
            match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
    }
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "%{type}-%{+YYYY.MM.dd}"
    }
}

This configuration tells Logstash to listen for TCP connections on port 5000 and decode each incoming line as JSON. Events tagged with type "beego" are parsed by the grok filter (here using the built-in COMBINEDAPACHELOG pattern), and the resulting events are written to Elasticsearch under a date-based index.
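
Before wiring up the application, it can help to verify the pipeline end to end. Below is a minimal sketch, assuming Logstash is listening on localhost:5000 as configured above, that sends a single hand-built JSON event from Go:

package main

import (
    "encoding/json"
    "log"
    "net"
)

func main() {
    // Connect to the Logstash TCP input configured above.
    conn, err := net.Dial("tcp", "localhost:5000")
    if err != nil {
        log.Fatal("dial logstash: ", err)
    }
    defer conn.Close()

    // Build a minimal JSON event. The tcp input treats each line as one event,
    // and the json codec decodes it.
    event := map[string]string{
        "message": "test event from Go",
        "level":   "info",
    }
    payload, err := json.Marshal(event)
    if err != nil {
        log.Fatal("marshal event: ", err)
    }

    // Terminate with a newline so Logstash sees a complete event.
    if _, err := conn.Write(append(payload, '\n')); err != nil {
        log.Fatal("write event: ", err)
    }
}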

2. Using ELK in the Beego application

Using logrus as the logging library in the Beego application, you can forward the logs to the Logstash server:

package main

import (
    "net"

    "github.com/astaxie/beego"
    "github.com/sirupsen/logrus"
    prefixed "github.com/x-cray/logrus-prefixed-formatter"
    "gopkg.in/natefinch/lumberjack.v2"
)

func init() {
    logLevel, err := logrus.ParseLevel(beego.AppConfig.String("logrus.level"))
    if err != nil {
        logLevel = logrus.InfoLevel
    }

    logrus.SetLevel(logLevel)
    logrus.SetFormatter(&prefixed.TextFormatter{})
    logrus.SetOutput(&lumberjack.Logger{
        Filename:   beego.AppConfig.String("logrus.filename"),
        MaxSize:    10, // megabytes
        MaxBackups: 3,
        MaxAge:     28, // days
    })

    logrus.AddHook(&logrusHook{})
}

type logrusHook struct{}

func (h *logrusHook) Levels() []logrus.Level {
    return logrus.AllLevels
}

func (h *logrusHook) Fire(entry *logrus.Entry) error {
    // Serialize the entry as JSON so that Logstash's json codec can decode it
    // (the file output configured above keeps the human-readable text format).
    message, err := (&logrus.JSONFormatter{}).Format(entry)
    if err != nil {
        return err
    }

    // logstash.addr is read from conf/app.conf, e.g. "127.0.0.1:5000".
    logstash := beego.AppConfig.String("logstash.addr")
    connection, err := net.Dial("tcp", logstash)
    if err != nil {
        return err
    }
    defer connection.Close()

    // JSONFormatter appends a trailing newline, which separates events on the TCP stream.
    _, err = connection.Write(message)
    return err
}

In this snippet, the logrus logger is first initialized and its output is written to a size-rotated log file managed by lumberjack. A logrusHook is then registered so that every log entry is also serialized as JSON and sent to the Logstash server.
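
The beego.AppConfig.String() calls above read their values from the application's conf/app.conf. A possible configuration is shown below; the values are only examples matching the keys used in the code:

appname = beego-elk-demo
httpport = 8080
runmode = dev

logrus.level = info
logrus.filename = logs/app.log
logstash.addr = 127.0.0.1:5000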

Using logrus as the logging library in a Beego application is very convenient: import the logrus package and use the logrus.WithFields() method to record log information as structured key-value pairs, as in the sketch below.
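
Here is a minimal sketch of such structured logging in a Beego controller. The controller name, route, and field names are illustrative, and it assumes the logger and hook from the previous section are already initialized:

package controllers

import (
    "github.com/astaxie/beego"
    "github.com/sirupsen/logrus"
)

// UserController is a hypothetical controller used to illustrate structured logging.
type UserController struct {
    beego.Controller
}

func (c *UserController) Get() {
    // Record request context as key-value fields; they become searchable JSON fields in Elasticsearch.
    logrus.WithFields(logrus.Fields{
        "path":   c.Ctx.Request.URL.Path,
        "method": c.Ctx.Request.Method,
        "ip":     c.Ctx.Input.IP(),
    }).Info("handling user request")

    c.Data["json"] = map[string]string{"status": "ok"}
    c.ServeJSON()
}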

3. Use Kibana for data visualization

Kibana is a web interface for querying, visualizing, and analyzing the log data stored in Elasticsearch. Open Kibana in a browser, usually at http://localhost:5601, create an index pattern that matches the indices written by Logstash (for example beego-*), and then use the "Discover" page to search and explore the collected log data.

In Kibana, you can also create dashboards that combine multiple visualizations to monitor the running state of the system, and use filters and queries to narrow the data down more precisely.

4. Summary

Using ELK for log collection and analysis in Beego applications is convenient and practical. ELK provides flexible log filtering and visualization features that make it faster to locate application problems and optimize performance, while dashboards give an intuitive view of the application's running state.
