php - How to solve data loss when Logstash writes data to Elasticsearch

My log file contains more than 600 records, but only a little over 300 of them end up in Elasticsearch.
Does anyone know what could cause this?
This is my configuration:
input {
    file {
        path => ["/usr/local/20170730.log"]
        type => "log_test_events"
        tags => ["log_tes_events"]
        start_position => "beginning"
        sincedb_path => "/data/logstash/sincedb/test.sincedb"
        codec => "json"
        close_older => "86400"    # 1 day
        ignore_older => "86400"
    }
    beats { port => 5044 }
}
filter {
    urldecode {
        all_fields => true
    }
}

output {
    elasticsearch {
        hosts => "localhost:9200"
        index => "logstash_%{event_date}"
    }
    stdout { codec => json }
}
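One quick way to confirm the shortfall on the Elasticsearch side, before touching the configuration, is to compare document counts against the log's line count. A minimal sketch using the count and mapping APIs (the concrete daily index name logstash_20170730 is an assumption based on the index pattern above):

    // Count documents across all daily indices (index pattern assumed)
    GET logstash_*/_count

    // Inspect the mapping Elasticsearch created dynamically for one day
    GET logstash_20170730/_mapping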

欧阳克 · 2706 days ago

1 Reply

  • 过去多啦不再A梦 · 2017-07-01 09:13:55

    Because when the log is read in, Elasticsearch creates each field's data type automatically from the format of the data. For example, if field a holds both int and string values and the first document indexed contains a number, the field is mapped as an int; every later document in which a is a string is then rejected, which is why only part of your data reaches the index.
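    As a concrete illustration (hypothetical documents, not taken from the question's log), indexing these two events in order reproduces the problem: the first locks a to a numeric type, and the second fails with a mapper_parsing_exception and never appears in the index:

        // First document: "a" is dynamically mapped as a numeric type
        PUT test1/log_test/1
        { "event_id": "1", "a": 123 }

        // Second document: "abc" cannot be parsed against the numeric mapping,
        // so Elasticsearch rejects it and the event is lost
        PUT test1/log_test/2
        { "event_id": "2", "a": "abc" }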

    Modify the configuration and define the mapping yourself:
    output {
        elasticsearch {
            hosts  => "localhost:9200"
            index  => "test1"
            manage_template => true
            template_overwrite => true
            template => "/usr/local/logstash/templates/stat_day.json"
        }
    }

    Format of the stat_day.json template:

    {
        "order" : 1,
        "template" : "test1",
        "mappings" : {
            "log_test" : {
                "properties" : {
                    "event_id" : { "type" : "string" }
                }
            }
        }
    }
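    After restarting Logstash with this template in place, a quick way to verify it took effect (the template and index names follow the example above) is to list the installed templates and check the resulting mapping:

        // Confirm the template was registered
        GET _template

        // event_id should now be mapped as string in the new index
        GET test1/_mapping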
