[ELK] Configure logstash
Recently I needed to do some statistics, so I chose ELK to build the pipeline. In this post I will introduce how to clean logs and send them to Elasticsearch with Logstash.
Logstash Version: 5.6.6
First, add a new configuration file named xxx.conf under the config directory. The content is shown below; please replace "xxx" with the names used in your own business.
input {
  file {
    path => "/**/xxx.log"
    codec => plain {
      charset => "UTF-8"
    }
    tags => ["xxx"]
  }
  file {
    path => "/**/xxx.log"
    codec => plain {
      charset => "UTF-8"
    }
    tags => ["xxx"]
  }
}
filter {
  if "xxx" in [tags] {
    dissect {
      mapping => {
        "message" => "%{timestamp} - [%{thread}] - [%{level}] - [%{class}] - xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}"
      }
    }
  }
  if "xxx" in [tags] {
    dissect {
      mapping => {
        "message" => "%{timestamp} - [%{thread}] - [%{level}] - [%{class}] - xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}, xxx:%{xxx}"
      }
    }
  }
}
# Optional: replace @timestamp with the timestamp parsed from the log line
#filter {
#  date {
#    match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS"]
#    target => "@timestamp"
#  }
#}
output {
  if "xxx" in [tags] {
    elasticsearch {
      index => "xxx"
      hosts => ["http://xxx:9200"]
    }
  }
  if "xxx" in [tags] {
    elasticsearch {
      index => "xxx"
      hosts => ["http://xxx:9200"]
    }
  }
}
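A quick note on dissect: it splits the message by literal delimiters instead of regular expressions, so it is fast and suits fixed log layouts well. As a rough illustration only, assuming a log line like the one in the comment below (the field names orderId and userId are made up for this example, not part of the real config), the mapping produces one field per %{} token:
filter {
  dissect {
    mapping => {
      # Hypothetical input line:
      #   2018-01-15 10:20:30,123 - [main] - [INFO] - [com.example.OrderService] - orderId:1001, userId:42
      # Result: timestamp, thread, level, class, orderId and userId fields on the event.
      "message" => "%{timestamp} - [%{thread}] - [%{level}] - [%{class}] - orderId:%{orderId}, userId:%{userId}"
    }
  }
}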
Then start Logstash with this configuration file:
bin/logstash -f config/xxx.conf
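Before running it for real, you can let Logstash check the configuration syntax and exit, or pick up config changes automatically; if I remember correctly, both flags are available in the 5.x series:
# verify the configuration without starting the pipeline
bin/logstash -f config/xxx.conf --config.test_and_exit
# or start it and reload the config automatically when the file changes
bin/logstash -f config/xxx.conf --config.reload.automatic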
After this, configure a Kibana dashboard, and you will get some cool charts.
P.S.
There are many kinds of input/filter/output plugins, such as jdbc/redis/kafka/mongodb; please refer to the official documentation.
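For instance, reading the same logs from Kafka instead of a local file only changes the input block. A minimal sketch (broker address and topic name are placeholders you need to fill in):
input {
  kafka {
    bootstrap_servers => "xxx:9092"   # Kafka broker list (placeholder)
    topics => ["xxx"]                 # topic carrying the raw log lines (placeholder)
    codec => plain {
      charset => "UTF-8"
    }
  }
}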
If you are familiar with the grok filter, you can also parse the logs as below:
filter {
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}-%{SPACE}\[.*\]%{SPACE}-%{SPACE}xxx:%{GREEDYDATA:xxx},%{SPACE}xxx:%{GREEDYDATA:xxx},%{SPACE}xxx:%{GREEDYDATA:xxx},%{SPACE}xxx:%{GREEDYDATA:xxx},%{SPACE}xxx:%{GREEDYDATA:xxx},%{SPACE}xxx:%{GREEDYDATA:xxx},%{SPACE}xxx:%{GREEDYDATA:xxx},%{SPACE}xxx:%{GREEDYDATA:xxx}"
    }
  }
}
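One more thing about grok: when a line does not match the pattern, Logstash does not drop the event, it just adds a _grokparsefailure tag. If you prefer not to index such events, you can exclude them in the output, roughly like this:
output {
  if "xxx" in [tags] and "_grokparsefailure" not in [tags] {
    elasticsearch {
      index => "xxx"
      hosts => ["http://xxx:9200"]
    }
  }
}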