Parsing nginx logs with Logstash
This post shows how to parse, filter, and transform nginx logs with Logstash.
The configuration is suitable for production. The architecture: Filebeat reads the nginx log and pushes it into Redis; Logstash then reads from Redis and processes the events.
The User-Agent string and the client IP are also parsed, which makes statistics easier.
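The json filter in the pipeline below assumes that nginx already writes its access log as JSON, with field names matching the filter (access_time, remote_addr, agent, body_bytes_sent, up_response_time, request_time). A minimal sketch of such a log_format follows; the format name and field names are assumptions inferred from the filter, not taken from the original setup, so adjust them to your environment:

# Hypothetical nginx log_format producing JSON that the Logstash json filter can parse.
# Field names are assumptions inferred from the filter config below.
# "escape=json" requires nginx 1.11.8 or newer.
log_format json_log escape=json '{'
    '"access_time":"$time_local",'
    '"remote_addr":"$remote_addr",'
    '"request":"$request",'
    '"status":"$status",'
    '"body_bytes_sent":"$body_bytes_sent",'
    '"request_time":"$request_time",'
    '"up_response_time":"$upstream_response_time",'
    '"agent":"$http_user_agent"'
  '}';

access_log /var/log/nginx/access.log json_log;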
input {
  # Read the events that Filebeat pushed into the Redis list "test"
  redis {
    host => "192.168.1.109"
    port => 6379
    db => "0"
    data_type => "list"
    key => "test"
  }
}
filter {
  # The nginx access log is written as JSON; parse it and drop the raw line
  json {
    source => "message"
    remove_field => "message"
  }
  # Break the User-Agent string into browser/OS fields, dropping the ones we do not need
  useragent {
    source => "agent"
    target => "agent"
    remove_field => ["[agent][build]","[agent][os_name]","[agent][device]","[agent][minor]","[agent][patch]"]
  }
  # Use the nginx access time as the event @timestamp
  date {
    match => ["access_time", "dd/MMM/yyyy:HH:mm:ss Z"]
  }
  # Drop Filebeat metadata and convert numeric fields to proper types
  mutate {
    remove_field => ["beat","host","prospector","@version","offset","input","source","access_time"]
    convert => {"body_bytes_sent" => "integer"}
    convert => {"up_response_time" => "float"}
    convert => {"request_time" => "float"}
  }
  # GeoIP lookup on the client IP; build [geoip][coordinates] as [lon, lat]
  # and drop the GeoIP fields we do not need
  geoip {
    source => "remote_addr"
    target => "geoip"
    remove_field => ["[geoip][country_code3]","[geoip][location]","[geoip][longitude]","[geoip][latitude]","[geoip][region_code]"]
    add_field => ["[geoip][coordinates]", "%{[geoip][longitude]}"]
    add_field => ["[geoip][coordinates]", "%{[geoip][latitude]}"]
  }
  # Make the coordinates numeric so they can be mapped as geo_point
  mutate {
    convert => ["[geoip][coordinates]","float"]
  }
}
output {
  # Only index events whose first tag is "newvp" (set in the Filebeat config below)
  if [tags][0] == "newvp" {
    elasticsearch {
      hosts => ["192.168.1.110:9200","192.168.1.111:9200","192.168.1.112:9200"]
      index => "%{type}-%{+YYYY.MM.dd}"
    }
    # stdout is only for debugging; remove it in production
    stdout {
      codec => rubydebug
    }
  }
}
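With dynamic mapping, [geoip][coordinates] is indexed as a pair of floats, which Kibana cannot use for map visualizations. If you want to plot client locations, the field needs an explicit geo_point mapping. A minimal sketch of a legacy index template is shown below; the template name is made up here, the index pattern is derived from the %{type}-%{+YYYY.MM.dd} index name above, and the syntax targets Elasticsearch 7.x:

PUT _template/newvp-geoip
{
  "index_patterns": ["newvp-*"],
  "mappings": {
    "properties": {
      "geoip": {
        "properties": {
          "coordinates": { "type": "geo_point" }
        }
      }
    }
  }
}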
The Filebeat configuration that reads the log and ships it to Redis:
filebeat.inputs:
- type: log
  paths:
    - /var/log/nginx/access.log
  tags: ["newvp"]          # matched by the conditional in the Logstash output
  fields:
    type: newvp            # becomes the top-level "type" field used in the index name
  fields_under_root: true

output.redis:
  hosts: ["192.168.1.109"]
  key: "test"
  datatype: list
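To check that events are actually reaching Redis before Logstash drains them, you can look at the length of the list (a quick sanity check, not part of the original configuration):

# length of the "test" list in db 0; usually close to 0 once Logstash is consuming
redis-cli -h 192.168.1.109 -n 0 llen test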