Writing logs to Kafka with Logstash and reading them back from Kafka
Collecting nginx logs into Kafka
First change the nginx log format to JSON: [changing the nginx log format](https://blog.51cto.com/9025736/2373483)
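The file input below assumes nginx is already emitting access logs as JSON (that is what makes `codec => json` work). A minimal sketch of such a `log_format` follows; the field names here are illustrative, not taken from the linked post:

```nginx
# Hypothetical JSON access-log format; pick whatever fields you need.
log_format json_access '{"@timestamp":"$time_iso8601",'
                       '"clientip":"$remote_addr",'
                       '"request":"$request",'
                       '"status":"$status",'
                       '"bytes":"$body_bytes_sent"}';

access_log /data/wwwlogs/access_nginx.log json_access;
```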
```
input {
  file {
    type => "nginx-access"
    path => "/data/wwwlogs/access_nginx.log"
    start_position => "beginning"
    codec => json
  }
  file {
    path => "/var/log/messages"
    start_position => "beginning"
    type => "system-log-252"
  }
}
output {
  if [type] == "nginx-access" {
    kafka {
      bootstrap_servers => "192.168.1.252:9092"   # Kafka broker address
      topic_id => "252nginx-accesslog"
      batch_size => 5
      codec => "json"   # write as JSON, since Logstash has already structured the event as JSON
    }
  }
  if [type] == "system-log-252" {
    kafka {
      bootstrap_servers => "192.168.1.252:9092"
      topic_id => "system-log-252"
      batch_size => 5
      codec => "json"   # write as JSON, since Logstash has already structured the event as JSON
    }
  }
}
```
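With `codec => "json"`, each Kafka message body is simply the serialized Logstash event. A rough sketch of the round trip a consumer sees (the field names are illustrative; the exact set depends on your inputs and filters):

```python
import json

# A hypothetical payload as the json codec might write it to the Kafka topic.
raw = json.dumps({
    "@timestamp": "2019-03-01T12:00:00.000Z",
    "type": "system-log-252",   # the type set by the file input survives the round trip
    "path": "/var/log/messages",
    "message": "Mar  1 12:00:00 host kernel: example line",
})

# What the reading side's json codec effectively does with each message.
event = json.loads(raw)
print(event["type"])  # -> system-log-252
```

Because the `type` field survives this round trip, the reading pipeline can still route events with `if [type] == ...` conditionals.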
Configuring Logstash to read the logs back from Kafka
```
input {
  kafka {
    bootstrap_servers => "192.168.1.252:9092"   # Kafka broker address
    topics => ["252nginx-accesslog"]   # topics takes an array
    codec => "json"   # the events were written with the json codec, so decode them the same way
    group_id => "252nginx-access-log"
    consumer_threads => 1
    decorate_events => true
  }
  kafka {
    bootstrap_servers => "192.168.1.252:9092"
    topics => ["system-log-252"]
    consumer_threads => 1
    decorate_events => true
    codec => "json"
  }
}
```
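`decorate_events => true` asks the plugin to attach Kafka metadata (topic, partition, offset, and so on) to each event under `[@metadata][kafka]`. As a sketch, assuming the standard metadata field layout, the output section could route on the topic name instead of the `type` field:

```
output {
  if [@metadata][kafka][topic] == "252nginx-accesslog" {
    elasticsearch {
      hosts => ["192.168.1.252:9200"]
      index => "252nginx-accesslog-%{+YYYY.MM.dd}"
    }
  }
}
```

Fields under `@metadata` are not written into the Elasticsearch document, which makes them convenient for routing.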
```
output {
  if [type] == "nginx-access" {   # the json codec preserves the type field set by the original file input
    elasticsearch {
      hosts => ["192.168.1.252:9200"]
      index => "252nginx-accesslog-%{+YYYY.MM.dd}"
    }
  }
  if [type] == "system-log-252" {
    elasticsearch {
      hosts => ["192.168.1.252:9200"]
      index => "system-log-252-%{+YYYY.MM.dd}"
    }
  }
}
```
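The `%{+YYYY.MM.dd}` in the index name is Logstash's date-sprintf syntax: it expands from the event's `@timestamp` (in UTC), so each day's events land in their own index. The expansion, mimicked in Python:

```python
from datetime import datetime, timezone

# Mimic Logstash's %{+YYYY.MM.dd} index-name expansion for a given event timestamp.
ts = datetime(2019, 3, 1, tzinfo=timezone.utc)
index = "252nginx-accesslog-" + ts.strftime("%Y.%m.%d")
print(index)  # -> 252nginx-accesslog-2019.03.01
```

Daily indices make it easy to expire old logs by deleting whole indices rather than individual documents.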