Filebeat: Collecting Logs and Writing to Kafka
Posted by 小怪獣55
Installing filebeat
root@ubuntu:/data# dpkg -i filebeat-6.8.1-amd64.deb
Collecting a single system's logs with filebeat
1) Test writing to a local file
root@ubuntu:/data# grep -Ev '^$|#' /etc/filebeat/filebeat.yml
--------------------------------------------------------------
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/nginx/*.log
# Test output: write events to a local file
output.file:
  path: "/tmp"
  filename: "filebeat.log"
--------------------------------------------------------------
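The file output writes each event as one JSON document per line in /tmp/filebeat.log, so you can verify the test output by parsing a line. A minimal sketch (the sample event below is illustrative, with hypothetical field values rather than captured output):

```python
import json

# Filebeat's file output emits one JSON event per line; this sample line is a
# hypothetical event for illustration, not real captured output.
sample_line = ('{"@timestamp":"2019-07-01T08:00:00.000Z",'
               '"message":"GET /index.html 200",'
               '"source":"/var/log/nginx/access.log"}')

event = json.loads(sample_line)
print(event["message"])  # the raw log line filebeat collected
print(event["source"])   # the file the line came from
```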
2) Writing to Kafka
root@ubuntu:~# grep -Ev '^#|^$' /etc/filebeat/filebeat.yml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/nginx/*.log
  document_type: "nginxlog-kafka"
  exclude_lines: ['^DBG']
  exclude_files: ['.gz$']
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 3
setup.kibana:
output.kafka:
  hosts: ["192.168.47.113:9092","192.168.47.112:9092","192.168.47.111:9092"]
  topic: "nginxlog-kafka"
  partition.round_robin:
    reachable_only: true
  required_acks: 1
  compression: gzip
  max_message_bytes: 1000000
processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
/usr/local/kafka/bin/kafka-topics.sh \
  --list \
  --zookeeper 192.168.47.111:2181,192.168.47.112:2181,192.168.47.113:2181
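Once the topic shows up in the listing, you can confirm that filebeat is actually producing events by reading a few messages back. A verification sketch, assuming the same broker address and topic name as the config above:

```shell
# Consume a handful of messages from the topic to confirm delivery
# (broker address and topic assumed from the filebeat config above).
/usr/local/kafka/bin/kafka-console-consumer.sh \
  --bootstrap-server 192.168.47.113:9092 \
  --topic nginxlog-kafka \
  --from-beginning \
  --max-messages 5
```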
Reading logs from Kafka into Elasticsearch with Logstash
input {
  kafka {
    bootstrap_servers => "192.168.47.113:9092"
    topics => ["nginxlog-kafka"]
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => ["192.168.47.106:9200"]
    index => "kafka-nginx-log-%{+YYYY.MM.dd}"
  }
}