Collecting logs from multiple directories with Filebeat
Posted 2020-07-30
Expected result: a separate index is created in Kibana for each log, so that each log's records can be queried through its own index.
(If the logs in the different directories have the same format and type, they can all be shipped to the same Logstash port; otherwise they should be kept separate, i.e. each log needs its own Filebeat config file plus its own Logstash config file.)
Paths of the target logs:
C:\testlog\test.log (example line: 2020-07-30 15:05:54 | INFO | This is a test log13, kkkk55!)
C:\testlog1\test1.log (example line: 2020-07-30 14:56:30,674 [112] DEBUG performanceTrace 1120 http://api.114995.com:8082/api/Carpool/QueryMatchRoutes 183.205.134.240 null 972533 310000 TITTL00 HUAWEI 860485038452951 3.1.146 HUAWEI 5.1 113.552344 33.332737 发送响应完成 Exception:(null))
(The two logs have different content formats.)
Filebeat config file: D:\Linux\data\ELK-Windows\elk\filebeat-7.8.0-windows-x86_64\filebeat.yml
Filebeat config file: D:\Linux\data\ELK-Windows\elk\filebeat-7.8.0-windows-x86_64\filebeat1.yml
(Copy the first filebeat.yml, rename it, and adjust its contents; a sketch of what the two files might contain follows below.)
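The post does not show the file contents, so the following is only a minimal sketch, assuming a plain log input per directory and one Logstash output per Filebeat instance; the ports (5044/5045) are assumptions, not values taken from the post.

filebeat.yml (first instance):
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\testlog\*.log
output.logstash:
  hosts: ["127.0.0.1:5044"]   # assumed port of the first Logstash pipeline

filebeat1.yml (second instance):
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\testlog1\*.log
output.logstash:
  hosts: ["127.0.0.1:5045"]   # assumed port of the second Logstash pipeline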
Logstash config file: D:\Linux\data\ELK-Windows\elk\logstash-7.8.0\config\logstash.yml
Logstash config file: D:\Linux\data\ELK-Windows\elk\logstash-7.8.0\config\logstash.yml (copy and modify it for the second instance; a pipeline sketch follows below)
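The pipeline definitions are not shown in the post either. A minimal sketch of what each pipeline (one .conf file per instance) might contain, assuming a Beats input and an Elasticsearch output; the ports, the Elasticsearch address, and the index names are assumptions:

input {
  beats {
    port => 5044                            # the second pipeline would listen on 5045
  }
}
output {
  elasticsearch {
    hosts => ["http://172.10.0.108:9200"]   # assumed Elasticsearch address
    index => "testlog-%{+YYYY.MM.dd}"       # assumed index name; use a different one in the second pipeline
  }
}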
Run everything from PowerShell (four PowerShell terminals in total: two Filebeat instances and two Logstash instances).
Data storage path: D:\Linux\data\ELK-Windows\elk\logstash-7.8.0\config
Data storage path for the second Logstash instance: D:\Linux\data\ELK-Windows\elk\logstash-7.8.0\config\logstash1 (create it yourself)
The default Filebeat data path is D:\Linux\data\ELK-Windows\elk\filebeat-7.8.0-windows-x86_64\data
The second Filebeat instance uses D:\Linux\data\ELK-Windows\elk\filebeat-7.8.0-windows-x86_64\data1
(Note: data1 does not exist by default and has to be created manually.)
(Each instance must be given its own data path, otherwise the second instance fails to start; see the startup commands sketched below.)
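A sketch of the startup commands for the four terminals, run from the respective Filebeat/Logstash directories; the pipeline file names and the -E/--path.data values are assumptions based on the paths above:

# Terminals 1 and 2: the two Filebeat instances
.\filebeat.exe -e -c filebeat.yml
.\filebeat.exe -e -c filebeat1.yml -E path.data=data1

# Terminals 3 and 4: the two Logstash instances
.\bin\logstash.bat -f .\config\logstash.conf
.\bin\logstash.bat -f .\config\logstash1.conf --path.data=.\config\logstash1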
1. Append a few new lines to each of the two log files and save them.
2. Open http://172.10.0.108:9100/ and check whether the new indexes have been created.
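The indexes can also be listed directly against Elasticsearch; the REST port (9200) is an assumption:

curl "http://172.10.0.108:9200/_cat/indices?v"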
Collecting Kubernetes logs with Filebeat via symlinks (use this method with a custom Docker data directory)
1. Installation
1) Download Filebeat
https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-8.4.0-linux-x86_64.tar.gz
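(If the server has direct internet access, the archive can also be fetched in place, for example:

wget https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-8.4.0-linux-x86_64.tar.gz)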
2) Upload it to the server and extract it
tar -zxvf filebeat-8.4.0-linux-x86_64.tar.gz
cd filebeat-8.4.0-linux-x86_64
2. Configuration
The log source is read through the Docker symlinks:
- enable symlink handling: symlinks: true
- symlink path: paths: /var/log/containers/*.log
Edit filebeat.yml:
# ============================== Filebeat inputs ===============================
filebeat.inputs:
- type: container
  symlinks: true                      # follow the symlinks under /var/log/containers
  containers.ids:
    - "*"
  id: my-filestream-id
  enabled: true
  paths:
    - /var/log/containers/*.log
    #- /data/docker/containers/*/*-json.log
  fields:                             # extra fields attached to every event
    cluster: cluster-dev
    topic: kafka_log_18603
  fields_under_root: true
  tail_files: true
  json.keys_under_root: true

# ============================== Filebeat modules ==============================
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

# ----------------------------- kafka ---------------------------------------
output.kafka:
  hosts: ["134.64.15.155:9092"]
  topic: kafka_log
  partition.round_robin:
    reachable_only: true

# ================================= Processors =================================
processors:
  - add_host_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

# ================================== Logging ===================================
logging.level: debug
3. Run
./filebeat -e -c filebeat.yml
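To keep the process running after the shell session ends, one common approach (not from the original post) is:

nohup ./filebeat -e -c filebeat.yml > filebeat.out 2>&1 &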
4. Kafka message format
Container log files under /var/log/containers are named podName_namespace_container_name-container_id.log (see the log.file.path field in the sample message below).
"@timestamp": "2022-12-22T09:56:01.658Z",
"@metadata":
"beat": "filebeat",
"type": "_doc",
"version": "8.4.0"
,
"host":
"ip": [
"134.64.15.155",
"fe80::da3c:2b52:e460:abb",
"fe80::7fdf:e64e:8e64:9a0d",
"fe80::7689:6ff9:71a3:df3b",
"172.17.0.1",
"fe80::42:84ff:fe2d:83d3",
"fe80::c491:23ff:fe46:38e3"
],
"mac": [
"00:50:56:93:37:ca",
"02:42:84:2d:83:d3",
"c6:91:23:46:38:e3"
],
"hostname": "ceph-admin",
"architecture": "x86_64",
"name": "ceph-admin",
"os":
"kernel": "3.10.0-957.el7.x86_64",
"codename": "Core",
"type": "linux",
"platform": "centos",
"version": "7 (Core)",
"family": "redhat",
"name": "CentOS Linux"
,
"id": "06ae05ec30744b22be02552a86fa12ef",
"containerized": false
,
"stream": "stderr",
"message": "I1222 09:56:01.657908 1 client.go:360] parsed scheme: \\"passthrough\\"",
"cluster": "cluster-dev",
"ecs":
"version": "8.0.0"
,
"log":
"offset": 27399663,
"file":
"path": "/var/log/containers/kube-apiserver-ceph-admin_kube-system_kube-apiserver-c18d1047a8c212dee3388fab593439a76e105ced40da81b080cb384522fa7d57.log"
,
"input":
"type": "container"
,
"topic": "kafka_log",
"agent":
"name": "ceph-admin",
"type": "filebeat",
"version": "8.4.0",
"ephemeral_id": "6f278819-f3ee-4391-a7ba-3ccb05d19649",
"id": "e9cc4627-57a6-44f9-ba7f-11816dc977a1"
5. Cluster log directories