Logstash logs are read but are not pushed to elasticsearch
Posted: 2019-12-03 12:10:20

Question: I have the following logstash configuration:
```
input {
  file {
    codec => "json_lines"
    path => ["/etc/logstash/input.log"]
    sincedb_path => "/etc/logstash/dbfile"
    start_position => "beginning"
    ignore_older => "0"
  }
}

output {
  elasticsearch {
    hosts => ["192.168.169.46:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
```
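As an aside, a pipeline file like this can be syntax-checked before chasing runtime problems; a minimal sketch, assuming the config is saved at /etc/logstash/conf.d/pipeline.conf:

```sh
# Parse the pipeline and exit without starting it (the config path is an assumption)
bin/logstash -f /etc/logstash/conf.d/pipeline.conf --config.test_and_exit
```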
The /etc/logstash/input.log file is populated with logs from a running Java application. The logs are written in the following JSON format, one document per line, delimited by \n characters:
"exception":
"exception_class": "java.lang.RuntimeException",
"exception_message": "Test runtime exception stack: 0",
"stacktrace": "java.lang.RuntimeException: Test runtime exception stack: 0"
,
"@version": 1,
"source_host": "WS-169-046",
"message": "Test runtime exception stack: 0",
"thread_name": "parallel-1",
"@timestamp": "2019-12-02T16:30:14.084+02:00",
"level": "ERROR",
"logger_name": "nl.hnf.logs.aggregator.demo.LoggingTest",
"aplication-name": "demo-log-aggregation"
I also updated the default logstash template through the elasticsearch API (PUT, with the request body below, to http://192.168.169.46:9200/_template/logstash?pretty):
"index_patterns": "logstash-*",
"version": 60002,
"settings":
"index.refresh_interval": "5s",
"number_of_shards": 1
,
"mappings":
"dynamic_templates": [
"message_field":
"path_match": "message",
"match_mapping_type": "string",
"mapping":
"type": "text",
"norms": false
,
"string_fields":
"match": "*",
"match_mapping_type": "string",
"mapping":
"type": "text",
"norms": false,
"fields":
"keyword":
"type": "keyword",
"ignore_above": 256
],
"properties":
"@timestamp":
"type": "date"
,
"@version":
"type": "keyword"
,
"source_host":
"type": "keyword"
,
"message":
"type": "text"
,
"thread_name":
"type": "text"
,
"level":
"type": "keyword"
,
"logger_name":
"type": "keyword"
,
"aplication_name":
"type": "keyword"
,
"exception":
"dynamic": true,
"properties":
"exception_class":
"type": "text"
,
"exception_message":
"type": "text"
,
"stacktrace":
"type": "text"
Elasticsearch responds with "acknowledged": true and I can see through the API that the template is updated. Now, starting logstash at the debug log level, I can see the input log being read, but nothing is sent to elasticsearch: the index is created, yet it stays empty (0 documents):
```
[2019-12-03T09:30:51,655][DEBUG][logstash.inputs.file ][custom] Received line {:path=>"/etc/logstash/input.log", :text=>"{\"@version\":1,\"source_host\":\"ubuntu\",\"message\":\"Generating some logs: 65778 - 2019-12-03T09:30:50.775\",\"thread_name\":\"parallel-1\",\"@timestamp\":\"2019-12-03T09:30:50.775+00:00\",\"level\":\"INFO\",\"logger_name\":\"nl.hnf.logs.aggregator.demo.LoggingTest\",\"aplication-name\":\"demo-log-aggregation\"}"}
[2019-12-03T09:30:51,656][DEBUG][filewatch.sincedbcollection][custom] writing sincedb (delta since last write = 1575365451)
```
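The empty index itself can be confirmed with the cat and count APIs; a minimal sketch, assuming the default logstash-* index naming:

```sh
# List logstash indices together with their document counts
curl "http://192.168.169.46:9200/_cat/indices/logstash-*?v"
# Count documents across all matching indices
curl "http://192.168.169.46:9200/logstash-*/_count?pretty"
```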
The elasticsearch logs are also at the debug level, but I see no errors or anything that would hint at the source of the problem. Do you have any ideas or suggestions as to why the logs are not pushed to elasticsearch?
Answer 1: Fixed the issue by using the json codec instead of json_lines and removing start_position, ignore_older, and sincedb_path:
```
input {
  file {
    codec => "json"
    path => ["/etc/logstash/input.log"]
  }
}

output {
  elasticsearch {
    hosts => ["192.168.169.46:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
```
Also, the json_lines codec appears to be incompatible with the file input (the \n delimiter did not work as expected).
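That matches the codec's documented intent: the file input is already line-oriented and hands the codec one complete line at a time, so json_lines, which does its own \n framing, is meant for raw stream inputs instead. A minimal sketch of where json_lines does fit, assuming a tcp input on an arbitrary port:

```
# json_lines frames a raw byte stream on \n; the port here is an assumption
input {
  tcp {
    port  => 5000
    codec => "json_lines"
  }
}
```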
Answer 2: In filebeat, setting ignore_older to zero means "do not check the age of the file". For the logstash file input it means "ignore files older than zero seconds", which is effectively "ignore everything". Remove it. If that does not help, raise the log level to trace and look at what the filewatch module says about the files it is watching.
Comments:
OK, enabled trace-level logging in logstash and removed the ignore_older setting; still the same problem. The filewatch module reads the logs as expected (as they grow), but nothing is logged about data being output (the output is detected and configured correctly by logstash).