How to parse my JSON file in Logstash using a grok pattern?


I am trying to parse a JSON file into Elasticsearch using Logstash, but I can't get it to work. I think I need to write a grok pattern, but I am not sure how. How can I send the JSON below to Elasticsearch with Logstash?

{
  "ComputerName": "test1",
  "Longdate": "2019-01-29 13:19:32",
  "Level": "ERROR",
  "mysite": "test1",
  "Message": "TEST2",
  "Exception": "TEST3",
  "timestamp": "2019-01-29T13:19:32.257Z"
}

My Logstash config file:


input {
  file {
    path => ["P:/logs/*.txt"]
    start_position => "beginning"
    discover_interval => 10
    stat_interval => 10
    sincedb_write_interval => 10
    close_older => 10
    codec => multiline {
      negate => true
      what => "previous"
    }
  }
}

filter {
  date {
    match => ["TimeStamp", "ISO8601"]
  }
  json {
    source => "request"
    target => "parsedJson"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "log-%{+YYYY.MM}"
  }
}
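Side note on the input section: the multiline codec has a required pattern setting, which is missing above (this is what the second error below complains about). Assuming each log entry starts with an opening brace on a new line, as in the sample event, the codec could be sketched like this; the "^{" pattern is an assumption about the file layout, not something stated in the original post:

```
input {
  file {
    path => ["P:/logs/*.txt"]
    start_position => "beginning"
    codec => multiline {
      # Lines that do NOT start with "{" belong to the previous event
      pattern => "^{"
      negate => true
      what => "previous"
    }
  }
}
```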



The error:

[2019-01-29T14:30:54,607][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-29T14:30:56,929][INFO ][logstash.runner] Starting Logstash {"logstash.version"=>"6.3.2"}
[2019-01-29T14:30:59,167][ERROR][logstash.agent] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, } at line 12, column 18 (byte 281) after input {\n file {\n\tpath => [\"P:/logs/*.txt\"]\n\t\tstart_position => \"beginning\"\n\t\tdiscover_interval => 10\n\t\tstat_interval => 10\n\t\tsincedb_write_interval => 10\n\t\tclose_older => 10\n\tcodec => multiline {\n\t\tpattern => \"^%{TIMESTAMP_ISO8601} \\\n\t\tnegate => true\n\t\twhat => \"", :backtrace=>["P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "P:/elk/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2019-01-29T14:31:00,417][INFO ][logstash.agent] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-29T14:34:23,554][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-29T14:34:24,554][INFO ][logstash.runner] Starting Logstash {"logstash.version"=>"6.3.2"}
[2019-01-29T14:34:27,486][ERROR][logstash.codecs.multiline] Missing a required setting for the multiline codec plugin:

  codec {
    multiline {
      pattern => # SETTING MISSING
      ...
    }
  }

[2019-01-29T14:34:27,502][ERROR][logstash.agent] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["P:/elk/logstash/logstash-core/lib/logstash/config/mixin.rb:89:in `config_init'", "P:/elk/logstash/logstash-core/lib/logstash/codecs/base.rb:19:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/plugins/plugin_factory.rb:97:in `plugin'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:110:in `plugin'", "(eval):8:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:82:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "P:/elk/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2019-01-29T14:34:27,971][INFO ][logstash.agent] Successfully started Logstash API endpoint {:port=>9600}
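Both errors in the log concern the multiline codec: the first comes from an unescaped quote in the pattern option, the second from omitting pattern entirely (it is a required setting). Separately, the filter section references fields that do not exist in the sample event: date looks for "TimeStamp" while the JSON carries "timestamp", and json reads from "request" although the file input puts each event into "message". A corrected filter, under the assumption that the whole JSON object lands in "message", might look like:

```
filter {
  # Parse the JSON first so its fields exist for the date filter
  json {
    source => "message"
  }
  # Then use the event's own timestamp as @timestamp
  date {
    match => ["timestamp", "ISO8601"]
  }
}
```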

Answer

You can try the json filter plugin for Logstash.

With it, the filter section of your Logstash config will parse the JSON:

filter {
  json {
    source => "message"
  }
}

Another nice thing is tag_on_failure. With it, if the JSON is invalid or cannot be parsed, you will still see the message in Elasticsearch/Kibana, but tagged with _jsonparsefailure.

filter {
  json {
    source => "message"
    tag_on_failure => [ "_jsonparsefailure" ]
  }
}
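The tag can then be used to route or inspect bad events. As a sketch, a conditional in the output section could send unparsable lines to a separate index (the "log-failures-%{+YYYY.MM}" index name here is just an example, not from the original answer):

```
output {
  if "_jsonparsefailure" in [tags] {
    # Events the json filter could not parse go to their own index
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "log-failures-%{+YYYY.MM}"
    }
  } else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "log-%{+YYYY.MM}"
    }
  }
}
```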

The above is the main content of "How to parse my JSON file in Logstash using a grok pattern?". If it did not solve your problem, see the related articles below.

Logstash: Grok pattern examples for log parsing

Logstash: Grok pattern examples

A Logstash grok pattern to monitor Logstash itself

Collecting nginx logs with Logstash: parsing them with the grok filter plugin