Flume: importing logs into Elasticsearch

Posted by cynchanpin


Flume configuration

The document structure Flume produces in Elasticsearch:

<span style="font-size:18px;">"_index" : "logstash-2013.01.07",
"_type" : "tms_jboss_syslog",
"_id" : "a_M9X_0YSpmE7A_bEzIFiw",
"_score" : 1.0, "_source" : {"@source":"file://localhost.localdomain/tmp/logstash_test.log","@tags":[],"@fields":{},"@timestamp":"2013-01-07T10:53:50.941Z","@source_host":"localhost.localdomain","@source_path":"/tmp/logstash_test.log","@message":"[2013-01-05 11:02:19,969] packBoxNumber eq 00004000000044043412 createdOffice eq VIP_BJ:;null","@type":"tms_jboss_syslog"}</span>

Flume configuration file:

agent.sources = tail
agent.channels = memoryChannel

# Channel
agent.channels.memoryChannel.type = memory

# Source: tail the log file with an exec source
agent.sources.tail.channels = memoryChannel
agent.sources.tail.type = exec
agent.sources.tail.command = tail -F /home/hadoop/flume/conf/es_log/es_log.log

# Interceptors: extract source/type/src_path headers from each line, then add timestamp and host headers
agent.sources.tail.interceptors = i1 i2 i3
agent.sources.tail.interceptors.i1.type = regex_extractor
agent.sources.tail.interceptors.i1.regex = (\\w.*):(\\w.*):(\\w.*)\\s
agent.sources.tail.interceptors.i1.serializers = s1 s2 s3
agent.sources.tail.interceptors.i1.serializers.s1.name = source
agent.sources.tail.interceptors.i1.serializers.s2.name = type
agent.sources.tail.interceptors.i1.serializers.s3.name = src_path
agent.sources.tail.interceptors.i2.type = org.apache.flume.interceptor.TimestampInterceptor$Builder
agent.sources.tail.interceptors.i3.type = org.apache.flume.interceptor.HostInterceptor$Builder
agent.sources.tail.interceptors.i3.hostHeader = host

# Sink: write events to Elasticsearch in Logstash format
agent.sinks = elasticsearch
agent.sinks.elasticsearch.channel = memoryChannel
agent.sinks.elasticsearch.type = org.apache.flume.sink.elasticsearch.ElasticSearchSink
agent.sinks.elasticsearch.batchSize = 100
agent.sinks.elasticsearch.hostNames = 127.0.0.1:9300
agent.sinks.elasticsearch.indexType = bar_type
agent.sinks.elasticsearch.indexName = logstash
agent.sinks.elasticsearch.clusterName = elasticsearch
agent.sinks.elasticsearch.serializer = org.apache.flume.sink.elasticsearch.ElasticSearchLogStashEventSerializer
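
Note that the ElasticSearchSink does not bundle the Elasticsearch client libraries: the elasticsearch and lucene-core jars matching your cluster version have to be on Flume's classpath before the agent starts. A minimal sketch (the paths and jar names below are placeholders for your environment):

# Copy the Elasticsearch client and Lucene core jars into Flume's lib directory.
# Versions and paths are placeholders; they must match the target ES cluster.
cp /path/to/elasticsearch-<version>.jar /path/to/lucene-core-<version>.jar $FLUME_HOME/lib/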
Start the agent:

 ../bin/flume-ng agent -c . -f es_log.conf -n agent  -Dflume.root.logger=INFO,console
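
As a quick sanity check (a minimal sketch, assuming Elasticsearch's HTTP API is listening on its default port 9200 on the same host), confirm the cluster is reachable before sending data:

curl -XGET 'http://127.0.0.1:9200/_cluster/health?pretty'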

Test data

website:weblog:login_page weblog data1
website:weblog:profile_page weblog data2
website:weblog:transaction_page weblog data3
website:weblog:docs_page weblog data4
syslog:syslog:sysloggroup syslog data1
syslog:syslog:sysloggroup syslog data2
syslog:syslog:sysloggroup syslog data3
syslog:syslog:sysloggroup syslog data4
syslog:syslog:sysloggroup syslog data5
syslog:syslog:sysloggroup syslog data6
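
Since the exec source tails /home/hadoop/flume/conf/es_log/es_log.log, one way to feed these test lines through the pipeline is simply to append them to that file, for example:

echo 'website:weblog:login_page weblog data1' >> /home/hadoop/flume/conf/es_log/es_log.log
echo 'syslog:syslog:sysloggroup syslog data1' >> /home/hadoop/flume/conf/es_log/es_log.log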

After this, the data imported through Flume is visible on the ES cluster.

From then on, whenever new lines are written to the log file, Flume picks them up, ships them to the ES cluster, and the index is updated in real time.


This makes the logs searchable in Elasticsearch in real time.
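
For example (again assuming the default HTTP port 9200; the sink writes to date-suffixed indexes such as logstash-2013.01.07, as shown above), a simple query across those indexes:

curl -XGET 'http://127.0.0.1:9200/logstash-*/_search?q=weblog&pretty'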






