logback kafkaAppender: sending logs to Kafka

Posted by 西木-Lee


Official repository: https://github.com/danielwegener/logback-kafka-appender

This article uses a Spring Boot project as its basis; for more details, refer to the official repository.

Pull in the required jars with Maven:

<dependency>
    <groupId>com.github.danielwegener</groupId>
    <artifactId>logback-kafka-appender</artifactId>
    <version>0.2.0-RC1</version>
</dependency>
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <!--<version>1.2.3</version>
    <scope>runtime</scope>-->
</dependency>
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-core</artifactId>
    <!--<version>1.2.3</version>-->
</dependency>

Configure logback-spring.xml by adding an appender node:

<appender name="kafkaAppender" class="com.github.danielwegener.logback.kafka.KafkaAppender">
    <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
        <pattern>%message %n</pattern>
        <charset>utf8</charset>
    </encoder>
    <topic>rmcloud-gateway-audit-log</topic>
    <keyingStrategy class="com.github.danielwegener.logback.kafka.keying.NoKeyKeyingStrategy"/>
    <deliveryStrategy class="com.github.danielwegener.logback.kafka.delivery.AsynchronousDeliveryStrategy"/>
    <producerConfig>bootstrap.servers=127.0.0.1:9092</producerConfig>
</appender>

<root level="INFO">
    <appender-ref ref="kafkaAppender"/>
</root>
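With the appender above attached to the root logger, every INFO-level event is serialized by the PatternLayoutEncoder ("%message %n", i.e. the raw message, a space, and a newline, UTF-8 encoded) before the bytes are handed to the Kafka producer. A rough, stdlib-only illustration of that encoding step (the class and method names here are hypothetical, not part of logback):

```java
import java.nio.charset.StandardCharsets;

public class PatternSketch {

    // mimics what PatternLayoutEncoder produces for the pattern "%message %n":
    // the message text, the literal space, then a line separator, as UTF-8 bytes
    static byte[] encode(String message) {
        return (message + " " + System.lineSeparator()).getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] payload = encode("{\"serviceId\":\"rmcloud-gateway\"}");
        System.out.println(new String(payload, StandardCharsets.UTF_8));
    }
}
```

Each such byte array becomes the value of one Kafka record on the configured topic.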

Define a custom regular Filter:

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.filter.Filter;
import ch.qos.logback.core.spi.FilterReply;
import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONObject;
import com.vcredit.rmcloud.gateway.bean.JsonResult;
import com.vcredit.rmcloud.gateway.bean.RmcloudConstant;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.StringUtils;

/**
 * Extends the logback Filter to filter rmcloud log messages before they
 * are sent to Kafka.
 *
 * @author lee
 * @date 2018/9/11
 */
@Slf4j
public class LogKafkaFilter extends Filter<ILoggingEvent> {

    @Override
    public FilterReply decide(ILoggingEvent iLoggingEvent) {
        String message = iLoggingEvent.getMessage();
        // Business logic below; adapt it to your own requirements.
        if (StringUtils.isNotBlank(message)) {
            JSONObject auditLog = JSON.parseObject(message);
            log.info("responseBody:" + auditLog.get("responseBody").toString());
            JsonResult jsonResult = JSON.parseObject(auditLog.get("responseBody").toString(), JsonResult.class);
            if (auditLog.get("serviceId").toString().startsWith(RmcloudConstant.SERVICE_ID_RMCLOUD_START)) {
                return FilterReply.ACCEPT;
            }
        }
        return FilterReply.DENY;
    }
}
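The decision logic above can be exercised on its own. The sketch below strips the fastjson and logback dependencies and keeps only the string-level check, so the class name, the `accept` method, and the `"rmcloud-"` prefix are illustrative assumptions, not the project's real constants:

```java
public class FilterLogicSketch {

    // hypothetical stand-in for RmcloudConstant.SERVICE_ID_RMCLOUD_START
    static final String SERVICE_ID_PREFIX = "rmcloud-";

    // returns true when the message carries a serviceId starting with the prefix,
    // mirroring the ACCEPT/DENY decision of LogKafkaFilter.decide()
    static boolean accept(String message) {
        if (message == null || message.trim().isEmpty()) {
            return false; // blank messages are denied
        }
        // crude extraction of "serviceId":"..." from a JSON-shaped message
        String marker = "\"serviceId\":\"";
        int idx = message.indexOf(marker);
        if (idx < 0) {
            return false;
        }
        String rest = message.substring(idx + marker.length());
        String serviceId = rest.substring(0, rest.indexOf('"'));
        return serviceId.startsWith(SERVICE_ID_PREFIX);
    }

    public static void main(String[] args) {
        System.out.println(accept("{\"serviceId\":\"rmcloud-gateway\",\"responseBody\":\"{}\"}"));
        System.out.println(accept("{\"serviceId\":\"other-service\"}"));
    }
}
```

In the real filter, fastjson does the parsing; the point is only that any event whose serviceId does not match is denied and never reaches Kafka.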

 

Add the custom Filter to the kafkaAppender:

<appender name="kafkaRmcloudAppender" class="com.github.danielwegener.logback.kafka.KafkaAppender">
    <filter class="com.xx.xx.xx.filter.LogKafkaFilter"/>
    <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
        <pattern>%message %n</pattern>
        <charset>utf8</charset>
    </encoder>
    <topic>rmcloud-gateway-audit-log</topic>
    <keyingStrategy class="com.github.danielwegener.logback.kafka.keying.NoKeyKeyingStrategy"/>
    <deliveryStrategy class="com.github.danielwegener.logback.kafka.delivery.AsynchronousDeliveryStrategy"/>
    <producerConfig>bootstrap.servers=${kafkaServers}</producerConfig>
</appender>
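The `${kafkaServers}` placeholder must resolve when logback starts. In a Spring Boot project, logback-spring.xml can bind it to an application property with `<springProperty>`; the source key `rmcloud.kafka.servers` below is a hypothetical example, not a name from the original project:

```xml
<!-- binds the kafkaServers variable to a Spring Environment property;
     the source key "rmcloud.kafka.servers" is an illustrative assumption -->
<springProperty scope="context" name="kafkaServers"
                source="rmcloud.kafka.servers"
                defaultValue="127.0.0.1:9092"/>
```

Place this element near the top of logback-spring.xml, before the appender that references the variable.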

This way, log messages we do not need are filtered out before they reach Kafka.

In addition, there is another KafkaAppender available on GitHub:

https://github.com/johnmpage/logback-kafka

 


