log4j:ERROR Error sending logging event to Log Analytics
I am trying to send Azure Databricks logs to an Azure Log Analytics workspace, following the steps in the GitHub documentation. The code I use in the Databricks notebook is:
import com.microsoft.pnp.util.TryWith
import com.microsoft.pnp.logging.Log4jConfiguration
import java.io.ByteArrayInputStream
import org.slf4j.LoggerFactory
import org.slf4j.Logger
val loggerName :String = "fromNotebook"
val level : String = "INFO"
val logType: String = "HerculesDataBricksUAT"
val log4jConfig = s"""
log4j.appender.logAnalytics=com.microsoft.pnp.logging.loganalytics.LogAnalyticsAppender
log4j.appender.logAnalytics.layout=com.microsoft.pnp.logging.JSONLayout
log4j.appender.logAnalytics.layout.LocationInfo=false
log4j.appender.logAnalytics.logType=$logType
log4j.additivity.$loggerName=false
log4j.logger.$loggerName=$level, logAnalytics
"""
TryWith(new ByteArrayInputStream(log4jConfig.getBytes())) {
  stream => {
    Log4jConfiguration.configure(stream)
  }
}
val logger = LoggerFactory.getLogger(loggerName);
logger.info("logging info from " + loggerName)
logger.warn("Warn message " + loggerName)
logger.error("Error message " + loggerName)
The appender in my /home/ubuntu/databricks/spark/dbconf/log4j/executor/log4j.properties looks like this:
log4j.rootCategory=INFO, console, logAnalyticsAppender
# logAnalytics
log4j.appender.logAnalyticsAppender=com.microsoft.pnp.logging.loganalytics.LogAnalyticsAppender
log4j.appender.logAnalyticsAppender.filter.spark=com.microsoft.pnp.logging.SparkPropertyEnricher
#Disable all other logs
log4j.appender.logAnalyticsAppender.Threshold=INFO
But it behaves strangely for me. It works fine at the INFO level, but if I try to log anything below or above the level declared in the configuration, it throws the error below. After that, no matter what I change in my code, it only works again after I restart the cluster. Once the error is thrown, my cluster's performance is also affected, and sometimes this code even runs indefinitely.
The error I get is:
> log4j:ERROR Error sending logging event to Log Analytics
> java.util.concurrent.RejectedExecutionException: Task com.microsoft.pnp.client.loganalytics.LogAnalyticsSendBufferTask@5b2a430 rejected from java.util.concurrent.ThreadPoolExecutor@699636fd[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 112]
> at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
> at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
> at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
> at com.microsoft.pnp.client.GenericSendBuffer.send(GenericSendBuffer.java:88)
> at com.microsoft.pnp.client.loganalytics.LogAnalyticsSendBufferClient.sendMessage(LogAnalyticsSendBufferClient.java:43)
> at com.microsoft.pnp.logging.loganalytics.LogAnalyticsAppender.append(LogAnalyticsAppender.java:52)
> at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
> at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
> at org.apache.log4j.Category.callAppenders(Category.java:206)
> at org.apache.log4j.Category.forcedLog(Category.java:391)
> at org.apache.log4j.Category.log(Category.java:856)
> at org.slf4j.impl.Log4jLoggerAdapter.info(Log4jLoggerAdapter.java:305)
> at log4jWrapper.MyLogger.info(MyLogger.scala:48)
> at line07d51ea7c1834afc957316967b0d0e8225.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-764707897465587:10)
> at line07d51ea7c1834afc957316967b0d0e8225.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-764707897465587:63)
> at line07d51ea7c1834afc957316967b0d0e8225.$read$$iw$$iw$$iw$$iw.<init>(command-764707897465587:65)
> at line07d51ea7c1834afc957316967b0d0e8225.$read$$iw$$iw$$iw.<init>(command-764707897465587:67)
> at line07d51ea7c1834afc957316967b0d0e8225.$read$$iw$$iw.<init>(command-764707897465587:69)
> at line07d51ea7c1834afc957316967b0d0e8225.$read$$iw.<init>(command-764707897465587:71)
> at line07d51ea7c1834afc957316967b0d0e8225.$read.<init>(command-764707897465587:73)
> at line07d51ea7c1834afc957316967b0d0e8225.$read$.<init>(command-764707897465587:77)
> at line07d51ea7c1834afc957316967b0d0e8225.$read$.<clinit>(command-764707897465587)
> at line07d51ea7c1834afc957316967b0d0e8225.$eval$.$print$lzycompute(<notebook>:7)
> at line07d51ea7c1834afc957316967b0d0e8225.$eval$.$print(<notebook>:6)
> at line07d51ea7c1834afc957316967b0d0e8225.$eval.$print(<notebook>)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
> at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
> at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
> at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
> at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
> at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
> at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
> at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
> at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
> at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
> at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:202)
> at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
> at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
> at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:685)
> at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:638)
> at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
> at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:373)
> at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:350)
> at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
> at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)
> at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:48)
> at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:271)
> at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:48)
> at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:350)
> at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
> at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
> at scala.util.Try$.apply(Try.scala:192)
> at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)
> at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)
> at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)
> at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)
> at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
> at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
> at java.lang.Thread.run(Thread.java:748)
Answer
I found the solution. In the Databricks notebook environment, if you run the following code more than once for the same logger name, it throws the error above, because the configuration, once applied, is global and can only be removed by restarting the cluster.
TryWith(new ByteArrayInputStream(log4jConfig.getBytes())) {
  stream => {
    Log4jConfiguration.configure(stream)
  }
}
The solution I found is:
import org.apache.log4j.LogManager
if (LogManager.exists(loggerName) == null) {
  TryWith(new ByteArrayInputStream(log4jConfig.getBytes())) {
    stream => {
      Log4jConfiguration.configure(stream)
    }
  }
}
This prevents the logger from being configured a second time.
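For reference, here is a minimal sketch (not from the original post) of how this guard can be wrapped into a reusable helper so that the configuration is applied at most once per logger name, no matter how often the cell runs. The object name NotebookLogging, the method getOrCreateLogAnalyticsLogger, and the default logType value (reused from the question) are illustrative assumptions, not part of the library.
import java.io.ByteArrayInputStream
import org.apache.log4j.LogManager
import org.slf4j.{Logger, LoggerFactory}
import com.microsoft.pnp.util.TryWith
import com.microsoft.pnp.logging.Log4jConfiguration

// Hypothetical helper: pushes the Log Analytics appender configuration only
// the first time a given logger name is requested, then returns an SLF4J logger.
object NotebookLogging {
  def getOrCreateLogAnalyticsLogger(loggerName: String,
                                    level: String = "INFO",
                                    logType: String = "HerculesDataBricksUAT"): Logger = {
    // LogManager.exists returns null when no logger with this name has been
    // created yet, so the configuration is applied exactly once per name.
    if (LogManager.exists(loggerName) == null) {
      val log4jConfig = s"""
        |log4j.appender.logAnalytics=com.microsoft.pnp.logging.loganalytics.LogAnalyticsAppender
        |log4j.appender.logAnalytics.layout=com.microsoft.pnp.logging.JSONLayout
        |log4j.appender.logAnalytics.layout.LocationInfo=false
        |log4j.appender.logAnalytics.logType=$logType
        |log4j.additivity.$loggerName=false
        |log4j.logger.$loggerName=$level, logAnalytics
        |""".stripMargin
      TryWith(new ByteArrayInputStream(log4jConfig.getBytes())) { stream =>
        Log4jConfiguration.configure(stream)
      }
    }
    LoggerFactory.getLogger(loggerName)
  }
}

// Usage from a notebook cell: repeated runs reuse the existing configuration
// instead of re-configuring log4j and triggering the RejectedExecutionException.
val logger = NotebookLogging.getOrCreateLogAnalyticsLogger("fromNotebook")
logger.info("logging info from fromNotebook")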