
Spark forcing log4j

I have a trivial Spark project in Scala and would like to use logback, but Spark/Hadoop seems to be forcing log4j on me.

  1. This seems at odds with my understanding of the purpose of slf4j; isn't it an oversight in Spark/Hadoop?

  2. Do I have to give up on logback and use log4j, or is there a workaround?

In build.sbt, I tried the exclusions...

"org.apache.spark" %% "spark-core" % "1.4.1" excludeAll(
    ExclusionRule(name = "log4j"),
    ExclusionRule(name = "slf4j-log4j12")
),
"org.slf4j" % "slf4j-api" % "1.7.12",
"ch.qos.logback" % "logback-core" % "1.1.3",
"ch.qos.logback" % "logback-classic" % "1.1.3"

...but this results in an exception...

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Level
    at org.apache.hadoop.mapred.JobConf.<clinit>(JobConf.java:354)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:344)
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1659)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:91)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:55)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:182)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:235)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:214)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:669)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:571)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2162)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2162)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2162)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:301)
    at spike.HelloSpark$.main(HelloSpark.scala:19)
    at spike.HelloSpark.main(HelloSpark.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Level
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 20 more

I ran into the same exception as you did. I think that, in addition to excluding log4j and slf4j-log4j12, you should also add log4j-over-slf4j as a dependency. That worked for me.
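Putting the suggestion together with the dependencies from the question, the relevant part of build.sbt might look like the sketch below. The log4j-over-slf4j version shown is an assumption; it should generally be kept in step with the slf4j-api version.

```scala
// build.sbt -- a sketch combining the question's exclusions with the
// log4j-over-slf4j bridge suggested in this answer.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.1" excludeAll(
    ExclusionRule(name = "log4j"),
    ExclusionRule(name = "slf4j-log4j12")
  ),
  "org.slf4j" % "slf4j-api" % "1.7.12",
  // The bridge: re-implements the log4j API on top of slf4j, so Hadoop's
  // calls to org.apache.log4j.* no longer fail with NoClassDefFoundError.
  "org.slf4j" % "log4j-over-slf4j" % "1.7.12",
  "ch.qos.logback" % "logback-core" % "1.1.3",
  "ch.qos.logback" % "logback-classic" % "1.1.3"
)
```

With this in place, both your own slf4j logging calls and Hadoop's internal log4j calls end up routed through slf4j to logback.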

log4j-over-slf4j is a drop-in replacement for log4j: it exposes exactly the same API as log4j, and it actually routes all calls to log4j on to slf4j, which in turn routes everything to the underlying logging framework. https://www.slf4j.org/legacy.html gives a detailed explanation.


