Unable to create local Spark session during Scala test: log4j:ERROR Could not create an Appender
While trying to create a local Spark session in a Scala test using FunSuite, the following error occurs:
log4j:ERROR Could not create an Appender. Reported error follows.
java.lang.ClassNotFoundException: com.microsoft.applicationinsights.log4j.v1_2.ApplicationInsightsAppender
    at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.log4j.helpers.Loader.loadClass(Loader.java:198)
    at org.apache.log4j.xml.DOMConfigurator.parseAppender(DOMConfigurator.java:247)
    at org.apache.log4j.xml.DOMConfigurator.findAppenderByName(DOMConfigurator.java:176)
    at org.apache.log4j.xml.DOMConfigurator.findAppenderByReference(DOMConfigurator.java:191)
    at org.apache.log4j.xml.DOMConfigurator.parseChildrenOfLoggerElement(DOMConfigurator.java:523)
    at org.apache.log4j.xml.DOMConfigurator.parseRoot(DOMConfigurator.java:492)
    at org.apache.log4j.xml.DOMConfigurator.parse(DOMConfigurator.java:1006)
    at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:872)
    at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:778)
    at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
    at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
    at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
    at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
    at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
    at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
    at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:417)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:362)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:388)
    at org.apache.spark.network.util.JavaUtils.<clinit>(JavaUtils.java:41)
    at org.apache.spark.internal.config.ConfigHelpers$.byteFromString(ConfigBuilder.scala:67)
    at org.apache.spark.internal.config.ConfigBuilder.$anonfun$bytesConf$1(ConfigBuilder.scala:259)
    at org.apache.spark.internal.config.ConfigBuilder.$anonfun$bytesConf$1$adapted(ConfigBuilder.scala:259)
    at org.apache.spark.internal.config.TypedConfigBuilder.$anonfun$transform$1(ConfigBuilder.scala:101)
    at org.apache.spark.internal.config.TypedConfigBuilder.createWithDefault(ConfigBuilder.scala:144)
    at org.apache.spark.internal.config.package$.<init>(package.scala:345)
    at org.apache.spark.internal.config.package$.<clinit>(package.scala)
    at org.apache.spark.SparkConf$.<init>(SparkConf.scala:654)
    at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
    at org.apache.spark.SparkConf.set(SparkConf.scala:94)
    at org.apache.spark.SparkConf.set(SparkConf.scala:83)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:916)
    at scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:149)
    at scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:237)
    at scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:44)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:149)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:916)
    at com.random.SampleSpec.<init>(SampleSpec.scala:9)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at java.lang.Class.newInstance(Class.java:442)
    at org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:66)
    at org.scalatest.tools.DiscoverySuite.$anonfun$nestedSuites$1(DiscoverySuite.scala:38)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at scala.collection.IterableLike.foreach(IterableLike.scala:74)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
    at scala.collection.TraversableLike.map(TraversableLike.scala:286)
    at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
    at scala.collection.AbstractTraversable.map(Traversable.scala:108)
    at org.scalatest.tools.DiscoverySuite.<init>(DiscoverySuite.scala:37)
    at org.scalatest.tools.Runner$.genDiscoSuites$1(Runner.scala:1128)
    at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1224)
    at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
    at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
    at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1480)
    at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
    at org.scalatest.tools.Runner$.main(Runner.scala:775)
    at org.scalatest.tools.Runner.main(Runner.scala)
Below are the code snippet and the test configuration used in the pom:
import org.apache.log4j.Logger
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.scalatest.BeforeAndAfter
import org.scalatest.funsuite.AnyFunSuite

class SampleSpec extends AnyFunSuite with BeforeAndAfter {

  // Logger backing the info call in the before block
  val logger: Logger = Logger.getLogger(getClass)

  val spark: SparkSession = SparkSession.builder()
    .master("local[*]")
    .appName("SampleTest")
    .getOrCreate()

  spark.conf.set("spark.sql.crossJoin.enabled", "true")
  spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")

  before {
    spark.sparkContext.setLogLevel("ERROR")
    logger.info("Before")
  }
}
<!-- Test -->
<!-- https://mvnrepository.com/artifact/org.scalatest/scalatest -->
<dependency>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest_${scala.version.major}</artifactId>
    <version>3.3.0-SNAP3</version>
    <scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.scalacheck/scalacheck -->
<dependency>
    <groupId>org.scalacheck</groupId>
    <artifactId>scalacheck_${scala.version.major}</artifactId>
    <version>1.15.4</version>
    <scope>test</scope>
</dependency>

<plugin>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest-maven-plugin</artifactId>
    <version>2.0.0</version>
    <configuration>
        <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
        <junitxml>.</junitxml>
        <filereports>WDF TestSuite.txt</filereports>
    </configuration>
    <executions>
        <execution>
            <id>test</id>
            <goals>
                <goal>test</goal>
            </goals>
        </execution>
    </executions>
</plugin>
I am running this on IntelliJ 2021.2.3.
The same code works in another, older module. I tried copying almost everything over from that old code, but the problem persists. Invalidating caches and restarting, building and rebuilding the module, and adding framework support for Scala have all had no effect.
The locally installed Spark and the Spark version in the pom match (3.1.3), and the Scala version is 2.12.
显然您的 Log4J 配置文件 (log4j.xml) 指的是 com.microsoft.applicationinsights.log4j.v1_2.ApplicationInsightsAppender 作为依赖项:这不是附加程序
工件:applicationinsights-logging-log4j1_2
组: com.microsoft.azure
版本:0.9.0
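For reference, a minimal sketch of the pom entry that would put the appender class on the classpath (the placement among your dependencies is an assumption; keep the version in line with any other Application Insights artifacts you use):

<!-- Supplies com.microsoft.applicationinsights.log4j.v1_2.ApplicationInsightsAppender -->
<dependency>
    <groupId>com.microsoft.azure</groupId>
    <artifactId>applicationinsights-logging-log4j1_2</artifactId>
    <version>0.9.0</version>
</dependency>

Alternatively, if Application Insights logging is not wanted during tests, removing the corresponding <appender> element and its <appender-ref> from log4j.xml makes the error go away without adding the dependency.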