
ERROR SparkContext: Error initializing SparkContext - Java + Eclipse + Spark

I'm just starting out with Spark. I'm trying some example projects, and I'm now working on a project that reads from a CSV file. The problem comes when I run the app. The Eclipse console shows the following error:

18/09/19 05:00:48 ERROR MetricsSystem: Sink class org.apache.spark.metrics.sink.MetricsServlet cannot be instantiated
18/09/19 05:00:48 ERROR SparkContext: Error initializing SparkContext.
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:200)
    at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:194)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:194)
    at org.apache.spark.metrics.MetricsSystem.start(MetricsSystem.scala:102)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:522)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
    at com.mycsv.app.CSVFileAnalysisInSparkSQL.main(CSVFileAnalysisInSparkSQL.java:28)
Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.annotation.JsonFormat$Value.empty()Lcom/fasterxml/jackson/annotation/JsonFormat$Value;
    at com.fasterxml.jackson.databind.cfg.MapperConfig.<clinit>(MapperConfig.java:50)
    at com.fasterxml.jackson.databind.ObjectMapper.<init>(ObjectMapper.java:543)
    at com.fasterxml.jackson.databind.ObjectMapper.<init>(ObjectMapper.java:460)
    at org.apache.spark.metrics.sink.MetricsServlet.<init>(MetricsServlet.scala:48)
    ... 20 more

My project code is the following:

final SparkSession sparkSession = SparkSession.builder().appName("Spark CSV").master("local[5]").getOrCreate();

What could be the problem with the SparkContext? Thank you and regards.

The error is caused by a missing or mismatched Jackson annotations dependency: the `NoSuchMethodError` on `JsonFormat$Value.empty()` means the `jackson-annotations` jar on your classpath is older than the `jackson-databind` jar that is calling it. Check your project build and make sure all Jackson artifacts resolve to one consistent version.
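One way to enforce that in a Maven build is a `dependencyManagement` block pinning the three core Jackson artifacts to a single version. This is a sketch, not taken from the original post: the version shown (2.6.7) is an assumption based on the Jackson line bundled with Spark 2.x, so adjust it to whatever your Spark distribution actually ships.

```xml
<!-- Hypothetical pom.xml fragment: force all core Jackson artifacts to one
     consistent version so jackson-databind and jackson-annotations cannot
     diverge. 2.6.7 is assumed here for Spark 2.x; check your Spark jars. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>2.6.7</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-annotations</artifactId>
      <version>2.6.7</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.6.7</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

To find out which transitive dependency is dragging in the conflicting version in the first place, `mvn dependency:tree -Dincludes=com.fasterxml.jackson.core` will show every Jackson artifact and where it comes from.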


