
Integration of Spark and Kafka: exception when submitting a jar with spark-submit

When submitting a jar that integrates Kafka with Spark on Ubuntu, I get a NullPointerException. I am trying to run the example code at https://github.com/apache/spark/tree/v2.1.1/examples

I checked whether installing Spark on Ubuntu requires HADOOP_HOME to be set; HADOOP_HOME is not set, and I have double-checked the arguments passed to the jar.

./bin/spark-submit --class "org.apache.spark.examples.streaming.JavaKafkaWordCount" --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.1.0 --master local[*] --jars ~/software/JavaKafkaWordCount.jar localhost:2181 test-consumer-group streams-plaintext-input 1

    Exception in thread "main" java.lang.NullPointerException
        at org.apache.hadoop.fs.Path.getName(Path.java:337)
        at org.apache.spark.deploy.DependencyUtils$.downloadFile(DependencyUtils.scala:136)
        at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:367)
        at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala)
        at scala.Option.map(Option.scala:146)
        at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:366)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:143)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Spark cannot make sense of your jar path URI; see DependencyUtils.scala#L136:

 /**
   * Download a file from the remote to a local temporary directory. If the input path points to
   * a local path, returns it with no operation.
   *
   * @param path A file path from where the files will be downloaded.
   * @param targetDir A temporary directory for which downloaded files.
   * @param sparkConf Spark configuration.
   * @param hadoopConf Hadoop configuration.
   * @param secMgr Spark security manager.
   * @return Path to the local file.
   */
  def downloadFile(
      path: String,
      targetDir: File,
      sparkConf: SparkConf,
      hadoopConf: Configuration,
      secMgr: SecurityManager): String = {
    require(path != null, "path cannot be null.")
    val uri = Utils.resolveURI(path)

    uri.getScheme match {
      case "file" | "local" => path
      case "http" | "https" | "ftp" if Utils.isTesting =>
        // This is only used for SparkSubmitSuite unit test. Instead of downloading file remotely,
        // return a dummy local path instead.
        val file = new File(uri.getPath)
        new File(targetDir, file.getName).toURI.toString
      case _ =>
        val fname = new Path(uri).getName()
        val localFile = Utils.doFetchFile(uri.toString(), targetDir, fname, sparkConf, secMgr,
          hadoopConf)
        localFile.toURI().toString()
    }
  }
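One way such an NPE can arise (a minimal sketch, not necessarily the exact failure here): a string that is not a plain local path can parse as an opaque URI whose path component is null, and building a Hadoop Path from it and calling getName then dereferences that null path. The "localhost:2181" string below is just an illustrative example of an opaque URI, reusing the first program argument from the command above.

```scala
import java.net.URI

// Minimal sketch (assumption): how different argument strings parse as URIs.
object UriPathDemo {
  def main(args: Array[String]): Unit = {
    // A plain absolute path has no scheme; Spark's Utils.resolveURI then falls
    // back to a file:// URI, which downloadFile's "file" | "local" case
    // returns untouched.
    val local = new URI("/home/user/software/JavaKafkaWordCount.jar")
    assert(local.getScheme == null)

    // A string such as "localhost:2181" parses as an *opaque* URI:
    // scheme "localhost", null path. Feeding a URI like this into
    // org.apache.hadoop.fs.Path.getName is one way to hit a
    // NullPointerException in the "case _" download branch.
    val opaque = new URI("localhost:2181")
    assert(opaque.getScheme == "localhost")
    assert(opaque.getPath == null)

    println("URI parsing checks passed")
  }
}
```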

When running spark-submit, change the argument like this:

--jars /fullpath/JavaKafkaWordCount.jar instead of --jars ~/software/JavaKafkaWordCount.jar
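If you prefer not to hard-code the absolute path, one option (a sketch, using the same jar location as above) is to build it from $HOME in the shell, so a literal "~" can never reach Spark's URI handling:

```shell
#!/bin/sh
# Sketch: construct the absolute jar path via $HOME instead of writing "~",
# which is only expanded by the shell and only in unquoted positions.
JAR_PATH="$HOME/software/JavaKafkaWordCount.jar"

# Then submit with the resolved path (same command as above, --jars swapped):
#   ./bin/spark-submit ... --jars "$JAR_PATH" localhost:2181 test-consumer-group ...
echo "$JAR_PATH"
```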
