
Spark Job Error writing JSON Data to MongoDB Local

I am trying out a simple program that reads a JSON file and writes it to a local MongoDB instance; I have MongoDB installed.

I wrote this program in Spark:

    import com.mongodb.spark.MongoSpark
    import org.apache.spark.sql.{DataFrame, SparkSession}

    object Main extends App {

      val sparkSession = SparkSession.builder().appName("SparkMongo")
        .master("local[*]")
        .config("spark.mongodb.input.uri", "mongodb://localhost:27017/product.styles")
        .config("spark.mongodb.output.uri", "mongodb://localhost:27017/product.styles")
        .getOrCreate()

      // Read the JSON file as a DataFrame and write it out through the connector
      val data: DataFrame = sparkSession.read.option("multiline", "true").json("/home/sandeep/style.json")
      MongoSpark.save(data)
    }

But when I run it with spark-submit, I get the following error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
        at com.mongodb.spark.config.MongoCompanionConfig.getOptionsFromConf(MongoCompanionConfig.scala:233)
        at com.mongodb.spark.config.MongoCompanionConfig.getOptionsFromConf$(MongoCompanionConfig.scala:232)
        at com.mongodb.spark.config.WriteConfig$.getOptionsFromConf(WriteConfig.scala:37)
        at com.mongodb.spark.config.MongoCompanionConfig.apply(MongoCompanionConfig.scala:113)
        at com.mongodb.spark.config.MongoCompanionConfig.apply$(MongoCompanionConfig.scala:112)
        at com.mongodb.spark.config.WriteConfig$.apply(WriteConfig.scala:37)
        at com.mongodb.spark.config.MongoCompanionConfig.apply(MongoCompanionConfig.scala:100)
        at com.mongodb.spark.config.MongoCompanionConfig.apply$(MongoCompanionConfig.scala:100)
        at com.mongodb.spark.config.WriteConfig$.apply(WriteConfig.scala:37)
        at com.mongodb.spark.MongoSpark$.save(MongoSpark.scala:138)
        at Main$.delayedEndpoint$Main$1(Main.scala:15)
        at Main$delayedInit$body.apply(Main.scala:5)
        at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
        at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
        at scala.App$class.main(App.scala:76)
        at Main$.main(Main.scala:5)
        at Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I am not sure what I am doing wrong here.
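For what it's worth, a `NoSuchMethodError` on `scala.Predef$.refArrayOps` typically indicates a Scala binary-version mismatch: the `mongo-spark-connector` artifact on the classpath was built against a different Scala version than the one your Spark distribution runs on. A build.sbt sketch that keeps the versions aligned (the exact version numbers below are assumptions; match `scalaVersion` to the Scala version your installed Spark reports, e.g. on `spark-shell` startup):

```scala
// Sketch of a build.sbt keeping Scala binary versions aligned.
// Versions are assumptions -- Spark 2.4.x is typically built against
// Scala 2.11; the %% operator appends the matching _2.11 suffix to both
// the spark-sql and mongo-spark-connector artifact names automatically.
name := "SparkMongo"

scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-sql"             % "2.4.4" % "provided",
  "org.mongodb.spark" %% "mongo-spark-connector" % "2.4.1"
)
```

Using `%%` rather than hard-coding a `_2.11` or `_2.12` suffix is the usual way to avoid this class of error, since sbt then resolves the connector built for the same Scala version as the project itself.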

Don't you need to specify a port for the MongoDB server?

