What is the right way to set the classpath for Spark in YARN cluster mode?
I have a Spark-Cassandra connector application; the config part of the code is:
val conf = new SparkConf(true).setAppName("Some Name")
.set("spark.cassandra.connection.host", "127.0.0.1")
.set("spark.executor.extraClassPath", "/absolute_path_to/my.jar")
val sc = new SparkContext("spark://127.0.0.1:7077", "App", conf)
And I submit with:
spark-submit --class com.data.MyApp --master yarn --deploy-mode cluster \
--executor-cores 2 --num-executors 2 --executor-memory 4G \
--jars /absolute_path_to/my.jar ./target/scala-2.10/ds-spark-assembly-1.0.jar
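(For reference, the classpath setting does not have to be hardcoded in the application: spark-submit accepts arbitrary Spark properties via --conf, so adding --conf spark.executor.extraClassPath=/absolute_path_to/my.jar to the command above should have the same effect.)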
I CAN make it work. But can I use a relative path in my code for spark.executor.extraClassPath? If I can, what is the path relative to on the cluster nodes?
Thanks
I made it work as follows:
val conf = new SparkConf(true).setAppName("Some Name")
.set("spark.cassandra.connection.host", "127.0.0.1")
.setJars(Seq("my.jar"))
val sc = new SparkContext("spark://127.0.0.1:7077", "App", conf)
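For context, SparkConf.setJars is a thin wrapper over the spark.jars property: every jar listed there is shipped to the executors and added to their classpaths, which is why neither --jars nor spark.executor.extraClassPath is needed here. A minimal equivalent sketch using the raw property (same jar name as above):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf(true).setAppName("Some Name")
  .set("spark.cassandra.connection.host", "127.0.0.1")
  // spark.jars is the property behind setJars; Spark distributes each
  // listed jar to every executor and puts it on the executor classpath
  .set("spark.jars", "my.jar")
val sc = new SparkContext("spark://127.0.0.1:7077", "App", conf)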
And I don't need to pass the --jars option to spark-submit.
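As for the relative-path question itself: in YARN cluster mode, YARN localizes everything passed with --jars into each executor container's working directory, so my understanding is that a bare file name in spark.executor.extraClassPath resolves against that working directory on every node. A hedged sketch of that assumption (it still requires --jars /absolute_path_to/my.jar in the spark-submit command, and is worth verifying against your Spark version):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf(true).setAppName("Some Name")
  .set("spark.cassandra.connection.host", "127.0.0.1")
  // assumption: --jars ships my.jar into each YARN container's working
  // directory, so this relative classpath entry resolves there
  .set("spark.executor.extraClassPath", "my.jar")
val sc = new SparkContext("spark://127.0.0.1:7077", "App", conf)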