
What is the right way to set the classpath for Spark in YARN cluster mode?

I have a Spark-Cassandra connector application; the configuration part of the code is:

val conf = new SparkConf(true).setAppName("Some Name")
    .set("spark.cassandra.connection.host", "127.0.0.1")
    .set("spark.executor.extraClassPath", "/absolute_path_to/my.jar")
val sc = new SparkContext("spark://127.0.0.1:7077", "App", conf)

And I submit with:

spark-submit --class com.data.MyApp --master yarn --deploy-mode cluster \
--executor-cores 2 --num-executors 2 --executor-memory 4G \
--jars /absolute_path_to/my.jar ./target/scala-2.10/ds-spark-assembly-1.0.jar
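As a hedged alternative sketch, the same executor classpath entry can be passed at submit time with `--conf` instead of being hard-coded in `SparkConf` (this assumes the jar sits at the same absolute path on every node, as the in-code version already does):

```shell
# Sketch: move spark.executor.extraClassPath out of the code and onto the
# command line; assumes /absolute_path_to/my.jar exists on every node.
spark-submit --class com.data.MyApp --master yarn --deploy-mode cluster \
  --executor-cores 2 --num-executors 2 --executor-memory 4G \
  --conf spark.executor.extraClassPath=/absolute_path_to/my.jar \
  --jars /absolute_path_to/my.jar \
  ./target/scala-2.10/ds-spark-assembly-1.0.jar
```

Keeping deployment details on the command line also avoids baking cluster-specific paths into the application jar.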

I CAN make this work. But can I use a relative path in my code for spark.executor.extraClassPath? If so, what is that path relative to on the cluster nodes?

Thanks

I made it work like this:

val conf = new SparkConf(true).setAppName("Some Name")
    .set("spark.cassandra.connection.host", "127.0.0.1")
    .setJars(Seq("my.jar"))
val sc = new SparkContext("spark://127.0.0.1:7077", "App", conf)

With this, I don't need the --jars option in spark-submit.
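On the relative-path part of the question, here is a hedged sketch. It rests on the assumption that YARN localizes files shipped via `--jars` into each container's working directory, which is also the directory that a relative `spark.executor.extraClassPath` entry would resolve against:

```shell
# Sketch: --jars uploads my.jar, and YARN localization places it in every
# container's working directory, so a bare relative name in extraClassPath
# can resolve there. This relies on standard YARN localization behavior;
# verify against your cluster before depending on it.
spark-submit --class com.data.MyApp --master yarn --deploy-mode cluster \
  --conf spark.executor.extraClassPath=my.jar \
  --jars /absolute_path_to/my.jar \
  ./target/scala-2.10/ds-spark-assembly-1.0.jar
```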
