
How can I programmatically find Spark version in an executor node?

I'm trying to write a method (that will be run on an executor) that returns the Spark version as a string. I know that I can find the Spark version with the following code:

SparkSession.builder().getOrCreate().version (this works even on an executor)

But when I run the tests (pre-existing tests in the Apache Spark source code, written before mine), some of them fail with the following error:

Caused by: java.lang.IllegalStateException: SparkSession should only be created and accessed on the driver.

So I understand that I can't use SparkSession. My question is therefore: is there any other way to find the Spark version on an executor?

I solved my problem by importing SPARK_VERSION directly:

import org.apache.spark.SPARK_VERSION
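A minimal sketch of this approach: `org.apache.spark.SPARK_VERSION` is a plain `String` constant in the Spark package object, so referencing it inside a task closure does not create or touch a `SparkSession` on the executor. The object and job names below are made up for illustration.

```scala
import org.apache.spark.SPARK_VERSION
import org.apache.spark.sql.SparkSession

// Hypothetical job demonstrating that SPARK_VERSION is safe to read
// inside executor code, unlike SparkSession-based lookups.
object VersionOnExecutor {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[2]").getOrCreate()

    val versions = spark.sparkContext
      .parallelize(1 to 4, numSlices = 2)
      .map(_ => SPARK_VERSION) // evaluated on the executor; no SparkSession needed
      .distinct()
      .collect()

    println(versions.mkString(",")) // e.g. a single version string

    spark.stop()
  }
}
```

Note that `SPARK_VERSION` is the version of the Spark jars on the executor's classpath, which in a well-configured cluster matches the driver's version.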

Another option was to pass the version through a configuration value that was already available in my class.
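A sketch of that alternative, under the assumption that the driver reads the version up front: since `spark.version` is just a `String`, it can be captured in the task closure (or shipped via any existing config object) and used freely on executors without touching `SparkSession` there.

```scala
import org.apache.spark.sql.SparkSession

// Read the version on the driver, then capture it in the closure.
val spark = SparkSession.builder().getOrCreate()
val driverVersion: String = spark.version // accessed on the driver only

spark.sparkContext.parallelize(Seq(1, 2, 3)).foreach { _ =>
  // driverVersion is a plain String serialized with the closure,
  // so using it here never touches SparkSession on the executor.
  println(s"Running on Spark $driverVersion")
}
```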
