How can I programmatically find the Spark version on an executor node?
I'm trying to write a method (that will be run on an executor) that returns the Spark version as a string. I know that I can find the Spark version with the following code:
SparkSession.builder().getOrCreate().version

(This works even when called on an executor.)
But when I run tests (tests in the Apache Spark source code that were written before my change), some of them fail with the following error:
Caused by: java.lang.IllegalStateException: SparkSession should only be created and accessed on the driver.
So I understand that I can't use SparkSession there. My question is: is there any other way to find the Spark version on an executor?
I solved my problem by importing SPARK_VERSION directly:
import org.apache.spark.SPARK_VERSION
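For reference, here is a minimal sketch of how the constant can be used from code that runs on executors (the helper object and method name are my own, not from the original post). SPARK_VERSION is a plain String constant defined in the org.apache.spark package object of spark-core, so referencing it does not touch any driver-side state:

```scala
import org.apache.spark.SPARK_VERSION

// Hypothetical helper: safe to call from executor-side code because
// SPARK_VERSION is just a String constant compiled into spark-core,
// not tied to a SparkSession or SparkContext.
object VersionUtil {
  def sparkVersion: String = SPARK_VERSION
}
```

Something like rdd.map(_ => VersionUtil.sparkVersion) then works on executors without ever creating a SparkSession there.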
There was also the option of passing the version through a configuration object that was already available in my class.
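The idea behind that alternative could be sketched like this (hypothetical names, assuming a driver-side SparkSession): read spark.version once on the driver and let Spark serialize the plain String into the task closure, so executors never need a SparkSession of their own:

```scala
import org.apache.spark.sql.SparkSession

object ClosureVersionExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").getOrCreate()

    // Read the version once, on the driver, where SparkSession is legal.
    val driverVersion: String = spark.version

    // The String is captured in the closure and shipped to executors;
    // no SparkSession is created or accessed on the executor side.
    val seen = spark.sparkContext
      .parallelize(1 to 3)
      .map(_ => driverVersion)
      .collect()

    seen.foreach(println)
    spark.stop()
  }
}
```

The same pattern works for any small driver-side value: anything captured in the closure must be serializable, and a String trivially is.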