
How to check Spark configuration from command line?

Basically, I want to check a property of Spark's configuration, such as "spark.local.dir", from the command line, that is, without writing a program. Is there a way to do this?

There is no built-in option for viewing the Spark configuration properties from the command line.

Instead, you can check them in the spark-defaults.conf file. Another option is to view them in the web UI.
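For reference, entries in spark-defaults.conf are whitespace-separated key/value pairs, one per line; the values below are purely illustrative, not defaults:

 # $SPARK_HOME/conf/spark-defaults.conf -- sample entries, values are illustrative
 spark.master           spark://master:7077
 spark.local.dir        /tmp/spark-scratch
 spark.executor.memory  4g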

The application web UI at http://driverIP:4040 lists Spark properties in the “Environment” tab. Only values explicitly specified through spark-defaults.conf, SparkConf, or the command line will appear. For all other configuration properties, you can assume the default value is used.

For more details, you can refer to the Spark Configuration documentation.

The following command prints your conf properties on the console:

 // run inside spark-shell; sc is the SparkContext created by the shell
 sc.getConf.toDebugString
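If you only need one key rather than the whole dump, the same SparkConf object can be queried directly; a minimal sketch, where the property name is just an example:

 // fetch a single property; throws NoSuchElementException if it was never set
 sc.getConf.get("spark.local.dir")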

We can check it in the Spark shell using the command below:

scala> spark.conf.get("spark.sql.shuffle.partitions")
res33: String = 200
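To list everything the session knows about rather than a single key, the same spark.conf handle also exposes getAll; a minimal spark-shell sketch:

 scala> // getAll returns a Map[String, String] of the session's settings
 scala> spark.conf.getAll.foreach { case (k, v) => println(s"$k=$v") }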

Based on http://spark.apache.org/docs/latest/configuration.html, Spark provides three locations to configure the system:

  • Spark properties control most application parameters and can be set by using a SparkConf object, or through Java system properties (a short SparkConf sketch follows this list).
  • Environment variables can be used to set per-machine settings, such as the IP address, through the conf/spark-env.sh script on each node.
  • Logging can be configured through log4j.properties.
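As a concrete illustration of the first bullet, a minimal Scala sketch that sets a property on a SparkConf and reads it back; the app name and directory value are hypothetical:

 import org.apache.spark.SparkConf

 // build a config in code; keys and values here are illustrative only
 val conf = new SparkConf()
   .setAppName("conf-demo")
   .set("spark.local.dir", "/tmp/spark-scratch")

 // get returns the value, or throws NoSuchElementException if the key is unset
 println(conf.get("spark.local.dir"))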

I haven't heard of a method to do this from the command line.

The main command to check the Spark configuration from the CLI:

# PySpark: sc._conf is the shell's underlying SparkConf; getAll() returns (key, value) pairs
sc._conf.getAll()
