How to check Spark configuration from command line?
Basically, I want to check a property of Spark's configuration, such as "spark.local.dir", through the command line, that is, without writing a program. Is there a method to do this?
There is no option to view the Spark configuration properties from the command line.
Instead, you can check them in the spark-defaults.conf file. Another option is to view them in the web UI.
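For example, a quick way to look up a single property is to grep the defaults file. The sketch below creates a sample file under /tmp so it is self-contained; in a real installation you would grep $SPARK_HOME/conf/spark-defaults.conf instead, and the property values shown are illustrative:

```shell
# Create a sample spark-defaults.conf so this sketch is self-contained;
# in a real deployment, grep $SPARK_HOME/conf/spark-defaults.conf instead.
mkdir -p /tmp/spark-demo/conf
cat > /tmp/spark-demo/conf/spark-defaults.conf <<'EOF'
spark.local.dir     /data/spark-tmp
spark.serializer    org.apache.spark.serializer.KryoSerializer
EOF

# Look up a single property by name (dots escaped so grep treats them literally):
grep '^spark\.local\.dir' /tmp/spark-demo/conf/spark-defaults.conf
```

Note that this only shows values that were explicitly set in the file; properties left at their defaults will not appear there.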
The application web UI at http://driverIP:4040 lists Spark properties in the "Environment" tab. Only values explicitly specified through spark-defaults.conf, SparkConf, or the command line will appear.
For all other configuration properties, you can assume the default value is used.
For more details, you can refer to Spark Configuration.
The following command prints your configuration properties to the console:
sc.getConf.toDebugString
We can check it in the Spark shell using the command below:
scala> spark.conf.get("spark.sql.shuffle.partitions")
res33: String = 200
Based on http://spark.apache.org/docs/latest/configuration.html, Spark provides three locations to configure the system:
Spark properties control most application parameters and can be set by using a SparkConf object, or through Java system properties.
Environment variables can be used to set per-machine settings, such as the IP address, through the conf/spark-env.sh script on each node.
Logging can be configured through log4j.properties.
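As a sketch of the second mechanism, a conf/spark-env.sh fragment might set per-machine values like this (the addresses and paths below are illustrative, not defaults):

```shell
# conf/spark-env.sh -- sourced on each node when Spark daemons start.
# Values below are illustrative examples.
export SPARK_LOCAL_IP=192.168.1.10       # IP address this node binds to
export SPARK_LOCAL_DIRS=/data/spark-tmp  # scratch space; overrides spark.local.dir
```

Because this is a plain shell script, it is also something you can inspect from the command line (e.g. with cat or grep) when you want to see what a node was configured with.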
I haven't heard of a method that works purely from the command line.
The main command to check the Spark configuration from the CLI (this form uses the PySpark shell):
sc._conf.getAll()