How to load and write a properties file in a Spark Scala job?
I have a Spark job and I need to read configuration from a file "config.properties" in this format:
var1=1
var2=12/10/2021
At the end of the process I need to update var1 and var2, so I have to overwrite the "config.properties" file. How can I do this?
This code would be part of the driver, so you can write it as you would in any Java/Scala app that reads a configuration file, whether in properties format or JSON.
What you need to keep in mind:
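As a minimal sketch of the read-then-overwrite cycle in the driver, using `java.util.Properties` (which reads exactly the `key=value` format shown above): the file name matches the question, but the updated values are illustrative, and the first block only seeds a sample file so the snippet runs standalone.

```scala
import java.io.{FileInputStream, FileOutputStream}
import java.util.Properties

// Seed a sample config.properties so this sketch is self-contained;
// in the real job the file already exists on the driver's filesystem.
val seed = new Properties()
seed.setProperty("var1", "1")
seed.setProperty("var2", "12/10/2021")
val seedOut = new FileOutputStream("config.properties")
try seed.store(seedOut, null) finally seedOut.close()

// --- driver code: read the current values ---
val props = new Properties()
val in = new FileInputStream("config.properties")
try props.load(in) finally in.close()
val var1 = props.getProperty("var1")   // "1"
val var2 = props.getProperty("var2")   // "12/10/2021"

// ... run the Spark processing that produces the new values ...

// --- driver code: overwrite the file with the updated values ---
// (the new values below are placeholders for whatever the job computes)
props.setProperty("var1", (var1.toInt + 1).toString)
props.setProperty("var2", "13/10/2021")
val out = new FileOutputStream("config.properties")
try props.store(out, "updated at end of job") finally out.close()
```

Note that this only works when "config.properties" lives on the driver's local filesystem; if the file is on HDFS or another distributed store, you would read and write it through the Hadoop FileSystem API instead.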