
How to load and write a properties file in a Spark Scala job?

I have a Spark job and I need to read configuration values from a file "config.properties" in this format:

var1=1
var2=12/10/2021

At the end of the process I need to update var1 and var2, so I have to overwrite the "config.properties" file. How can I do this?

This code would be part of the driver, so you write it as you would any Java/Scala app reading a configuration file, whether in the properties format or JSON.
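
Here is a minimal sketch in Scala, assuming the file sits on the driver's local file system; the path, the object name, and the new values are placeholders:

import java.io.{FileInputStream, FileOutputStream}
import java.util.Properties

object ConfigExample {
  // Assumed location of the file on the driver; adjust as needed.
  val configPath = "config.properties"

  def main(args: Array[String]): Unit = {
    // Load the existing properties.
    val props = new Properties()
    val in = new FileInputStream(configPath)
    try props.load(in) finally in.close()

    val var1 = props.getProperty("var1") // e.g. "1"
    val var2 = props.getProperty("var2") // e.g. "12/10/2021"

    // ... run the Spark job and compute the new values ...

    // Update the values and overwrite the same file at the end.
    props.setProperty("var1", (var1.toInt + 1).toString)
    props.setProperty("var2", var2) // write back whatever new date you computed
    val out = new FileOutputStream(configPath)
    try props.store(out, "updated by the Spark driver") finally out.close()
  }
}

Note that Properties.store does not preserve the original ordering or comments of the file; if that matters, write the lines out yourself.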

What you need to keep in mind:

  • when you run in local mode (creating your session with setMaster("local")) or client mode (pointing the master at a known cluster), the driver runs on your machine. This means the driver will access your local file system, so make sure the user running the app has the rights to read and write the file.
  • when in cluster mode, and you submit your application via spark-submit or a similar tool, you do not control where the driver runs and may not be able to access a local file on the cluster. In this scenario, depending on your infrastructure, you may want to point to cloud storage (S3 or equivalent), a network mount (SMB, NFS…), or a virtual drive (Google Drive, ownCloud, Dropbox…), as in the sketch below.
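
For the cluster-mode case, one hedged sketch is to go through Hadoop's FileSystem API instead of java.io, so the same code works against HDFS, S3 (via s3a://), or any store the cluster can reach; the bucket and path below are assumptions:

import java.util.Properties
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.sql.SparkSession

object SharedConfigExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("config-demo").getOrCreate()
    // Assumed location on shared storage; replace with your own URI.
    val path = new Path("s3a://my-bucket/config.properties")
    val fs = path.getFileSystem(spark.sparkContext.hadoopConfiguration)

    // Load the properties from the shared store.
    val props = new Properties()
    val in = fs.open(path)
    try props.load(in) finally in.close()

    // ... run the job and update the values ...

    // Overwrite the file in place.
    val out = fs.create(path, true) // true = overwrite if it exists
    try props.store(out, "updated by the Spark driver") finally out.close()
  }
}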
