
spark-shell: load Scala config using Typesafe Config

I am trying to load application.conf in spark-shell using typesafe-config.

Following are the code and the spark-shell commands:

Scala code:

import com.typesafe.config.{Config, ConfigFactory}

val config: Config = ConfigFactory.load("application.conf")
val env = config.getString("key.key1")

spark-shell commands:

  • spark-shell --jars config-1.3.4.jar --files application.conf --driver-java-options -Dconfig.file=application.conf (attempt 1)
  • spark-shell --jars config-1.3.4.jar (attempt 2)

application.conf:

key {

 key1 = "value1"

}

error:

com.typesafe.config.ConfigException$Missing: system properties: No configuration setting found for key 'key'
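The "system properties" wording in the error is the clue: when application.conf is not found on the classpath, ConfigFactory.load() falls back to system properties and JVM defaults, so the key lookup fails. A minimal sketch of the same failure, using an empty config parsed from a string as a stand-in for the missing file:

```scala
import com.typesafe.config.{ConfigException, ConfigFactory}

// An empty config behaves like load() with no application.conf on the
// classpath: looking up "key.key1" throws ConfigException.Missing.
val empty = ConfigFactory.parseString("")
val missing =
  try { empty.getString("key.key1"); false }
  catch { case _: ConfigException.Missing => true }
```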

Where is your conf file present? Did you try using the full path from the root? Note that ConfigFactory.load() resolves a classpath resource name, not a filesystem path, so for an absolute path use ConfigFactory.parseFile. Eg:

val config: Config = ConfigFactory.parseFile(new File("/<root-path>/application.conf"))

You can pass the file to each executor with --files "application.conf" and read it with ConfigFactory.parseFile().

You can get the path of a file passed via --files with SparkFiles.get:

import com.typesafe.config.ConfigFactory
import java.io.File
import org.apache.spark.SparkFiles

val config = ConfigFactory.parseFile(new File(SparkFiles.get("application.conf")))

This should load the config you want.
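Putting the pieces together, here is a self-contained sketch of the parseFile approach. The temp file is a stand-in for the path SparkFiles.get("application.conf") would return, and its content matches the application.conf above:

```scala
import com.typesafe.config.ConfigFactory
import java.nio.file.Files

// Stand-in for SparkFiles.get("application.conf"): write the same HOCON
// content to a temp file, then parse it exactly as in the answer above.
val tmp = Files.createTempFile("application", ".conf").toFile
Files.write(tmp.toPath, "key {\n  key1 = \"value1\"\n}\n".getBytes("UTF-8"))

val config = ConfigFactory.parseFile(tmp)
val env = config.getString("key.key1")  // "value1"
```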
