lightbend config: pass dynamic values in application.properties file
I am using Lightbend Config in my Scala code so that my application.properties
file lives outside my jar. I need to pass the current date in the application.properties
file:
Date_read_write = 20190828
How do I make this pick up the value dynamically? If it were a shell script I could write
Date_read_write=`date +%Y%m%d`
How do I do a similar step in the application.properties file?
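If regenerating the file before each run is acceptable, the shell-script idea from the question can be applied directly: a small launcher rewrites the Date_read_write line before starting the app. A minimal sketch, where the file contents and key names are illustrative:

```shell
# Illustrative sample file; in practice this is the real application.properties.
printf 'mediaKey = 1234\nDate_read_write = 20190815\n' > application.properties

# Rewrite the date line with today's date before launching the app.
RUN_DATE=$(date +%Y%m%d)
sed -i "s/^Date_read_write.*/Date_read_write = ${RUN_DATE}/" application.properties

cat application.properties
```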
Update regarding the time value:
Below is my application.properties file:
hdfs_link = hdfs://XX.XXX.XXX.XX:8080/sessions/data/events
mediaKey = 1234
eventName = media
Date_read_write = 20190815
time = *_W
parts = *
I am using the values above to generate
hdfs://XX.XXX.XXX.XX:8080/sessions/data/events/1234/media/20190815/*_W/*
using:
val conf = ConfigFactory.load()
val hdfs_path = conf.getString("hdfs_link") + "/" +
  conf.getString("mediaKey") + "/" +
  conf.getString("eventName") + "/" +
  conf.getString("Date_read_write") + "/" +
  conf.getString("time") + "/" +
  conf.getString("parts")
But when I add
--driver-java-options -DDate_read_write=`date --date='1 days ago' '+%Y%m%d'`
to my spark-submit command, my URL turns into
hdfs://XX.XXX.XXX.XX:8080/sessions/data/events/1234/media/20190828/*/*
and I have no clue why this addition changes the value of time.
Below is my spark-submit command:
nohup spark-submit --master yarn --deploy-mode client --num-executors 20 --executor-cores 6 --driver-memory 10g --executor-memory 10g --class metrics.MasterAggregateTable --files application.properties --driver-java-options -Dconfig.file=application.properties --driver-java-options -DDate_read_write=`date +%Y%m%d` --jars com.datastax.spark_spark-cassandra-connector_2.11-2.3.0.jar UserMetrics.jar &
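A likely cause of the changed time value, assuming spark-submit treats --driver-java-options as a single-valued flag: passing the flag twice keeps only the last occurrence, silently dropping -Dconfig.file=application.properties and therefore changing which config file gets loaded. A sketch of combining both settings into one quoted value:

```shell
# Yesterday's date via GNU date, e.g. 20190828.
RUN_DATE=$(date --date='1 day ago' +%Y%m%d)

# Put BOTH -D settings into a single quoted --driver-java-options value.
DRIVER_OPTS="-Dconfig.file=application.properties -DDate_read_write=${RUN_DATE}"
echo "$DRIVER_OPTS"

# Pass it as one argument; the remaining flags stay as in the original command:
#   spark-submit ... --driver-java-options "$DRIVER_OPTS" ... UserMetrics.jar
```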
You can override all the settings in application.properties
with system properties:
java -DDate_read_write=`date +%Y%m%d` -jar yourapp.jar
In default uses of the library, exact-match system properties already override the corresponding config properties.
This does not require any changes to the configuration file inside the jar.
You could consider using an environment variable for that. Just add environment-variable substitution in your application.properties file:
date = ${DATE}
Then when you run your app, just set the DATE variable:
DATE=`date +%Y%m%d` java -jar yourapp.jar
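If the file is parsed with HOCON syntax (e.g. loaded as application.conf; an assumption worth verifying, since plain java.util.Properties parsing does not support substitutions), the substitution can be made optional so a default survives when DATE is unset:

```
# HOCON syntax; file content is illustrative
date = "19700101"   # default used when the DATE env var is not set
date = ${?DATE}     # overrides the default only when DATE is set
```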