
Flink requires a local path for the Hive conf directory, but how can we provide that path when submitting a Flink job on YARN?

According to https://ci.apache.org/projects/flink/flink-docs-stable/dev/table/hive/#connecting-to-hive, Flink requires a local Hive conf directory path. However, I need to submit the Flink job on YARN, so Flink tries to resolve the path inside the YARN container, e.g. /mnt/volume4/yarn/nm/usercache/akashkumar.patel/appcache/application_1594626939821_80078/container_e83_1594626939821_80078_01_000002/hdfs:/warehousestore/hive/warehouse/db/hive_conf/hive-site.xml

How can we handle this? I just need to create a partition for the Hive table. Is there any way to point the Hive conf directory at an HDFS folder location?

You can download hive-site.xml from the remote HDFS location and place it locally inside the container before creating the HiveCatalog.
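
A minimal sketch of that idea, assuming the job runs inside the YARN container with HDFS on the classpath: it copies hive-site.xml from an HDFS path (hypothetical, replace with your own) into a local temp directory, registers a HiveCatalog pointing at that directory, and then adds the partition. The table, database, and partition names are placeholders.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.SqlDialect;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.nio.file.Files;

public class HiveConfFromHdfs {
    public static void main(String[] args) throws Exception {
        // Hypothetical HDFS location of hive-site.xml; replace with your own path.
        String hdfsHiveSite = "hdfs:///warehousestore/hive/hive_conf/hive-site.xml";

        // Create a local temp dir inside the container and copy hive-site.xml into it.
        java.nio.file.Path localConfDir = Files.createTempDirectory("hive-conf");
        Configuration hadoopConf = new Configuration();
        FileSystem fs = new Path(hdfsHiveSite).getFileSystem(hadoopConf);
        fs.copyToLocalFile(new Path(hdfsHiveSite),
                new Path(localConfDir.resolve("hive-site.xml").toString()));

        // Register a HiveCatalog that points at the now-local conf directory.
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inBatchMode().build());
        HiveCatalog hive = new HiveCatalog("myhive", "default", localConfDir.toString());
        tEnv.registerCatalog("myhive", hive);
        tEnv.useCatalog("myhive");

        // The original goal: add a partition to an existing Hive table.
        // Hive-style ADD PARTITION DDL generally requires the Hive SQL dialect,
        // depending on your Flink version.
        tEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
        tEnv.executeSql(
                "ALTER TABLE db.hive_table ADD IF NOT EXISTS PARTITION (dt='2020-07-20')");
    }
}
```

Alternatively, you may be able to avoid the copy step entirely by shipping the conf directory with the job submission (for example `flink run -m yarn-cluster -yt /local/path/to/hive_conf ...`), so YARN distributes it into each container's working directory; check the options supported by your Flink version.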

