How to create a directory dynamically in HDFS if it doesn't exist, using PySpark, and set file and directory permissions as well
I am new to Hadoop. Can we create a directory in Hadoop dynamically?
Currently I am using the command below:
hadoop fs -mkdir -p /data/test1/test2/test3/
and setting the file permissions with the command below:
hdfs dfs -chmod -R 777 /data/test1/test2/test3/t_bill_sheet.csv
By dynamically I mean a {year} folder, and inside it, folders iterated by date, like 5, 6, 7, etc.
Thanks in advance.
You can define a bash variable (or compute it from the current date if you want) and then reuse it over and over:
YEAR=2000
MONTH=03
DAY=01
TARGET="/data/$YEAR/$MONTH/$DAY"   # don't name this PATH: that would shadow the shell's command search path
hadoop fs -mkdir -p "$TARGET"
hdfs dfs -chmod -R 777 "$TARGET/t_bill_sheet.csv"
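To compute the variables from the current date instead of hard-coding them, a minimal sketch (the variable names and the `/data` base path are illustrative, not from the question):

```shell
# Derive today's partition components with date(1); %m and %d are zero-padded
YEAR=$(date +%Y)
MONTH=$(date +%m)
DAY=$(date +%d)
HDFS_DIR="/data/$YEAR/$MONTH/$DAY"
echo "$HDFS_DIR"
# hadoop fs -mkdir -p "$HDFS_DIR"          # then create it as above
# hdfs dfs -chmod -R 777 "$HDFS_DIR"       # and set permissions
```

The `hadoop` calls are commented out so the path logic can be checked without a running cluster.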
You can do it in PySpark using a combination of the exists() and mkdirs() methods, as below:
fs = spark._jvm.org.apache.hadoop.fs.FileSystem.get(spark._jsc.hadoopConfiguration())
path = spark._jvm.org.apache.hadoop.fs.Path("/data/test1/test2/test3")
permission = spark._jvm.org.apache.hadoop.fs.permission.FsPermission("777")
if not fs.exists(path):  # returns True or False
    fs.mkdirs(path, permission)
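To cover the "dynamic {year}/date" part of the question, one way is to generate the partition paths in plain Python and feed each one to fs.mkdirs() in a loop. A minimal sketch of the path-building step (the function name and the `/data` base are illustrative assumptions):

```python
from datetime import date, timedelta

def partition_paths(base, start, end):
    """Yield one base/YYYY/MM/DD path per day in [start, end], zero-padded."""
    d = start
    while d <= end:
        yield f"{base}/{d.year}/{d.month:02d}/{d.day:02d}"
        d += timedelta(days=1)

# for p in partition_paths("/data", date(2000, 3, 1), date(2000, 3, 3)):
#     path = spark._jvm.org.apache.hadoop.fs.Path(p)
#     if not fs.exists(path):
#         fs.mkdirs(path, permission)
```

The Spark calls are commented out here because they need a live session; `fs` and `permission` are the objects created in the snippet above.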