
Writing/appending a text file from Databricks to Azure ADLS Gen1

I want to write a kind of log file back to Azure ADLS Gen1. I can write (but not append) using

dbutils.fs.put(filename,"random text")

but I can't append to it using

with open("/dbfs/mnt/filename.txt", "a") as f:
    f.write("random text")

which gives me this error:

1 with  open("/dbfs/mnt/filename.txt", "a") as f:
----> 2   f.write("append values")

OSError: [Errno 95] Operation not supported

Alternatively, I tried logging.basicConfig(filename='dbfs:/mnt/filename.txt', filemode='w'),

but it doesn't seem to write to that path. Can anyone help, please?
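On the logging attempt: logging.basicConfig cannot write to a 'dbfs:/...' URI, which would explain why nothing appeared at that path. A common workaround (my suggestion, not from the original post) is to log to local driver storage first and copy the file to the mount afterwards:

```python
import logging
import os
import tempfile

# Log to a local driver-node file; basicConfig can't open a 'dbfs:/' URI.
log_path = os.path.join(tempfile.gettempdir(), "run.log")
logging.basicConfig(filename=log_path, filemode="w",
                    format="%(levelname)s %(message)s",
                    level=logging.INFO,
                    force=True)  # notebooks often pre-configure the root logger
logging.info("random text")
logging.shutdown()  # flush and close handlers before copying the file

# On Databricks you could then copy the finished log to the mount, e.g.:
# dbutils.fs.cp("file:" + log_path, "dbfs:/mnt/filename.txt")
```

The `force=True` flag (Python 3.8+) matters in notebooks, where the root logger usually already has handlers and a plain basicConfig call silently does nothing.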

Append Only ('a'): opens the file for writing. The file is created if it does not exist. The handle is positioned at the end of the file, so the data being written is inserted after the existing data.

file = open("myfile.txt", "a")  # append mode
file.write("Today \n")
file.close()  # flush the write to disk

Output of the appended file: (screenshots in the original post)

You can't do that directly on a DBFS-mounted file; see https://kb.databricks.com/en_US/dbfs/errno95-operation-not-supported

You may need to work out logic that reads the file from the data lake with Python, appends to it, and writes the whole file back.
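Since the mount rejects in-place appends, that read-modify-write logic can be sketched as: read the whole file, add the new text, and rewrite it. A minimal local sketch of the idea (the helper name `append_by_rewrite` is mine; on Databricks the same pattern works with dbutils.fs.head and dbutils.fs.put(..., overwrite=True)):

```python
import os

def append_by_rewrite(path, text):
    """Emulate append on storage that only supports whole-file writes:
    read the existing contents, concatenate, and rewrite the file."""
    existing = ""
    if os.path.exists(path):
        with open(path) as f:
            existing = f.read()
    # Write to a temp file and atomically replace, so a failed write
    # can't truncate the original log.
    tmp = path + ".tmp"
    with open(tmp, "w") as f:
        f.write(existing + text)
    os.replace(tmp, path)
```

Note this rereads and rewrites the entire file on every append, so it only suits small log files; for large files, writing one new file per run (or using an SDK that supports true appends on ADLS, if available in your environment) is a better fit.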

