
Python: linecache file size limit?

I have a pretty big .txt file. Each entry is on a new line. I am trying to access the file and iterate through each line to grab the entry. However, when I use linecache.getline('file_path', 1), I get an empty string, which, according to the Python docs, is how linecache reports errors. Is there a file size limit? I am trying to read a 1.2 GB file. I am also fairly sure linecache still reads the whole file into memory before returning a line: RAM usage climbs by roughly the size of the file and then returns to normal. Am I doing anything wrong with linecache? Any suggestions other than linecache?

If you simply want to read a file line by line without loading it into memory, the file object returned by Python's built-in open is itself a lazy iterator over lines, which is exactly what you need here.

with open("filename", "r") as file:
    for line in file:
        # Do stuff

As long as you do whatever you need inside that for loop, you don't have to worry about memory: only one line is held at a time.

More info in the official docs.
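To make the "do stuff" step concrete, here is a minimal sketch of that pattern: count_entries is a hypothetical helper name, and the idea is that each iteration sees exactly one line, so memory stays flat no matter how large the file is.

```python
def count_entries(path):
    """Count non-empty entries in a file, reading one line at a time."""
    count = 0
    with open(path, "r") as file:
        for line in file:           # the file object yields lines lazily
            if line.strip():        # skip blank lines
                count += 1
    return count
```

The same loop body could just as well parse each entry or write filtered lines to another file; the point is that nothing outside the current line needs to live in memory.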

All you need to do is use the file object as an iterator.

for line in open("log.txt"):
    do_something_with(line)

Even better, use a context manager (a with statement):

with open("log.txt") as fileobject:
    for line in fileobject:
        do_something_with(line)

This will close the file automatically as well.
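If you still need random access to one specific line number, as in the original linecache.getline call, a minimal sketch using itertools.islice fetches it without caching the whole file in memory; get_line is a hypothetical helper name, and the 1-based numbering here mirrors linecache's convention.

```python
from itertools import islice

def get_line(path, lineno):
    """Return line `lineno` (1-based, like linecache.getline) by scanning
    lazily up to it; returns '' if the line does not exist."""
    with open(path, "r") as f:
        return next(islice(f, lineno - 1, lineno), "")
```

This trades linecache's O(1) repeated lookups (after its one-time full read) for an O(n) scan per call, which is usually the right trade-off when the file is gigabytes and you only need a few lines.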
