Python: linecache file size limit?

I have a pretty big .txt file. Each entry is on a new line. I am trying to access the file and iterate through each line to grab the entry. However, when I use linecache.getline('file_path', 1), I get an empty string, which, according to the Python docs, is how linecache reports errors. Is there a file size limit? I am trying to read a 1.2GB file. I am also fairly sure linecache is still reading the whole file into memory before returning a line number: RAM usage climbs by about the size of the file and then returns to normal. Am I doing anything wrong with linecache? Any suggestions other than linecache?
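For the original use case of grabbing a single line by number without holding the whole file in memory, one alternative to linecache is itertools.islice, which skips over lines lazily. A minimal sketch, where get_line is a made-up helper name and "file_path.txt" is a placeholder for your file:

import itertools

def get_line(path, lineno):
    # islice consumes lines up to lineno - 1 without storing them,
    # so only one line is ever held in memory at a time.
    with open(path, "r") as f:
        return next(itertools.islice(f, lineno - 1, lineno), None)

entry = get_line("file_path.txt", 1)  # None if the file has fewer lines

Unlike linecache, which is designed to cache entire (source) files and so matches the RAM spike you observed, this reads only as far as the requested line.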

If you simply want to read a file line by line, without loading it all into memory, the file object returned by Python's open() is itself a lazy iterator over lines, which does exactly this.

with open("filename", "r") as file:
    for line in file:
        # process each line here

As long as you do all your work inside that for loop, only the current line is held in memory, so you don't have to worry about memory even for very large files.

More info in the official docs.

All you need to do is use the file object as an iterator.

for line in open("log.txt"):
    do_something_with(line)

Even better is to use a context manager, supported in Python 2.5 and later:

with open("log.txt") as fileobject:
    for line in fileobject:
        do_something_with(line)

This will automatically close the file as well.
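If the goal is repeated random access by line number rather than a single pass, one approach is to scan the file once, record the byte offset where each line starts, and then seek() directly to any line afterwards. A rough sketch under those assumptions (build_index and line_at are made-up helper names):

def build_index(path):
    # One pass in binary mode, recording where each line begins.
    offsets = []
    with open(path, "rb") as f:
        pos = 0
        for line in f:
            offsets.append(pos)
            pos += len(line)
    return offsets

def line_at(path, offsets, lineno):
    # Jump straight to line `lineno` (1-based) without rereading the file.
    with open(path, "rb") as f:
        f.seek(offsets[lineno - 1])
        return f.readline().decode()

offsets = build_index("log.txt")
first_entry = line_at("log.txt", offsets, 1)

The index costs one integer per line, which is far smaller than holding a 1.2GB file in memory.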
