
Is there a really efficient (FAST) way to read large text files in python?

I am looking to open and fetch data from a large text file in Python as fast as possible (the file has about 62603143 lines and is roughly 550 MB). Since I don't want to put too much load on my computer, I am doing it the following way:

import time
start = time.time()
for line in open(filePath):
    # the data being searched for sits near the end of the file
    if data in line:
        do_something(data)
end = time.time()
print("processing time = %s seconds" % (end - start))

But with the above method it takes almost 18 seconds to read the full file (my computer has an Intel i3 processor and 4 GB of RAM). Larger files take proportionally longer, which is far too slow from a user's point of view. I have read a lot of opinions on forums and referred to multiple Stack Overflow questions, but I haven't found a fast and efficient way to read and fetch data from large files. Is there really any way in Python to read large text files in a few seconds?

No, there is no faster way to process a file line by line, not from Python.

Your bottleneck is your hardware, not how you read the file. Python is already doing everything it can (using a buffer to read the file in larger chunks before splitting it into lines).
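
For illustration, here is a minimal sketch of the same kind of line-by-line loop with the read buffer size made explicit (the file path and search string below are hypothetical placeholders, not values from the question). Since open() already buffers reads by default, this mainly shows that there is little left to tune on the Python side:

import time

file_path = "big_file.txt"   # hypothetical stand-in for the 550 MB file
needle = "some value"        # hypothetical stand-in for the data being searched for

start = time.time()
# open() buffers reads by default; passing 1024 * 1024 just makes a 1 MB chunk size explicit
with open(file_path, "r", 1024 * 1024) as f:
    for line in f:
        if needle in line:
            break  # stop as soon as the value is found; the original loop calls do_something() here
print("processing time = %.2f seconds" % (time.time() - start))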

I suggest upgrading your disk to an SSD.
