
How to load a big text file efficiently in Python

I have a text file containing 7000 lines of strings, and I need to search for a specific string based on a few parameters.

Some people are saying that the code below wouldn't be efficient (in terms of speed and memory usage).

f = open("file.txt")
data = f.read().split()  # reads the whole file into memory, split on whitespace into a list
  1. First of all, if I don't load it into a list, how would I even start searching?
  2. Is it efficient to load the entire file? If not, how should it be done?
  3. To filter anything, we need to search, and to search we need to read the file, right?

I'm a bit confused.

Iterate over each line of the file without storing the whole file. This keeps the program memory-efficient, because the file object yields one line at a time.

with open(filename) as f:
    for line in f:
        if "search_term" in line:
            break  # stop at the first matching line
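
If you need every matching line, or a match on several parameters at once, a generator keeps the same one-line-at-a-time behaviour. Here is a minimal sketch, assuming "a few params" means several substrings that must all appear on the same line (find_matches and the sample terms "foo" and "bar" are hypothetical):

def find_matches(filename, *terms):
    """Yield each line that contains every search term."""
    with open(filename) as f:
        for line in f:  # the file object yields one line at a time
            if all(term in line for term in terms):
                yield line

# Usage: iterate lazily, so only the current line is held in memory.
for match in find_matches("file.txt", "foo", "bar"):
    print(match, end="")

Because find_matches is a generator, nothing is read until you iterate over it, and the full file is never loaded into memory at once.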
