Does python's csv.reader read the entire file into memory?

Does the csvreader object read the entire file into memory? If I have a big file, will it crash because of low memory? Or is it only a pointer, so that I can process the file one line at a time?

import csv
with open('RawData.csv','r') as file:
    csvreader = csv.reader(file, delimiter=',')
    for row in csvreader:
        print(row)

From the csv.reader documentation:

Return a reader object which will iterate over lines in the given csvfile. csvfile can be *any object which supports the iterator protocol* and returns a string each time its __next__() method is called — file objects and list objects are both suitable.

(Emphasis mine.)
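To illustrate that emphasized point, here is a minimal sketch showing that csv.reader is happy with any iterable of strings, not just a file object (the sample rows are made up for the demonstration):

```python
import csv

# csv.reader works on any iterable of strings -- here a plain list.
rows = csv.reader(["a,b,c", "1,2,3"])
print(list(rows))  # [['a', 'b', 'c'], ['1', '2', '3']]
```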

What you have is a wrapper around the file object. The file object does all the dirty work of efficiently iterating over the lines of your file, and the csv module's reader parses each line as it is read in — nothing is buffered beyond the current row.

So yes, +1 for memory friendliness and efficiency.
