
Truncating a large CSV file in Python

I have a CSV file (about 10 GB) which is too large to fit completely into my laptop's memory. Is there a way to truncate the file so that only the first n entries are saved in a new file? I started by trying

df = pandas.read_csv("path/data.csv").as_matrix()

but this doesn't work because there is not enough memory to load the whole file.

Any help will be appreciated!

Leon

Use nrows:

df = pandas.read_csv("path/data.csv", nrows=1000)

The nrows docs say:

Number of rows of file to read. Useful for reading pieces of large files.
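Because only the first n rows are parsed, the rest of the 10 GB file is never loaded into memory. To actually save those rows into a new file, as asked, the truncated DataFrame can be written back out with to_csv. A minimal sketch, assuming n = 1000 and the output path "path/data_head.csv" (both are illustrative, not taken from the original post):

import pandas

# Read only the first n rows of the large file; parsing stops there,
# so memory use is bounded by these n rows plus the header.
n = 1000
df = pandas.read_csv("path/data.csv", nrows=n)

# Write the truncated data to a new file (hypothetical output path).
# index=False avoids adding an extra index column not present in the original CSV.
df.to_csv("path/data_head.csv", index=False)

The new file then contains the header plus the first n data rows of the original CSV.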
