Memory Error while reading a big text file
I have a text file of around 36 GB that contains one word per line. I am trying to read the file, but it raises a MemoryError, which does not surprise me, but how do I work around it so I can read the file?
I am trying this:
for words in open("hugefile.txt").readlines():
    # do something
I have 2 GB of RAM; OS: Windows XP, Python 2.7.
Thanks.
You are calling readlines(), which loads the whole file into memory at once.
Iterate over the file instead:
for words in open("hugefile.txt"):
    # do something
This iterates over the lines one by one, reading more from disk only as needed, so memory use stays constant regardless of file size.
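As a fuller sketch of the same idea, the version below uses a `with` block so the file is closed automatically, and strips each line's trailing newline. The filename and the three sample words are illustrative stand-ins for the 36 GB file in the question:

```python
# Illustrative stand-in for the real 36 GB file: write a tiny sample
# so the sketch is self-contained and runnable.
with open("sample.txt", "w") as f:
    f.write("alpha\nbeta\ngamma\n")

words = []
with open("sample.txt") as f:        # file is closed automatically on exit
    for line in f:                   # lazy iteration: one line in memory at a time
        word = line.rstrip("\n")     # each line holds one word plus its newline
        words.append(word)           # "do something" with the word here

print(words)
```

Collecting every word into a list, as done here for demonstration, would of course defeat the purpose on a 36 GB file; in practice you would process each word inside the loop and discard it.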