Python glob.glob(dir) MemoryError
I'm running into memory problems when searching a folder that contains millions of files. Does anyone know how to overcome this situation? Is there a way to limit how many files glob searches, so the work can be done in chunks?
Traceback (most recent call last):
File "./lb2_lmanager", line 533, in <module>
main(sys.argv[1:])
File "./lb2_lmanager", line 318, in main
matched = match_files(policy.directory, policy.file_patterns)
File "./lb2_lmanager", line 32, in wrapper
res = func(*args, **kwargs)
File "./lb2_lmanager", line 380, in match_files
listing = glob.glob(directory)
File "/usr/lib/python2.6/glob.py", line 16, in glob
return list(iglob(pathname))
File "/usr/lib/python2.6/glob.py", line 43, in iglob
yield os.path.join(dirname, name)
File "/usr/lib/python2.6/posixpath.py", line 70, in join
path += '/' + b
MemoryError
Try using generators instead of lists. To learn what generators are, read this.
import glob

# iglob returns an iterator, so paths are yielded one at a time
# instead of building a list of millions of entries in memory
dir_list = glob.iglob(YOUR_DIRECTORY)
for file in dir_list:
    print(file)
Change YOUR_DIRECTORY to the directory you want to list.
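Since the question also asks about processing in chunks, here is a minimal sketch of one way to do that: combining glob.iglob with itertools.islice to pull a bounded batch of paths at a time. The function name iter_in_chunks and the chunk size are illustrative choices, not part of any standard API.

```python
import glob
import itertools


def iter_in_chunks(pattern, chunk_size=1000):
    """Yield lists of at most chunk_size paths matching pattern.

    glob.iglob walks matches lazily, and islice drains the iterator
    in bounded batches, so only one chunk of paths is held in
    memory at a time.
    """
    it = glob.iglob(pattern)
    while True:
        chunk = list(itertools.islice(it, chunk_size))
        if not chunk:
            break
        yield chunk


# Example usage: process a huge directory a thousand files at a time
# for batch in iter_in_chunks("/data/incoming/*", chunk_size=1000):
#     handle(batch)
```

Note that even iglob reads each individual directory's listing eagerly on older Python versions, so for a single directory with millions of entries the memory savings come mainly from not accumulating the joined path strings into one giant list.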