
lxml pretty_print python memory overload

I have a poorly formatted XML file with over 350 MB of data; essentially, all the data is consolidated onto one line. I am trying to pretty_print it into a new file to make life easier, but I am running into memory issues. Am I doing anything wrong here, and is there a way around this? My computer has 4 GB of RAM and a quad-core i5-2410M (2.30 GHz).

from lxml import etree

# Discard whitespace-only text nodes so pretty_print can re-indent cleanly.
parser = etree.XMLParser(remove_blank_text=True)
tree = etree.parse('filename', parser)

# etree.tostring returns bytes, so the output file must be opened in binary mode.
f = open('filename', "wb")
f.write(etree.tostring(tree, pretty_print=True))
f.close()

You might want to try calling the tree's write method with the file handle directly, rather than serializing the whole document with tostring first. Change this line:

f.write(etree.tostring(tree,pretty_print=True))

to this (the file handle must be opened in binary mode, e.g. open('filename', 'wb')):

tree.write(f, pretty_print=True)

This lets lxml write the serialized output straight to the file instead of first building the entire ~350 MB document as a single in-memory string, which should hopefully cut peak memory usage roughly in half.

