
Memory keeps growing after writing a large number of files

I have a sample program, shown below, which opens a file, writes a 100 kB string into it, and closes the file.

large_string = 'x' * 100 * 1024  # 100 kB placeholder string

for a in xrange(100000):
    file_to = open('.//OutputFiles/Data' + str(a) + '.xml', "w")
    file_to.write(large_string)
    file_to.close()

The issue with this code is that memory usage keeps growing and is not released back to the OS. After the above code has run, if I remove the files from the physical disk using rm, the memory is returned to the OS. gc.collect() does not help. I also tried subprocesses, as shown below, but still no luck.

import multiprocessing

large_string = 'x' * 100 * 1024  # 100 kB placeholder string

def worker(a):
    # Write each file in a separate process so any memory held by the
    # interpreter is freed when the process exits.
    file_to = open('.//OutputFiles/Data' + str(a) + '.xml', "w")
    file_to.write(large_string)
    file_to.close()

if __name__ == '__main__':
    jobs = []
    for i in range(100000):
        p = multiprocessing.Process(target=worker, args=(i,))
        jobs.append(p)
        p.start()
        p.join()

Is there any way to handle this situation better?

I found it! It is basically not Python's problem. As @Brad said, it is a page-cache issue. I followed what is mentioned on this page and the memory was returned to the OS:

http://www.yourownlinux.com/2013/10/how-to-free-up-release-unused-cached-memory-in-linux.html
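For reference, here is a minimal sketch of what that page describes, assuming a Linux system and root privileges: flush dirty pages with sync, then write to /proc/sys/vm/drop_caches (1 = page cache, 2 = dentries and inodes, 3 = both), checking the "Cached" value in /proc/meminfo before and after. The helper function name is my own.

import subprocess

def cached_kb():
    # Read the "Cached" line from /proc/meminfo to see how much memory
    # the kernel is currently using for the page cache.
    with open('/proc/meminfo') as meminfo:
        for line in meminfo:
            if line.startswith('Cached:'):
                return int(line.split()[1])
    return 0

print('cached before: %d kB' % cached_kb())

subprocess.check_call(['sync'])              # flush dirty pages to disk first
with open('/proc/sys/vm/drop_caches', 'w') as f:
    f.write('3\n')                           # drop page cache + dentries/inodes

print('cached after:  %d kB' % cached_kb())

Note that this cached memory is not leaked: the kernel reclaims it automatically under memory pressure, so dropping the caches is mainly useful for verifying that the growth really is page cache.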

