
Memoize Python Garbage Collection

If I use a memoize decorator, for example one similar to that in:

https://wiki.python.org/moin/PythonDecoratorLibrary#Memoize

Do I need to worry about running out of memory and needing to manually garbage collect? For example, if I have a long-running Python process that continually memoizes, won't I need to make sure the dict does not grow too large? Do memoize decorators typically also need to do cache eviction?

Why isn't this an issue with all decorators that can hold an arbitrary amount of intermediate state?

Would using an lru_cache from functools resolve this?

The memoized decorator you linked has no bound on memory usage and does not do cache eviction. So yes, if you keep calling the function with different parameters, you have to worry about running out of memory.
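To make the problem concrete, here is a minimal sketch of an unbounded memoize decorator in the spirit of the wiki recipe (the `memoize`/`square` names are illustrative, not from the linked page). The closure's `cache` dict gains one entry per distinct argument tuple and nothing ever removes entries:

```python
import functools

def memoize(func):
    """Unbounded memoize: caches every distinct argument tuple forever."""
    cache = {}

    @functools.wraps(func)
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]

    wrapper.cache = cache  # exposed only so callers can inspect growth
    return wrapper

@memoize
def square(x):
    return x * x

square(3)  # computed, then stored
square(3)  # served from the cache; dict size unchanged
square(4)  # a new entry; the dict only ever grows
```

In a long-running process that keeps seeing new arguments, `wrapper.cache` grows without bound, which is exactly the memory concern in the question.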

functools.lru_cache(n) will not store more than n calls in the cache, evicting the least recently used entry once the limit is reached - this makes it well suited to limiting memory usage.
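A short sketch of the bounded behaviour (the `double` function is just an example). With `maxsize=2`, the third distinct call evicts the least recently used entry, so the cache never holds more than two results:

```python
from functools import lru_cache

@lru_cache(maxsize=2)
def double(x):
    return 2 * x

double(1)  # miss: cached
double(2)  # miss: cached
double(3)  # miss: cached, and the entry for 1 is evicted

# cache_info() reports hits, misses, maxsize and the current size
print(double.cache_info())  # CacheInfo(hits=0, misses=3, maxsize=2, currsize=2)
```

Calling `double(1)` again after this would be a miss rather than a hit, confirming the eviction; `double.cache_clear()` empties the cache entirely if you ever need to reset it.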

