
Is there a way to cache Python 3.5 definitions using a time-to-live decorator?

Currently, I use functools' lru_cache to handle caching for the function. The problem is that the cache never grows large enough to make use of the LRU behaviour, because the function never takes a parameter. Instead, the function, when called, opens a specific URL and returns its contents.

Is there a way to specify a 'time to live' for the cache, so that after a certain amount of time or a certain number of calls it refreshes itself?

I don't know of a decorator, but you can keep track of the last time you fetched the page and update as needed. If this is a single-threaded app, it can be simple:

import time

_cached_page = ''
_cached_page_time = 0

def get_page():
    global _cached_page, _cached_page_time
    now = time.time()
    # refetch if the cache is empty or more than 1 hour old
    if not _cached_page or now - _cached_page_time > 60 * 60:
        _cached_page = get_the_page_here()
        _cached_page_time = time.time()
    return _cached_page

You could also age the page with a timer in the background. You need to control access with a lock, but that also makes the cache usable in a multithreaded program:

import threading

_cached_page = ''
_cached_page_lock = threading.Lock()

def _invalidate_page():
    global _cached_page
    with _cached_page_lock:
        _cached_page = ''

def get_page():
    global _cached_page
    with _cached_page_lock:
        if not _cached_page:
            _cached_page = get_the_page_here()
            # invalidate in 1 hour; the Timer must be started explicitly
            threading.Timer(60 * 60, _invalidate_page).start()
        return _cached_page

Finally, the server may include an Expires: ... field in the HTTP header. Depending on how well the service was written, this would be a good reflection of how long the page can be cached.
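
As an illustration (not part of the answer above), here is a minimal sketch of reading that header with the standard library; the fetch_with_expiry name and the error handling are assumptions, and an unparseable Expires value is simply treated as absent:

import urllib.request
from email.utils import parsedate_to_datetime

def fetch_with_expiry(url):
    # Fetch the page and return (body, expiry), where expiry is a datetime
    # parsed from the Expires header, or None if the header is missing or invalid.
    with urllib.request.urlopen(url) as response:
        body = response.read()
        expires = response.headers.get('Expires')
    try:
        expiry = parsedate_to_datetime(expires) if expires else None
    except (TypeError, ValueError):
        expiry = None
    return body, expiry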

The functools.lru_cache function accepts a maxsize argument, which saves the results of up to the maxsize most recent calls.

You can check this by calling the cache_info() method of your decorated function.
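
For example, a small sketch (the function and the maxsize value here are arbitrary):

from functools import lru_cache

@lru_cache(maxsize=32)
def square(x):
    return x * x

square(2)
square(2)
square(3)
print(square.cache_info())
# CacheInfo(hits=1, misses=2, maxsize=32, currsize=2)
# square.cache_clear() would empty the cache entirely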

If you want to refresh your cache completely, you should implement a cache object manually by counting the number of cache calls and resetting the cache whenever it hits the maximum:

from functools import wraps


class Mycache(object):
    def __init__(self, maxcount):
        self.count = 0            # calls made since the last reset
        self.maxcount = maxcount  # reset the cache after this many calls
        self.cache = {}

    def __call__(self, func):

        @wraps(func)
        def wrapped(*args):
            self.count += 1
            if self.count > self.maxcount:
                # too many calls: drop the whole cache and recompute
                self.cache = {}
                self.count = 0
                result = self.cache[args] = func(*args)
            else:
                try:
                    result = self.cache[args]
                except KeyError:
                    result = self.cache[args] = func(*args)
            return result
        return wrapped

Demo:

@Mycache(3)
def a(arg):
    print("arg is : {}".format(arg))
    return arg ** 2

print(a(4))
print(a(4))
print(a(4))
print(a(4))
print(a(3))
print(a(4))

Output:

arg is : 4
16
16
16
arg is : 4
16
arg is : 3
9
16
