Memory-aware LRU Caching in Python?
I'm using Python 3's built-in functools.lru_cache decorator to memoize some expensive functions. I would like to memoize as many calls as possible without using too much memory, since available memory is limited.
Solution 1:
I ended up modifying the built-in lru_cache to use psutil.
The modified decorator takes an additional optional argument, use_memory_up_to. If set, the cache will be considered full if there are fewer than use_memory_up_to bytes of memory available (according to psutil.virtual_memory().available). For example:
from .lru_cache import lru_cache

GB = 1024**3

@lru_cache(use_memory_up_to=(1 * GB))
def expensive_func(args):
    ...
Note: setting use_memory_up_to will cause maxsize to have no effect.
Here's the code: lru_cache.py
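Since the linked file isn't shown inline, here is a minimal sketch of the underlying idea only, not the actual modified lru_cache.py: a standalone decorator (the names memory_aware_lru_cache and its internals are mine, not from the linked code) that caches results and evicts least-recently-used entries whenever psutil.virtual_memory().available drops below the given threshold.

# Hedged sketch: a simplified memory-aware LRU cache, not the linked lru_cache.py.
import functools
from collections import OrderedDict

import psutil


def memory_aware_lru_cache(use_memory_up_to):
    def decorator(func):
        cache = OrderedDict()  # maps call key -> result, ordered oldest to newest

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            key = (args, tuple(sorted(kwargs.items())))
            if key in cache:
                cache.move_to_end(key)  # mark as most recently used
                return cache[key]
            result = func(*args, **kwargs)
            cache[key] = result
            # Treat the cache as "full" while available memory is below the
            # threshold: evict LRU entries until memory recovers or the cache
            # is empty.
            while cache and psutil.virtual_memory().available < use_memory_up_to:
                cache.popitem(last=False)
            return result

        return wrapper

    return decorator

This sketch runs the memory check on every insert instead of a size check, which is consistent with the note above that maxsize has no effect once use_memory_up_to is set; the linked version presumably folds the same check into the built-in implementation's internals.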