Introduction

@functools.lru_cache is a decorator that memoizes expensive function calls: it caches results so that repeated calls with the same arguments can be answered without recomputation.

  1. The @functools.lru_cache decorator is a built-in decorator from the functools module in Python. It stands for "Least Recently Used Cache" and is used to optimize the performance of functions by caching their results.
  2. When a function is decorated with @functools.lru_cache, its results are stored in a cache keyed by the arguments. If the function is called again with the same arguments, the cached result is returned instead of re-computing it, which can save significant time for expensive calculations.
  3. @functools.lru_cache is particularly useful for optimizing recursive functions or functions that are repeatedly called with the same inputs. However, be cautious when a function can be called with many distinct arguments, as the cache can consume a lot of memory if it is not bounded.
  4. The maxsize argument limits the number of cached entries and so controls memory usage; it defaults to 128, and maxsize=None makes the cache unbounded (see the sketch after this list).
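
As a minimal sketch of item 4, the snippet below passes maxsize=None to turn eviction off entirely, trading memory for speed; the function name and arguments are illustrative only. (Python 3.9+ also offers functools.cache as a shorthand for lru_cache(maxsize=None).)

import functools

@functools.lru_cache(maxsize=None)  # unbounded cache: entries are never evicted
def slow_square(n):
    # Stand-in for an expensive computation.
    return n * n

slow_square(4)  # computed and stored
slow_square(4)  # served from the cache, no recomputation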

Example:

import functools

@functools.lru_cache(maxsize=3)
def fibonacci(n):
    if n <= 1:
        return n
    else:
        return fibonacci(n-1) + fibonacci(n-2)

# Calling the function multiple times with the same arguments
result1 = fibonacci(5)  # Calculated
result2 = fibonacci(5)  # Cached
print(result1)  # Output: 5

In this example, the cache size is limited to 3, so the cache can hold the results of the three most recently used calls. When the cache is full and a new result needs to be stored, the least recently used result is discarded. (In practice, a recursive function like this would normally use a larger maxsize, or maxsize=None, so that more intermediate results stay cached.)
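
To observe the cache in action, a function decorated with @functools.lru_cache exposes cache_info() and cache_clear(). The sketch below continues the fibonacci example; the exact hit and miss counts depend on the sequence of calls, so they are left unspecified here:

# Inspect cache statistics (hits/misses depend on the exact sequence of calls)
print(fibonacci.cache_info())   # e.g. CacheInfo(hits=..., misses=..., maxsize=3, currsize=3)

# Discard all cached entries and reset the statistics
fibonacci.cache_clear()
print(fibonacci.cache_info())   # CacheInfo(hits=0, misses=0, maxsize=3, currsize=0)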