Clear LRU cache in Python

In this article, we will learn how to clear an LRU cache implemented in Python. Before we dive deep into the coding aspect, let's explore a little about what an LRU cache is and why it is popular.

LRU Cache, also known as the least recently used cache, is a data structure widely used in computer science to improve application performance by reducing the time required to access frequently used data. The LRU Cache stores a limited number of items and deletes the least recently used items when the cache becomes full. This allows the most frequently used items to remain in the cache and be accessed quickly, while less frequently used items are removed to make room for new items.
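
To make this mechanism concrete, the sketch below implements a tiny LRU cache on top of collections.OrderedDict. The LRUCache class, its capacity, and the toy keys are assumptions made for illustration; the standard library does not expose a class under this name.

from collections import OrderedDict

class LRUCache:
    """A minimal LRU cache: bounded size, evicts the least recently used key."""
    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)          # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)   # evict the least recently used item

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")         # "a" becomes the most recently used entry
cache.put("c", 3)      # the cache is full, so "b" is evicted
print(cache.get("b"))  # None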

LRU cache is particularly useful in applications where retrieving data is expensive, such as disk I/O or network access. In these cases, caching frequently used data in memory can significantly improve application performance by reducing the number of expensive operations required to retrieve the data.

The LRU Cache is used in a wide variety of applications, including databases, web servers, compilers, and operating systems. It is particularly useful in applications that require frequent access to a large amount of data, such as search engines and data analytics platforms.

Interacting with LRU cache in Python

In Python 3.2 and above, the functools module includes a powerful feature that allows programmers to interact with an LRU Cache. It is used as a decorator placed above a function definition. By applying this decorator to functions that are called repeatedly with the same arguments, the performance of those functions can be significantly improved.

When working with functions that require the processing of large amounts of data or complex computations, the use of an LRU Cache can greatly speed up the execution time. This is because the LRU Cache stores frequently-used data in memory, allowing the function to quickly access and process the data without incurring the cost of time-consuming I/O operations.
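
As a rough illustration (a sketch that is not part of the original example), the snippet below simulates an expensive lookup with time.sleep; once the result is cached, a repeated call with the same argument returns almost instantly. The function name slow_lookup and the one-second delay are assumptions made for the demonstration.

import time
from functools import lru_cache

@lru_cache(maxsize=32)
def slow_lookup(key):
    # Simulate an expensive operation such as disk I/O or a network call
    time.sleep(1)
    return key.upper()

start = time.perf_counter()
slow_lookup("python")  # first call: takes roughly one second
print(f"first call:  {time.perf_counter() - start:.3f}s")

start = time.perf_counter()
slow_lookup("python")  # second call: served from the cache
print(f"second call: {time.perf_counter() - start:.3f}s")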

By utilizing the LRU Cache, Python programmers can reduce the execution time of their applications and improve their performance. This is particularly important when working with large-scale applications or those that require real-time data processing, where even small improvements in performance can result in significant gains.

In short, the functools module in Python provides a powerful mechanism for interacting with LRU Cache. By using LRU Cache, programmers can improve application performance by reducing the time required for expensive variable access and change operations. Using LRU Cache is particularly beneficial in applications that require real-time data processing or handle large amounts of data.

Now that we know a little about LRU cache, let's make use of it in Python.

The cache_clear() method that functools.lru_cache attaches to the decorated function can be used to clear the LRU (least recently used) cache.

This technique removes every entry from the cache.

Sample code snippet

from functools import lru_cache

@lru_cache(maxsize=128)
def some_function(arg):
    # Function implementation goes here; a trivial body stands in as a placeholder
    result = arg * 2
    return result

# Clear the cache
some_function.cache_clear()

Explanation

In the above example, some_function is decorated with lru_cache, which creates an LRU cache with a maximum size of 128. To clear the cache, you can call the cache_clear() method on the function object, which will remove all the entries from the cache.

Please note that calling cache_clear() removes the cached results for all arguments; lru_cache does not provide a way to evict the entry for a single argument. (The decorator's typed=True option only makes the cache treat arguments of different types, such as 1 and 1.0, as separate entries.) If you need per-argument invalidation, you will need a custom cache implementation.
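
If per-argument invalidation is genuinely required, one option is a small hand-rolled cache. The following is a minimal sketch rather than a standard-library feature: dict_cache and its invalidate() helper are names invented for this example, the cache ignores keyword arguments, and, unlike lru_cache, it does not bound its size.

import functools

def dict_cache(func):
    """Memoize func in a plain dict and allow per-argument invalidation."""
    cache = {}

    @functools.wraps(func)
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]

    # Remove the cached result for one specific set of arguments
    wrapper.invalidate = lambda *args: cache.pop(args, None)
    # Remove everything, similar to lru_cache's cache_clear()
    wrapper.clear = cache.clear
    return wrapper

@dict_cache
def square(n):
    return n * n

square(3)             # computed and stored
square.invalidate(3)  # only the entry for n=3 is dropped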

Now let us use lru_cache and cache_clear() to write a complete working example.

Consider the code shown below.

Example

from functools import lru_cache

@lru_cache(maxsize=128)
def fibonacci(n):
    """Return the nth Fibonacci number."""
    if n < 2:
        return n
    return fibonacci(n-1) + fibonacci(n-2)

# Call the function with some arguments to populate the cache
print(fibonacci(10))  # Output: 55
print(fibonacci(15))  # Output: 610

# Clear the cache
fibonacci.cache_clear()

# Call the function again to see that it's recomputed
print(fibonacci(10))  # Output: 55

Explanation

In this example, the fibonacci function uses lru_cache to memoize its results. The cache has a maximum size of 128 entries, so the function remembers the results of up to 128 distinct calls and evicts the least recently used entry once that limit is reached.

We first call the function with some arguments to populate the cache. Then, we clear the cache using the cache_clear() method. Finally, we call the function again with the same argument to see that it's recomputed instead of using the cached result.

To run the above code, we need to run the command shown below.

Command

python3 main.py

Once we run the above command, we should expect output similar to the one shown below.

Output

55
610
55

If we want, we can also print the current state of the cache in the above code. To do that, we make use of the cache_info() method.

Consider the updated code shown below.

Example

from functools import lru_cache

@lru_cache(maxsize=128)
def fibonacci(n):
    """Return the nth Fibonacci number."""
    if n < 2:
        return n
    return fibonacci(n-1) + fibonacci(n-2)

# Call the function with some arguments to populate the cache
print(fibonacci(10))  # Output: 55
print(fibonacci(15))  # Output: 610

print(fibonacci.cache_info())

# Clear the cache
fibonacci.cache_clear()

# Call the function again to see that it's recomputed
print(fibonacci(10))  # Output: 55

print(fibonacci.cache_info())

Explanation

In the above code, the @lru_cache decorator accepts the optional parameter maxsize, which specifies the maximum size of the cache.

If maxsize is set to None, the cache can grow without bound; if the argument is omitted entirely, it defaults to 128.
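
For example, passing maxsize=None disables eviction entirely, and on Python 3.9 and later, functools.cache is shorthand for the same thing (the functions f and g below are placeholders):

from functools import lru_cache, cache

@lru_cache(maxsize=None)  # unbounded cache, nothing is ever evicted
def f(n):
    return n * n

@cache                    # Python 3.9+: equivalent to lru_cache(maxsize=None)
def g(n):
    return n * n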

When a bounded cache becomes full, the least recently used item is evicted to make room for a new one.

The cache that @lru_cache uses is stored on the decorated function object itself.

Accordingly, the cache is private to the decorated function and is not shared with other functions. The new element in this example is the cache_info() method, which prints information about the LRU cache used by the fibonacci function: the number of cache hits and misses, as well as the maximum and current size of the cache.

To run the above code, we need to run the command shown below.

Command

python3 main.py

Once we run the above command, we should expect output similar to the one shown below.

Output

55
610
CacheInfo(hits=14, misses=16, maxsize=128, currsize=16)
55
CacheInfo(hits=8, misses=11, maxsize=128, currsize=11)
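
Because cache_info() returns a named tuple (CacheInfo), its individual fields can also be read directly, for example to compute a hit rate. The snippet below is a small aside that reuses the fibonacci function defined above:

info = fibonacci.cache_info()
print(info.hits, info.misses, info.maxsize, info.currsize)

# Hit rate as a fraction of all lookups (guarding against division by zero)
total = info.hits + info.misses
print(info.hits / total if total else 0.0)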

Now that we have seen how to clear the cache, let's use it in another example.

Consider the code shown below.

Example

from functools import lru_cache

@lru_cache(maxsize=128)
def edit_distance(s1, s2):
    """
    Compute the edit distance between two strings using dynamic programming.
    """
    if not s1:
        return len(s2)
    elif not s2:
        return len(s1)
    elif s1[0] == s2[0]:
        return edit_distance(s1[1:], s2[1:])
    else:
        d1 = edit_distance(s1[1:], s2) + 1  # deletion
        d2 = edit_distance(s1, s2[1:]) + 1  # insertion
        d3 = edit_distance(s1[1:], s2[1:]) + 1  # substitution
        return min(d1, d2, d3)

# Call the function with some arguments to populate the cache
print(edit_distance("kitten", "sitting"))  # Output: 3
print(edit_distance("abcde", "vwxyz"))  # Output: 5

# Clear the cache
edit_distance.cache_clear()

# Call the function again to see that it's recomputed
print(edit_distance("kitten", "sitting"))  # Output: 3

Explanation

In this example, the edit_distance function computes the edit distance between two strings using dynamic programming. The function is recursive and has three base cases: if one of the strings is empty, the edit distance is the length of the other string; if the first characters of the two strings are the same, the edit distance is the edit distance between the rest of the strings; otherwise, the edit distance is the minimum of the edit distances for the three possible operations: deletion, insertion, and substitution.

To improve the performance of the function, we use lru_cache to cache its results. The cache has a maximum size of 128, so the function will remember the results of up to 128 recent calls. This avoids recomputing the edit distance for the same pair of arguments.

We first call the function with some arguments to populate the cache. Then, we clear the cache using the cache_clear() method. Finally, we call the function again with the same argument to see that it's recomputed instead of using the cached result.

Note that the edit_distance function is only an example; there are more efficient ways to compute the edit distance between two strings (for example, the Wagner-Fischer algorithm). The purpose of this example is to demonstrate how to use lru_cache to memoize the results of a recursive function.
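
For reference, a minimal sketch of an iterative Wagner-Fischer style implementation might look like the following; the function name edit_distance_iterative is an assumption for this illustration, and it runs in O(len(s1) * len(s2)) time without recursion or caching:

def edit_distance_iterative(s1, s2):
    """Compute the Levenshtein edit distance with a full DP table."""
    m, n = len(s1), len(s2)
    # dp[i][j] holds the edit distance between s1[:i] and s2[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete every character of s1[:i]
    for j in range(n + 1):
        dp[0][j] = j  # insert every character of s2[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if s1[i - 1] == s2[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution or match
    return dp[m][n]

print(edit_distance_iterative("kitten", "sitting"))  # Output: 3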

Conclusion

In summary, clearing the LRU (least recently used) cache in Python can be important in some situations in order to manage memory and keep the cache up to date. The LRU cache is a built-in caching mechanism provided by Python's functools module that caches a function's results based on its arguments. The @lru_cache decorator enables caching for a function, and maxsize can be specified to limit the size of the cache.

The cache_clear() method of the decorated function object can be used to clear the LRU cache. By removing all cached results, this technique keeps the cache up to date while freeing memory. Clearing the cache may be necessary if the function is updated or if the input data changes frequently.

Overall, the LRU cache provides a simple and effective way to improve the performance of Python functions, especially those that are computationally expensive or are called repeatedly with the same arguments. Clearing the cache when necessary helps preserve the performance gains obtained through caching and ensures that the cache remains effective at reducing computation time.
