Implementing an LRU Cache in Java from Scratch
Implementing a Least Recently Used (LRU) cache in Java from scratch is a valuable exercise for understanding data structures and concurrency. While libraries like EHCache and OSCache simplify the task, creating your own offers insights into the underlying mechanisms.
For multithreaded environments, LinkedHashMap is a strong starting point: constructed in access-order mode (the third constructor argument set to true), it delivers constant-time lookups while keeping its entries ordered from least to most recently used. However, since Java's standard LinkedHashMap is not thread-safe, a common approach is to wrap it using Collections#synchronizedMap.
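One caveat with this approach, worth noting up front: Collections#synchronizedMap guards individual calls such as get and put, but its documented contract requires callers to synchronize manually on the wrapper while iterating. A brief illustrative sketch (the key/value names are placeholders):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

Map<String, String> cache = Collections.synchronizedMap(new LinkedHashMap<>());
cache.put("k1", "v1");
cache.put("k2", "v2");

// Individual calls (get/put) are synchronized automatically, but iteration
// must be guarded manually on the wrapper, per the synchronizedMap contract:
synchronized (cache) {
    for (Map.Entry<String, String> e : cache.entrySet()) {
        System.out.println(e.getKey() + " -> " + e.getValue());
    }
}
```

Forgetting the synchronized block around iteration can produce a ConcurrentModificationException or inconsistent reads under contention.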
Leveraging Concurrent Data Structures
While LinkedHashMap with synchronization provides a reliable solution, newer concurrent data structures offer potential improvements. By extending ConcurrentHashMap and replicating the logic employed by LinkedHashMap, you can craft a highly concurrent LRU cache.
However, for the time being, it's prudent to stick with the proven combination of LinkedHashMap and Collections#synchronizedMap. If desired, you can also explore extending ConcurrentHashMap in the future to enhance concurrency.
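To make the ConcurrentHashMap direction concrete, here is one hedged sketch of what such a cache could look like: a ConcurrentHashMap for storage paired with a queue that tracks access order. The class name ConcurrentLruCache is invented for illustration, and the eviction is only approximately LRU, since the map update and the queue update are not performed atomically under contention:

```java
import java.util.Map;
import java.util.Queue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

// Approximate LRU: storage in a ConcurrentHashMap, recency tracked in a queue.
// Hypothetical sketch; not the article's implementation.
class ConcurrentLruCache<K, V> {
    private final int maxEntries;
    private final Map<K, V> map = new ConcurrentHashMap<>();
    private final Queue<K> accessQueue = new ConcurrentLinkedQueue<>();

    ConcurrentLruCache(int maxEntries) {
        this.maxEntries = maxEntries;
    }

    V get(K key) {
        V value = map.get(key);
        if (value != null) {
            // Move the key to the tail of the queue to record the access.
            // remove() is O(n), so this trades throughput for simplicity.
            accessQueue.remove(key);
            accessQueue.add(key);
        }
        return value;
    }

    void put(K key, V value) {
        map.put(key, value);
        accessQueue.remove(key);
        accessQueue.add(key);
        // Evict from the head of the queue (least recently used) until we fit.
        while (map.size() > maxEntries) {
            K eldest = accessQueue.poll();
            if (eldest != null) {
                map.remove(eldest);
            }
        }
    }
}
```

Because the map and queue are updated separately, two racing threads can briefly disagree about recency; production-grade concurrent caches (such as those in Caffeine) use far more sophisticated bookkeeping for exactly this reason.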
Implementation Snippet
Below is the core of the implementation, using LinkedHashMap and Collections#synchronizedMap:
<code class="java">import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

private static class LruCache<A, B> extends LinkedHashMap<A, B> {
    private final int maxEntries;

    public LruCache(final int maxEntries) {
        // accessOrder = true makes iteration order reflect access order,
        // which is what an LRU eviction policy needs
        super(maxEntries + 1, 1.0f, true);
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(final Map.Entry<A, B> eldest) {
        // evict the least recently used entry once the cache exceeds capacity
        return super.size() > maxEntries;
    }
}

Map<String, String> example = Collections.synchronizedMap(new LruCache<String, String>(CACHE_SIZE));</code>