Very simple LRU (Least Recently Used) cache implemented with Deque and HashMap in Java.
This project demonstrates a straightforward implementation of an LRU cache using Java's Deque and HashMap data structures. An LRU cache stores a limited number of items, automatically evicting the least recently accessed item when the cache reaches its maximum capacity.
- Simple, easy-to-understand Java implementation
- Fast average O(1) key lookup via HashMap
- Uses core Java data structures (Deque and HashMap)
- Suitable for educational purposes and small-scale caching needs
Clone the repository:

```
git clone https://github.com/vilinuz/lru-cache-demo.git
```
Build and run the project with your preferred Java IDE or via command line.
Example usage (see `Main.java` or a similar entry point for sample code):

```java
int capacity = 3;
LRUCache<Integer, String> cache = new LRUCache<>(capacity);
cache.put(1, "One");
cache.put(2, "Two");
cache.put(3, "Three");
System.out.println(cache.get(2)); // Output: Two
cache.put(4, "Four");             // Evicts key 1
System.out.println(cache.get(1)); // Output: null
```
- The cache keeps track of the keys in a Deque (double-ended queue).
- The HashMap provides fast lookup for stored key-value pairs.
- On cache access or update, the key is moved to the front of the queue (most recently used).
- When the cache exceeds its capacity, the least recently used key (at the end of the queue) is removed.
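The steps above can be sketched as a small class like the following. This is a minimal illustration of the Deque + HashMap approach, not necessarily the exact code in this repository; note that `Deque.remove(key)` is a linear scan, so the "move to front" step is O(n) in the cache size, which is fine for small, educational caches.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of an LRU cache backed by a Deque (recency order)
// and a HashMap (key-value storage). Front of the deque = most
// recently used; back of the deque = least recently used.
public class LRUCache<K, V> {
    private final int capacity;
    private final Deque<K> order = new ArrayDeque<>();
    private final Map<K, V> map = new HashMap<>();

    public LRUCache(int capacity) {
        this.capacity = capacity;
    }

    public V get(K key) {
        if (!map.containsKey(key)) {
            return null;
        }
        order.remove(key);   // O(n) scan to unlink the key
        order.addFirst(key); // mark as most recently used
        return map.get(key);
    }

    public void put(K key, V value) {
        if (map.containsKey(key)) {
            order.remove(key);               // updating: refresh recency
        } else if (map.size() == capacity) {
            K lru = order.removeLast();      // evict least recently used
            map.remove(lru);
        }
        order.addFirst(key);
        map.put(key, value);
    }

    public static void main(String[] args) {
        LRUCache<Integer, String> cache = new LRUCache<>(3);
        cache.put(1, "One");
        cache.put(2, "Two");
        cache.put(3, "Three");
        System.out.println(cache.get(2)); // Two
        cache.put(4, "Four");             // evicts key 1
        System.out.println(cache.get(1)); // null
    }
}
```

For true O(1) updates on all operations, the standard-library alternative is `LinkedHashMap` with access order enabled and `removeEldestEntry` overridden, which maintains the recency list inside the map itself.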
Pull requests are welcome! For major changes, please open an issue first to discuss what you would like to change.
This project is licensed under the MIT License.