A Java Cache with an LRU Eviction Policy
This article explains how to implement a cache with an LRU (Least Recently Used) eviction policy using a ConcurrentHashMap and a doubly-linked list. The same effect can be achieved with LinkedHashMap, but by default it is not thread-safe (perhaps in a follow-up article I will submit a LinkedHashMap-based version).
Considering the five properties above, a simple LRU cache can be implemented using Java's ConcurrentHashMap (which satisfies properties 1-4) and a simple doubly-linked list (which satisfies property 5).
A ConcurrentHashMap stores the objects and makes them accessible by key in O(1) time, provides concurrent access, and allows a size limit to be set.
A doubly-linked list (DLL) maintains pointers to the same objects (the ones in the map). This DLL is what implements the LRU eviction policy: frequently accessed objects sit at the tail-end, and the least recently accessed objects sit at the head of the list.
When a new object is inserted into the cache and the cache has already reached its maximum capacity, an LRU eviction is triggered: the head node of the DLL (the least recently used object) is removed, and the same entry is removed from the map.
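The LinkedHashMap alternative mentioned above can be sketched in a few lines. This is a hypothetical sketch (the class name `LinkedHashLRUCache` is mine, not part of the implementation below): with `accessOrder=true`, LinkedHashMap keeps entries in access order, and overriding `removeEldestEntry` provides the eviction hook. Note that, like LinkedHashMap itself, this sketch is not thread-safe.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Hypothetical sketch: an LRU cache built on LinkedHashMap.
 * With accessOrder = true, get()/put() move an entry to the tail,
 * and removeEldestEntry evicts the head (least recently accessed
 * entry) once maxCapacity is exceeded. NOT thread-safe as written.
 */
class LinkedHashLRUCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxCapacity;

    LinkedHashLRUCache(int maxCapacity) {
        // initialCapacity 16, loadFactor 0.75, accessOrder = true (LRU order)
        super(16, 0.75f, true);
        this.maxCapacity = maxCapacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // evict the least recently accessed entry when over capacity
        return size() > maxCapacity;
    }
}
```

For example, with capacity 2, putting "a" and "b", reading "a", and then putting "c" evicts "b", since "b" is now the least recently accessed entry.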
Object instantiation:
// 1. Create a new LRUCache instance by passing the maximum capacity (10 in the example below)
LRUCache<String, String> cache = new LRUCache<String, String>(10);
Insert objects into the cache with the put(K k, V v) method:
// inserts 10 objects to cache
for(int i=1; i<=10; i++) {
cache.put(String.format("key-%d", i), String.format("value-%d", i));
}
A utility method, printCache(), prints the objects currently in the cache (in order from least recently accessed to most recently accessed):
// print the cache objects
cache.printCache();
Fetch an object stored in the cache by its key with get(K k):
// access the first object and print the cache
cache.get("key-1");
// printing cache after accessing key-1
cache.printCache();
The use of caching frameworks and distributed caches has grown in recent years, driven by data-intensive applications. Applying these eviction policies becomes particularly interesting when memory is limited, when the stored objects are distributed across multiple nodes and eviction must be applied across that distribution, when large objects and binary data must be stored, and so on.
Redis and Memcached are two well-known distributed cache servers. Redis offers a bit more than a plain cache: it is a data-structure store, with options for disk persistence.
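For reference, Redis exposes its eviction behavior through configuration. A minimal redis.conf fragment enabling an LRU-style policy might look like this (the 100mb limit is an arbitrary example value):

```
# cap memory usage; beyond this limit, eviction kicks in
maxmemory 100mb
# evict approximately-least-recently-used keys across the whole keyspace
maxmemory-policy allkeys-lru
```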
The current implementation uses a single simple class to implement an LRU-based cache that stores objects as key-value pairs.
In the next iteration of this implementation, the plan is to build a small cache library with additional features, such as support for POJO objects and the Set interface, plus a factory pattern to create various caches and externalized eviction policies (LRU, MRU, time-based, and so on).
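As a rough illustration of that planned direction, the eviction policy could be externalized behind a factory. The `EvictionPolicy` names and the `CacheFactory` API below are assumptions for this sketch, not part of the implementation that follows; only the LRU path is filled in (backed here by an access-ordered LinkedHashMap for brevity):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of the planned factory; names are assumptions,
// not part of the LRUCache class below.
enum EvictionPolicy { LRU, MRU, TIME_BASED }

interface Cache<K, V> {
    void put(K key, V value);
    V get(K key);
}

final class CacheFactory {
    static <K, V> Cache<K, V> create(EvictionPolicy policy, int maxCapacity) {
        if (policy == EvictionPolicy.LRU) {
            // back the LRU policy with an access-ordered LinkedHashMap
            LinkedHashMap<K, V> backing = new LinkedHashMap<K, V>(16, 0.75f, true) {
                @Override
                protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                    return size() > maxCapacity;
                }
            };
            return new Cache<K, V>() {
                public void put(K key, V value) { backing.put(key, value); }
                public V get(K key) { return backing.get(key); }
            };
        }
        // MRU / time-based policies are left as placeholders in this sketch
        throw new UnsupportedOperationException(policy + " not implemented yet");
    }
}
```

Callers would then obtain a cache via `CacheFactory.create(EvictionPolicy.LRU, 10)`, keeping the eviction strategy out of the call site.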
package io.howto;
import java.util.concurrent.ConcurrentHashMap;
/**
*
* LRUCache: Class to cache objects based on LRU (Least Recently Used) cache eviction strategy,
* wherein if the cache size has reached the maximum allocated capacity, the
* least recently used objects in the cache will be evicted.
*
* See the runner function(a.k.a main(...)) to see the working.
*
* @author sunil
*
* @param <K>
* @param <V>
*
* <p>
* Date: 14/Nov/2019
* </p>
*/
public final class LRUCache <K,V> {
/**
* A doubly-linked-list implementation to save objects into the hashmap
* as key-value pairs.
*
* @author sunil
*
* @param <K>
* @param <V>
*/
private static class Node<K,V> {
private V value;
private K key;
private Node<K, V> next, prev;
public Node(K key, V value) {
this.key = key;
this.value = value;
}
@Override
public String toString() {
return value.toString();
}
}
/**
* The maximum number of elements that can be cached, should be set during instantiation
* time.
*/
private final int maxCapacity;
/**
* Use {@linkplain ConcurrentHashMap} here to maintain the cache of objects.
* Also this offers thread safe access of the cache.
*/
private ConcurrentHashMap<K, Node<K, V>> map;
/**
* A key-value representation of the cache object identified by a cache key.
* This is actually a doubly-linked list which maintains the most recently
* accessed objects (read/write) at the tail-end and the least read objects
* at the head.
*/
private Node<K, V> head, tail;
public LRUCache(int maxCapacity) {
this(16, maxCapacity);
}
public LRUCache(int initialCapacity, int maxCapacity) {
this.maxCapacity = maxCapacity;
if (initialCapacity > maxCapacity)
initialCapacity = maxCapacity;
map = new ConcurrentHashMap<>(initialCapacity);
}
/**
* Unlinks a node from the doubly-linked list.
* @param node
*/
private void removeNode(Node<K, V> node) {
if(node == null)
return;
if(node.prev != null) {
node.prev.next = node.next;
} else {
head = node.next;
}
if (node.next != null) {
node.next.prev = node.prev;
} else {
tail = node.prev;
}
}
/**
* Offers a node to the tail-end of the doubly-linked list because
* it was recently read or written.
* @param node
*/
private void offerNode(Node<K, V> node) {
if (node == null)
return;
if (head == null) {
head = tail = node;
} else {
tail.next = node;
node.prev = tail;
node.next = null;
tail = node;
}
}
/**
* Adds a new object to the cache. If the cache size has reached its capacity,
* then the least recently accessed object will be evicted.
*
* @param key
* @param value
*/
public void put(K key, V value) {
if (map.containsKey(key)) {
Node<K, V> node = map.get(key);
node.value = value;
removeNode(node);
offerNode(node);
} else {
if (map.size() == maxCapacity) {
System.out.println("maxCapacity of cache reached");
map.remove(head.key);
removeNode(head);
}
Node<K, V> node = new Node<K, V>(key, value);
offerNode(node);
map.put(key, node);
}
}
/**
* Fetches an object from the cache (could be null if no such mapping exists).
* If the object is found in the cache, then it will be moved to the tail-end of the
* doubly-linked list to indicate that it was recently accessed.
*
* @param key
* @return the cached value, or null if no mapping exists
*/
public V get(K key) {
Node<K, V> node = map.get(key);
if (node == null)
return null;
// move the node to the tail-end to mark it as most recently used
removeNode(node);
offerNode(node);
return node.value;
}
/**
* Utility function to print the cache objects.
*/
public void printCache() {
Node<K, V> curr = head;
while (curr != null) {
System.out.print(curr.value + " -> ");
curr = curr.next;
}
System.out.println();
}
/**
* Runner program to test the LRU cache
* @param args
*/
public static void main(String[] args) {
/**
* 1. create LRUCache of max capacity 10
* 2. insert 10 objects to cache
* 3. print the cache objects
* 4. access the first object and print the cache
* 5. insert new objects to cache
* 6. print the cache and observe that the least recently used
* objects are evicted
*/
// 1. initiate the cache with capacity 10
LRUCache<String, String> cache = new LRUCache<String, String>(10);
// 2. insert 10 objects to cache
for(int i=1; i<=10; i++) {
cache.put(String.format("key-%d", i), String.format("value-%d", i));
}
// 3. print the cache objects
System.out.println("printing cache:");
cache.printCache();
// 4. access the first object and print the cache
cache.get("key-1");
System.out.println("printing cache after accessing key-1:");
cache.printCache();
// 5. insert new objects to cache
for(int i=11; i<=15; i++) {
cache.put(String.format("key-%d", i), String.format("value-%d", i));
}
// 6. print the cache and observe that the least recently used objects are evicted
System.out.println("printing cache after adding new objects:");
cache.printCache();
}
}
Source: https://dzone.com/articles/java-based-simple-cache-lru-eviction