Package-level declarations

Types

interface Cache<K, V>
In general, a cache is a key/value store with a maximum storage capacity that exposes methods supporting CRUD operations. It provides temporary storage for a single copy of items that are expensive to create or load, for example via remote network calls.
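As a rough sketch, the CRUD-style surface of such a store might look like the following; the type and member names here are illustrative assumptions, not the exact declarations of the Cache interface.

// Illustrative only: a conceptual key/value cache with CRUD-style operations
// and a fixed maximum capacity. Names are assumptions, not the members of Cache.
interface KeyValueStore<K, V> {
    val maxSize: Int          // maximum storage capacity
    fun put(key: K, value: V) // create / update a temporary copy of an item
    fun get(key: K): V?       // read; null when the item is absent or evicted
    fun remove(key: K): V?    // delete a single entry
    fun clear()               // delete all entries
}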
abstract class CacheBase<K, V> : Cache<K, V>
Abstract class for Cache implementations.
interface CacheCostFactor<K>
In addition to the Cache Replacement Policy, the optional Cost Factor feature described here provides an eviction mechanism that lets a Cache implementation remove one or more existing cache entries so that a new item can be added (via put) when the aggregated cost exceeds the maximum total cost:
  1. During cache initialization, the cache implementation should provide method(s) that let the user specify the maximum total cost and a CacheCostFactor instance that implements onExceedMaxCost.
  2. When the put is called:
    • The cache implementation first calculates the aggregated cost by adding the current entry's cost to the costs of all existing entries (via getCostOfEntries).
    • If the aggregated cost is greater than the maximum total cost, the user's onExceedMaxCost method is invoked with a list of existing cache entry keys sorted in ascending cost order.
    • onExceedMaxCost returns a list of key(s) to remove.
    • The cache implementation removes the cache entries the user selected and proceeds with the put.
Note that the Cost Factor can also be used as a mechanism to pin down a certain entry that is deemed important regardless of the Cache Replacement Policy.
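The sketch below illustrates the callback side of this mechanism, including the pinning use case. The interface is reduced here to the single onExceedMaxCost member described above, so treat it as a simplified stand-in rather than the library's full CacheCostFactor declaration.

// Simplified stand-in for the library's CacheCostFactor: only the
// onExceedMaxCost member described above is modelled here.
interface CostFactor<K> {
    fun onExceedMaxCost(existingKeys: List<K>): List<K>
}

// A user-defined cost factor that evicts the cheapest entries first while
// pinning one important key regardless of the Cache Replacement Policy.
class PinningCostFactor<K>(private val pinnedKey: K) : CostFactor<K> {
    override fun onExceedMaxCost(existingKeys: List<K>): List<K> {
        // existingKeys arrives sorted in ascending cost order, so the first
        // non-pinned key is the cheapest candidate to remove.
        return existingKeys.filterNot { it == pinnedKey }.take(1)
    }
}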
class CacheEntry<K, V>
A read-only helper class used as the return value of getEntry.
class CompositeCache<K, V> : CacheBase<K, V>, CacheBuilder<K, V>
CompositeCache is a multi-level cache container that implements the Cache interface.
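The exact CompositeCache builder API is not reproduced here, but the multi-level idea can be sketched independently of it: consult the fastest level first, fall back to the next level on a miss, and backfill the faster level on a hit. The class and map types below are stand-ins, not part of this library.

// Conceptual two-level lookup, not the CompositeCache API. The maps stand in
// for an in-memory level and a slower persistent level.
class TwoLevelLookup<K, V>(
    private val fast: MutableMap<K, V> = mutableMapOf(),
    private val slow: MutableMap<K, V> = mutableMapOf(),
) {
    fun put(key: K, value: V) {
        fast[key] = value
        slow[key] = value
    }

    fun get(key: K): V? =
        // Hit the fast level first; on a miss, read the slow level and
        // backfill the fast level so subsequent reads stay cheap.
        fast[key] ?: slow[key]?.also { fast[key] = it }
}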
class MemoryCache<K, V> : CacheBase<K, V>, CacheBuilder<K, V>
MemoryCache is a fixed size in-memory cache that uses weak reference keys to store a set of values with the following features and options:
  1. LRU (Least Recently Used) Cache Replacement Policy: Each time a value is accessed, its entry is moved to the tail of an access-ordered queue. When an entry is added to a full cache, the entry at the head of that queue, that is, the least recently used entry, is evicted to make room for the new entry.
  2. Clear on Memory Error Option: When this option is turned on, a low-memory situation detected by the application causes the entire cache to be cleared.
  3. Cost Factor Option: The cache can be configured with a maximum cost, and a cost provided by the user is associated with each cache entry during the put operation.

    In addition to the LRU Replacement Policy, when the aggregated cost exceeds the maximum cost, one or more cache entries are removed, based on system or user-defined eviction criteria, to make space before the current entry is added to the cache:

    • System Eviction Criteria: the existing entry with the highest cost is removed.
    • Customized Eviction Criteria: a list of existing entry keys, sorted in ascending cost order, is presented to the user via the callback interface CacheCostFactor; the user-defined callback method List<K> onExceedMaxCost(final List<K> existingKeys) selects and returns one or more keys from the input list to evict.
Before interacting with the MemoryCache, you need to instantiate and configure it.
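MemoryCache's actual builder methods for size, memory-error handling, and cost configuration are not reproduced here; as a behavioural sketch of the LRU policy alone, an access-ordered LinkedHashMap shows the same eviction order described above.

// LRU behaviour sketch using java.util.LinkedHashMap in access order;
// this illustrates the replacement policy, not MemoryCache's implementation.
fun <K, V> lruMap(maxSize: Int): LinkedHashMap<K, V> =
    object : LinkedHashMap<K, V>(16, 0.75f, /* accessOrder = */ true) {
        override fun removeEldestEntry(eldest: MutableMap.MutableEntry<K, V>?): Boolean =
            size > maxSize // evict the least recently used entry when full
    }

fun main() {
    val cache = lruMap<String, Int>(maxSize = 2)
    cache["a"] = 1
    cache["b"] = 2
    cache["a"]          // accessing "a" makes "b" the least recently used entry
    cache["c"] = 3      // adding to a full cache evicts "b"
    println(cache.keys) // [a, c]
}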
class SecureStoreCache<K, V : Serializable?> : CacheBase<K, V>
A SecureStoreCache is a fixed size LRU (Least Recently Used) cache that contains key-value pairs persisted in an encrypted database.