C# - Default memory cache with LRU policy
I am trying to implement caching in my application and want to use the default memory cache in C# (this requirement can be changed if it won't work). The problem is that I don't want to exceed the maximum amount of physical memory I have on the machine, and as I understand it I can't add such a constraint to the default memory cache.
In general the policy is:
- if an object has been in the cache for 10 minutes with no requests, it is removed
- if a new object is added to the cache and the maximum amount of available physical memory is close to being used, elements are removed based on LRU
My cache can contain many different objects that range from 10 MB to 2-3 GB, so I can't really get the trim function to work.
Are there any suggestions on how to implement an LRU cache that monitors RAM usage? Hopefully it can be done using the caches available in .NET?
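To make the policy concrete, the behavior I'm after would be roughly something like the sketch below. It tracks an approximate byte budget supplied by the caller rather than actual physical memory, locking is omitted, and the class and member names are just placeholders, not code I already have:

using System;
using System.Collections.Generic;

// Rough sketch only: a size-bounded LRU cache. The byte budget, idle limit
// and caller-supplied entry sizes are placeholders, not part of MemoryCache.
class SizeBoundedLruCache
{
    private class Entry
    {
        public string Key;
        public object Value;
        public long Bytes;
        public DateTime LastAccess;
    }

    private readonly long maxBytes;
    private readonly TimeSpan idleLimit;
    private readonly Dictionary<string, LinkedListNode<Entry>> map =
        new Dictionary<string, LinkedListNode<Entry>>();
    private readonly LinkedList<Entry> lru = new LinkedList<Entry>(); // front = most recently used
    private long currentBytes;

    public SizeBoundedLruCache(long maxBytes, TimeSpan idleLimit)
    {
        this.maxBytes = maxBytes;
        this.idleLimit = idleLimit;
    }

    public void Set(string key, object value, long bytes)
    {
        Remove(key); // replace any existing entry under the same key
        var node = new LinkedListNode<Entry>(
            new Entry { Key = key, Value = value, Bytes = bytes, LastAccess = DateTime.UtcNow });
        lru.AddFirst(node);
        map[key] = node;
        currentBytes += bytes;
        Evict();
    }

    public bool TryGet(string key, out object value)
    {
        LinkedListNode<Entry> node;
        if (map.TryGetValue(key, out node))
        {
            node.Value.LastAccess = DateTime.UtcNow;
            lru.Remove(node);      // touching an entry moves it to the front
            lru.AddFirst(node);
            value = node.Value.Value;
            return true;
        }
        value = null;
        return false;
    }

    public void Remove(string key)
    {
        LinkedListNode<Entry> node;
        if (!map.TryGetValue(key, out node)) return;
        map.Remove(key);
        lru.Remove(node);
        currentBytes -= node.Value.Bytes;
    }

    private void Evict()
    {
        // Drop entries idle longer than the limit, then trim least recently
        // used entries until the byte budget is respected.
        var cutoff = DateTime.UtcNow - idleLimit;
        while (lru.Last != null &&
               (lru.Last.Value.LastAccess < cutoff || currentBytes > maxBytes))
        {
            Remove(lru.Last.Value.Key);
        }
    }
}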
EDIT:
I've added a simple example where the MemoryCache is limited to 100 MB and 20% of physical memory, but that does not change anything. The memory fills up with no removal of cache entries. Note that the polling interval is changed to every 5 seconds.
using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Runtime.Caching;

class Item
{
    private List<Guid> data;

    public Item(int capacity)
    {
        this.data = new List<Guid>(capacity);
        for (var i = 0; i < capacity; i++)
            data.Add(Guid.NewGuid());
    }
}

class Program
{
    static void Main(string[] args)
    {
        var cache = new MemoryCache(
            "MyCache",
            new NameValueCollection
            {
                { "CacheMemoryLimitMegabytes", "100" },
                { "PhysicalMemoryLimitPercentage", "20" },
                { "PollingInterval", "00:00:05" }
            });

        for (var i = 0; i < 10000; i++)
        {
            var key = string.Format("Key{0}", i);
            Console.WriteLine("Add item: {0}", key);

            cache.Set(key, new Item(1000000), new CacheItemPolicy()
            {
                UpdateCallback = UpdateHandler
            });
        }

        Console.WriteLine("\nDone");
        Console.ReadKey();
    }

    public static void UpdateHandler(CacheEntryUpdateArguments args)
    {
        Console.WriteLine("Remove: {0}, Reason: {1}", args.Key, args.RemovedReason.ToString());
    }
}
It looks like the System.Runtime.Caching.MemoryCache class would fit the bill nicely. You set the caching policy on a per-item basis, so if you add an item with a cache policy whose SlidingExpiration is a TimeSpan of 10 minutes, you should get the behavior you are looking for.
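For example, something along these lines should drop an entry after 10 minutes without a read (the key and payload below are placeholders):

using System;
using System.Runtime.Caching;

class SlidingExpirationExample
{
    static void Main()
    {
        var cache = MemoryCache.Default;

        // Placeholder payload; in practice this would be one of your 10 MB - 3 GB objects.
        var payload = new byte[10 * 1024 * 1024];

        var policy = new CacheItemPolicy
        {
            // The entry is removed if it has not been accessed for 10 minutes.
            SlidingExpiration = TimeSpan.FromMinutes(10)
        };

        cache.Set("report-123", payload, policy);

        // Every successful read resets the 10-minute window.
        var hit = cache.Get("report-123");
        Console.WriteLine(hit != null ? "still cached" : "evicted");
    }
}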
This is a .NET v4 class only; it doesn't exist on earlier runtime versions. If you are in a web context, the ASP.NET cache behaves similarly, but it won't let you manage system memory in the same way.
You can set limits on the cache size when you create it so that it does not exceed a certain memory footprint:
var myCache = new MemoryCache(
    "MyCache",
    new NameValueCollection
    {
        { "PhysicalMemoryLimitPercentage", "50" }  // set max memory percentage
    });
This should prevent paging to disk, at least from within your application. If there is outside memory pressure, or the system is overly aggressive about paging memory to disk, the cache may still be paged out, but not due to overuse within your application. To my knowledge there is no way to reserve memory pages in C#.