I'm building a custom C# DNN module and have implemented caching:
public static List<TestInfo> GetCached(int ModuleId, HttpContext myContext, ControlCachePolicy myCachePolicy)
{
    string key = "TestCache" + ModuleId.ToString();
    object cacheItems = myContext.Cache.Get(key);
    if (cacheItems == null)
    {
        TestController controller = new TestController();
        cacheItems = controller.GetTest(ModuleId);

        // Default to a two-hour absolute expiration.
        DateTime cacheDuration = DateTime.Now.AddHours(2);
        if (myCachePolicy.SupportsCaching)
        {
            // Use the full Duration TimeSpan; Duration.Hours would drop
            // the minutes and seconds components of the configured duration.
            cacheDuration = DateTime.Now.Add(myCachePolicy.Duration);
        }
        myContext.Cache.Insert(key, cacheItems, null, cacheDuration, Cache.NoSlidingExpiration);
    }
    return (List<TestInfo>)cacheItems;
}
What I am storing in the cache, a List<TestInfo>, is relatively small compared to what I would like to cache. Each item in this list contains an ItemId. Using the ItemId, I call the database to retrieve an object. These objects can be large, and I can see cases where 100, 200, or more of them would be cached.
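To make the question concrete, this is roughly what I have in mind for the per-item caching. It is only a sketch: GetItem and ItemInfo are placeholders for my actual controller method and object type, and the 20-minute sliding window is an arbitrary choice.

    public static ItemInfo GetItemCached(int ItemId, HttpContext myContext)
    {
        string key = "TestItemCache" + ItemId.ToString();
        ItemInfo item = myContext.Cache.Get(key) as ItemInfo;
        if (item == null)
        {
            TestController controller = new TestController();
            item = controller.GetItem(ItemId); // placeholder for my actual DB call

            // Sliding expiration instead of absolute, so rarely-used items
            // expire on their own; Normal priority lets ASP.NET evict them
            // first when the server comes under memory pressure.
            myContext.Cache.Insert(key, item, null,
                Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20),
                CacheItemPriority.Normal, null);
        }
        return item;
    }

The idea is one cache entry per ItemId rather than one giant entry, so each object can be loaded and evicted independently.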
Does it make sense to cache that many large objects? What would be the downside of doing so? I know the upside would be fewer calls to stored procedures in the database.