ASP.NET Core Memory Cache in Practice: The Complete Guide to Configuration and Pitfall Avoidance
Introduction
In this article, let's talk about ASP.NET Core's memory caching. ASP.NET Core Memory Cache (IMemoryCache) is a lightweight caching solution suitable for single-instance applications or local caching in distributed environments. It provides a simple API for storing and retrieving data while supporting expiration policies, priority settings, and other features.
What Is Caching?
From a user request to the database returning data is a long process (exaggerating a bit; usually just tens to hundreds of milliseconds). But there is more than one user making requests, and even the same user may issue several similar requests in a short time. Completing the entire flow every time is wasteful.
The purpose of caching is to store previous request results, so when the same request comes again, cached results can be returned directly, eliminating the overhead of repeated calculations and database access. Therefore, caching is a storage mechanism used to improve performance and response speed.
Cache Types in ASP.NET Core
ASP.NET Core provides three common caching solutions:
- Memory Cache (IMemoryCache): Suitable for single-instance applications or local caching in distributed environments.
- Distributed Cache (IDistributedCache): Suitable for shared caching in distributed environments, with common implementations including Redis, SQL Server, etc.
- Hybrid Cache (HybridCache, introduced in .NET 9): Combines memory cache and distributed cache, checking the memory cache first and falling back to the distributed cache on a miss.
Each type of cache has its own usage scenarios and applicable boundaries. Choosing the appropriate caching solution is very important.
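For orientation, here is a minimal sketch of how each option is registered in Program.cs. The Redis and HybridCache registrations assume the Microsoft.Extensions.Caching.StackExchangeRedis and Microsoft.Extensions.Caching.Hybrid packages are referenced, and the connection string is purely illustrative:

```csharp
var builder = WebApplication.CreateBuilder(args);

// In-process memory cache (IMemoryCache)
builder.Services.AddMemoryCache();

// Distributed cache (IDistributedCache) backed by Redis;
// requires the Microsoft.Extensions.Caching.StackExchangeRedis package
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379"; // illustrative connection string
});

// HybridCache (memory + distributed, .NET 9+);
// requires the Microsoft.Extensions.Caching.Hybrid package
builder.Services.AddHybridCache();

var app = builder.Build();
```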
ASP.NET Core's Memory Cache IMemoryCache
ASP.NET Core's memory cache uses local memory to temporarily store data, so its access speed is very fast—typically much faster than network and database access (specific latency depends on data size and serialization overhead). However, memory cache also has some limitations; it cannot share data across multiple instances. Additionally, memory cache data is lost with application restarts, so it's more suitable for storing temporary data or data that doesn't need persistence.
Because the cache lives in the local process's memory, it consumes server memory directly. If the cached data volume is too large or expiration policies are set improperly, it can lead to increased memory pressure, frequent garbage collection, and performance problems. Therefore, when using memory cache, pay attention to the following points:
- Don't use unvalidated external input as cache keys: an attacker who controls the key space can flood the cache with unique entries and consume unbounded memory.
- Set reasonable expiration times to limit cache growth.
- Limit cache size to avoid occupying excessive memory resources.
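To make the first point above concrete, here is a minimal sketch of one way to keep externally supplied values out of the key space: validate the input against a strict pattern and namespace the resulting key. The names `CacheKeys`, `IsSafeCacheKey`, and the `product:` prefix are illustrative, not part of any framework API:

```csharp
using System;
using System.Text.RegularExpressions;

public static class CacheKeys
{
    // Allow only short alphanumeric identifiers, so external input
    // cannot generate an unbounded number of distinct cache keys.
    private static readonly Regex SafePattern =
        new(@"^[A-Za-z0-9_-]{1,64}$", RegexOptions.Compiled);

    public static bool IsSafeCacheKey(string input) => SafePattern.IsMatch(input);

    // Namespace the key so it cannot collide with other cache users.
    public static string ForProduct(string id) =>
        IsSafeCacheKey(id)
            ? $"product:{id}"
            : throw new ArgumentException("Unsafe cache key", nameof(id));
}
```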
Using IMemoryCache in ASP.NET Core
Using IMemoryCache in ASP.NET Core is very simple. First, you need to register the service in Program.cs:
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddMemoryCache();
Then inject IMemoryCache where you need to use caching:
public class MyService
{
private readonly IMemoryCache _cache;
public MyService(IMemoryCache cache)
{
_cache = cache;
}
public async Task<string> GetDataAsync(string key)
{
if (_cache.TryGetValue(key, out string value))
{
return value; // Get data from cache
}
else
{
value = await FetchDataFromDatabaseAsync(key); // Get data from database
_cache.Set(key, value, TimeSpan.FromMinutes(5)); // Store data in cache with 5-minute expiration
return value;
}
}
}
In the example above, we first try to get data from cache. If cache hits, we return directly; otherwise, we get data from the database and store the result in cache with a 5-minute expiration. This way, when the same request comes next time, data can be retrieved directly from cache, improving performance and response speed.
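One caveat with this check-then-set pattern: under load, several concurrent requests can all miss the same key at once and all hit the database (a "cache stampede"). Below is a hedged sketch of one common mitigation, serializing loads per key with SemaphoreSlim. The class and method names are illustrative, not part of IMemoryCache:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

public class StampedeGuardedCache
{
    private readonly IMemoryCache _cache;
    private readonly ConcurrentDictionary<string, SemaphoreSlim> _locks = new();

    public StampedeGuardedCache(IMemoryCache cache) => _cache = cache;

    public async Task<T> GetOrLoadAsync<T>(string key, Func<Task<T>> loader)
    {
        if (_cache.TryGetValue(key, out T value))
            return value;

        // One semaphore per key: only one caller runs the loader at a time.
        var gate = _locks.GetOrAdd(key, _ => new SemaphoreSlim(1, 1));
        await gate.WaitAsync();
        try
        {
            // Re-check: another caller may have populated the cache while we waited.
            if (_cache.TryGetValue(key, out value))
                return value;

            value = await loader();
            _cache.Set(key, value, TimeSpan.FromMinutes(5));
            return value;
        }
        finally
        {
            gate.Release();
        }
    }
}
```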
Besides the manual try-get-set pattern above, IMemoryCache also provides the GetOrCreateAsync extension method, which implements the same functionality more concisely. This approach is generally preferred:
public async Task<string> GetDataAsync(string key)
{
return await _cache.GetOrCreateAsync(key, async entry =>
{
entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5); // Set expiration time
return await FetchDataFromDatabaseAsync(key); // Get data from database
});
}
IMemoryCache Optimization Techniques
When using memory cache, there are some optimization techniques that can help us manage cache better:
1. Use Sliding Expiration Policy
Use sliding expiration to extend a cache item's lifetime. Sliding expiration resets the expiration countdown each time the cache item is accessed, so frequently accessed data doesn't expire too early. Note that a sliding expiration alone can keep a hot item alive indefinitely; combine it with an absolute expiration if you need to cap the total lifetime.
Example:
_cache.Set(key, value, new MemoryCacheEntryOptions
{
SlidingExpiration = TimeSpan.FromMinutes(5) // Expiration resets to 5 minutes after each access
});
2. Use Absolute Expiration Policy
Use absolute expiration to set the maximum lifecycle of cache items. Absolute expiration expires at a specified time point, regardless of whether it's been accessed.
Example:
_cache.Set(key, value, new MemoryCacheEntryOptions
{
AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30) // Expires after 30 minutes
});
3. Limit Total Cache Size and Set Size for Each Cache Item
Once SizeLimit is enabled, every cache entry written must explicitly set Size; otherwise an InvalidOperationException is thrown at runtime.
Example:
builder.Services.AddMemoryCache(options =>
{
options.SizeLimit = 1024; // Total cache capacity (unit defined by business)
});
_cache.Set(key, value, new MemoryCacheEntryOptions
{
Size = 1 // Current cache item occupies 1 unit
});
4. Set Cache Item Priority
Ensure important data isn't removed too early by setting priority:
_cache.Set(key, value, new MemoryCacheEntryOptions
{
Size = 1,
Priority = CacheItemPriority.NeverRemove // Set never-remove priority
});
5. Use Callback Functions for Cache Item Removal Logic
Utilize callback functions to handle logic when cache items are removed, such as logging or cleaning related resources:
_cache.Set(key, value, new MemoryCacheEntryOptions
{
PostEvictionCallbacks =
{
new PostEvictionCallbackRegistration
{
EvictionCallback = (k, v, reason, state) =>
{
Console.WriteLine($"Cache item {k} was removed, reason: {reason}");
}
}
}
});
6. Compress Cache Data to Reduce Memory Usage
You can compress data before storing it in the cache, for example with the built-in System.IO.Compression APIs.
This is only suitable for larger data objects with modest access-performance requirements, because compression and decompression add CPU overhead on every read and write; for small or hot items, trading computation for a little memory isn't worth it.
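The Compress helper used in the example below is not a framework API; here is a minimal sketch of one possible implementation using GZipStream from System.IO.Compression, with a matching Decompress for the round trip (names are illustrative):

```csharp
using System.IO;
using System.IO.Compression;
using System.Text;

public static class CacheCompression
{
    public static byte[] Compress(string value)
    {
        var raw = Encoding.UTF8.GetBytes(value);
        using var output = new MemoryStream();
        using (var gzip = new GZipStream(output, CompressionLevel.Fastest))
        {
            gzip.Write(raw, 0, raw.Length);
        } // disposing the GZipStream flushes the remaining compressed data
        return output.ToArray();
    }

    public static string Decompress(byte[] compressed)
    {
        using var input = new MemoryStream(compressed);
        using var gzip = new GZipStream(input, CompressionMode.Decompress);
        using var reader = new StreamReader(gzip, Encoding.UTF8);
        return reader.ReadToEnd();
    }
}
```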
Example:
var compressedValue = Compress(value); // Compress data
_cache.Set(key, compressedValue, new MemoryCacheEntryOptions
{
Size = compressedValue.Length // Set cache item size to compressed length
});Summary
ASP.NET Core memory cache is powerful, but it has its limitations. When using memory cache, pay attention to reasonably setting expiration policies, limiting cache size, and other issues. Through proper use of memory cache, you can significantly improve application performance and response speed.
Hope this article helps you better understand and use ASP.NET Core memory cache.