ASP.NET Core Memory Cache in Practice: Complete Guide to Configuration and Pitfalls
Introduction
In this article, let's talk about ASP.NET Core's memory cache. IMemoryCache is a lightweight caching solution suited to single-instance applications or local caching in distributed environments. It provides simple APIs for storing and retrieving data, along with expiration policies, priority settings, and other features.
What is Caching?
The journey from a user request to the database returning data is a long one (exaggerating a bit; usually just tens to hundreds of milliseconds). But more than one user is making requests, and even the same user may issue several similar requests within a short time. Running the entire flow every time would be very wasteful.
The purpose of caching is to store previous request results and directly return cached results when the same request occurs again, eliminating redundant computation and database access overhead. Therefore, caching is a storage mechanism used to improve performance and response speed.
Cache Types in ASP.NET Core
ASP.NET Core provides three common caching solutions:
- Memory Cache (IMemoryCache): Suitable for single-instance applications or local caching in distributed environments.
- Distributed Cache (IDistributedCache): Suitable for shared caching in distributed environments. Common implementations include Redis, SQL Server, etc.
- Hybrid Cache: Combines memory cache and distributed cache: the memory cache is checked first, and on a miss the distributed cache is consulted.
Each type of cache has its own usage scenarios and applicable boundaries. Choosing the right caching solution is very important.
ASP.NET Core's Memory Cache IMemoryCache
ASP.NET Core's memory cache uses local memory to temporarily store data, so its access speed is very fast—usually far faster than network and database access (specific latency depends on data size and serialization overhead).
However, memory cache also has some limitations:
- Cannot share data across multiple instances
- Cache data is lost with application restart
- Occupies server memory resources
Therefore, it's more suitable for storing temporary data or data that doesn't need persistence.
Important Considerations:
If cached data volume is too large or expiration policies are set improperly, it may lead to increased memory pressure, frequent garbage collection, or performance issues. When using memory cache, pay attention to the following:
- Don't use external input as cache keys directly: Unvalidated input can create an unbounded number of entries, consuming unpredictable amounts of memory and enabling cache-flooding attacks or misuse.
- Set reasonable expiration times: Limit cache growth.
- Limit cache size: Avoid consuming too much memory.
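On the first point, one mitigation is to derive a fixed-size key from untrusted input instead of using it verbatim. Below is a minimal sketch; the `ToCacheKey` helper is hypothetical, and note that hashing bounds key *length* but not the number of distinct keys, so input validation is still needed:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Hypothetical helper: map untrusted input to a fixed-length cache key.
static string ToCacheKey(string prefix, string untrustedInput)
{
    // Normalize so trivially different inputs ("Foo" vs. " foo ") share one entry.
    var normalized = untrustedInput.Trim().ToLowerInvariant();

    // SHA-256 yields a fixed 64-hex-character key regardless of input length.
    var hash = SHA256.HashData(Encoding.UTF8.GetBytes(normalized));
    return $"{prefix}:{Convert.ToHexString(hash)}";
}

Console.WriteLine(ToCacheKey("user", "  Alice "));
```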
Using IMemoryCache in ASP.NET Core
Using IMemoryCache in ASP.NET Core is very simple.
Step 1: Register Service
First, register the service in Program.cs:
```csharp
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddMemoryCache();
```
Step 2: Inject and Use
Then inject IMemoryCache where you need to use caching:
```csharp
public class MyService
{
    private readonly IMemoryCache _cache;

    public MyService(IMemoryCache cache)
    {
        _cache = cache;
    }

    public async Task<string> GetDataAsync(string key)
    {
        if (_cache.TryGetValue(key, out string value))
        {
            return value; // Cache hit: return directly
        }

        value = await FetchDataFromDatabaseAsync(key); // Cache miss: fetch from database
        _cache.Set(key, value, TimeSpan.FromMinutes(5)); // Cache with a 5-minute expiration
        return value;
    }
}
```
In the example above, we first try to read the data from the cache. On a hit, it is returned directly; otherwise we fetch it from the database and store the result in the cache with a 5-minute expiration. The next time the same request arrives, the data can be served straight from the cache, improving performance and response speed.
Recommended: GetOrCreateAsync Pattern
Besides the manual try-get-set pattern above, IMemoryCache also provides the GetOrCreateAsync extension method for a more concise implementation, which is generally preferred:
```csharp
public async Task<string> GetDataAsync(string key)
{
    return await _cache.GetOrCreateAsync(key, async entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5); // Set expiration
        return await FetchDataFromDatabaseAsync(key); // Fetch from database
    });
}
```
IMemoryCache Optimization Techniques
When using memory cache, there are several optimization techniques to help manage cache better:
1. Sliding Expiration Policy
Use sliding expiration to extend cache item lifecycle. Sliding expiration resets expiration time on each cache access, ensuring frequently accessed data doesn't expire too early.
Example:
```csharp
_cache.Set(key, value, new MemoryCacheEntryOptions
{
    SlidingExpiration = TimeSpan.FromMinutes(5) // Reset to 5 minutes after each access
});
```
Use Case: Perfect for data that's accessed repeatedly within a time window but should expire after a period of inactivity.
2. Absolute Expiration Policy
Use absolute expiration to set maximum cache item lifecycle. Absolute expiration expires at a specified time point regardless of access.
Example:
```csharp
_cache.Set(key, value, new MemoryCacheEntryOptions
{
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30) // Expire after 30 minutes
});
```
Use Case: Ideal for data that should refresh at regular intervals regardless of usage patterns.
3. Combine Both Policies
You can combine sliding and absolute expiration:
```csharp
_cache.Set(key, value, new MemoryCacheEntryOptions
{
    SlidingExpiration = TimeSpan.FromMinutes(5), // Reset on access
    AbsoluteExpirationRelativeToNow = TimeSpan.FromHours(1) // But expire after 1 hour max
});
```
This ensures data doesn't stay cached forever while still benefiting from sliding expiration for active data.
4. Limit Cache Total Size
Limit the total cache size and set a size for each cache item. Once SizeLimit is enabled, every cache entry written must explicitly set Size; otherwise an InvalidOperationException is thrown at runtime.
Configuration:
```csharp
builder.Services.AddMemoryCache(options =>
{
    options.SizeLimit = 1024; // Total cache capacity (unit defined by your application)
});
```
Setting Item Size:
```csharp
_cache.Set(key, value, new MemoryCacheEntryOptions
{
    Size = 1 // This cache item occupies 1 unit
});
```
Best Practice: Define size based on actual memory consumption or business importance. For example, large objects could have a size proportional to their byte size.
5. Set Cache Item Priority
Set priority for cache items to ensure important data isn't removed too early:
```csharp
_cache.Set(key, value, new MemoryCacheEntryOptions
{
    Size = 1,
    Priority = CacheItemPriority.NeverRemove // Never evicted under memory pressure
});
```
Available Priorities:
- Low: Removed first when the cache needs space
- Normal: Default priority
- High: Removed after Low and Normal
- NeverRemove: Never removed due to memory pressure (use sparingly!)
6. Use Callbacks for Eviction Handling
Utilize callback functions to handle logic when cache items are removed, such as logging or cleaning related resources:
```csharp
_cache.Set(key, value, new MemoryCacheEntryOptions
{
    PostEvictionCallbacks =
    {
        new PostEvictionCallbackRegistration
        {
            EvictionCallback = (k, v, reason, state) =>
            {
                Console.WriteLine($"Cache item {k} removed, reason: {reason}");
            }
        }
    }
});
```
Eviction Reasons (values of the EvictionReason enum):
- Removed: Explicitly removed
- Replaced: Replaced by a new value
- Expired: Expired under a time policy
- Capacity: Removed because the cache exceeded its size limit
7. Compress Cache Data
Compress cached data to reduce memory usage. You can use the framework's built-in System.IO.Compression APIs (or a third-party compression library) to compress data before storing it in the cache.
When to Use: This is only suitable for larger data objects with relaxed access-performance requirements, because compression and decompression add CPU overhead. For small or frequently accessed data, trading computation for a little memory isn't worth it.
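The `Compress` call in the example below isn't defined in the article; a minimal sketch using the framework's `GZipStream` (helper names are placeholders) might look like:

```csharp
using System.IO;
using System.IO.Compression;
using System.Text;

static byte[] Compress(string value)
{
    using var output = new MemoryStream();
    using (var gzip = new GZipStream(output, CompressionLevel.Fastest))
    {
        var bytes = Encoding.UTF8.GetBytes(value);
        gzip.Write(bytes, 0, bytes.Length);
    } // Disposing the GZipStream flushes the gzip footer into the buffer.
    return output.ToArray();
}

static string Decompress(byte[] compressed)
{
    using var gzip = new GZipStream(new MemoryStream(compressed), CompressionMode.Decompress);
    using var reader = new StreamReader(gzip, Encoding.UTF8);
    return reader.ReadToEnd();
}
```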
Example:
```csharp
var compressedValue = Compress(value); // Compress data
_cache.Set(key, compressedValue, new MemoryCacheEntryOptions
{
    Size = compressedValue.Length // Set cache item size to the compressed length
});
```
Common Pitfalls and Solutions
Pitfall 1: Cache Stampede
Problem: When a popular cache item expires, multiple requests simultaneously try to regenerate it, causing database overload.
Solution: Use cache-aside pattern with locking:
```csharp
// One semaphore per key, shared by all requests for that key. A plain
// ConcurrentDictionary is safer than caching the semaphore itself: a cached
// lock entry could be evicted while a request still holds it, letting two
// callers rebuild the same entry concurrently.
private static readonly ConcurrentDictionary<string, SemaphoreSlim> _locks = new();

public async Task<string> GetDataAsync(string key)
{
    if (_cache.TryGetValue(key, out string value))
    {
        return value;
    }

    var semaphore = _locks.GetOrAdd(key, _ => new SemaphoreSlim(1, 1));

    await semaphore.WaitAsync();
    try
    {
        // Double-check after acquiring the lock: another request may have
        // repopulated the cache while we were waiting.
        if (_cache.TryGetValue(key, out value))
        {
            return value;
        }

        value = await FetchDataFromDatabaseAsync(key);
        _cache.Set(key, value, TimeSpan.FromMinutes(5));
        return value;
    }
    finally
    {
        semaphore.Release();
    }
}
```
Note that the lock dictionary grows with the number of distinct keys; for an unbounded key space you would also need to prune it.
Pitfall 2: Memory Leak
Problem: Cache grows indefinitely without proper expiration.
Solution: Always set expiration policies and consider using SizeLimit.
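As a sketch, a defensive registration can combine a size limit with compaction and scan settings (the values here are illustrative, not recommendations): `CompactionPercentage` controls what fraction of entries is evicted when the size limit is reached, and `ExpirationScanFrequency` controls how often expired entries are swept.

```csharp
builder.Services.AddMemoryCache(options =>
{
    options.SizeLimit = 1024;                                  // Cap total cache size
    options.CompactionPercentage = 0.25;                       // Evict 25% of entries when full
    options.ExpirationScanFrequency = TimeSpan.FromMinutes(1); // Sweep expired entries every minute
});
```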
Pitfall 3: Cache Inconsistency
Problem: Cache data becomes stale when underlying data changes.
Solution: Implement cache invalidation strategy:
```csharp
// Invalidate cache when data changes
public async Task UpdateDataAsync(string key, string newValue)
{
    await SaveToDatabaseAsync(key, newValue);
    _cache.Remove(key); // Remove the stale cache entry

    // Or update the cache directly:
    // _cache.Set(key, newValue, TimeSpan.FromMinutes(5));
}
```
Pitfall 4: Using Mutable Objects
Problem: Caching mutable objects can lead to unexpected behavior if the object is modified after caching.
Solution: Cache immutable objects or deep copies:
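One way to rule out post-caching mutation entirely is to cache an immutable type. In C#, a positional record works well: a hypothetical `Product`, for instance, can only be "changed" by creating a copy with `with`, so a cached instance is never touched.

```csharp
var original = new Product(1, "Widget", 9.99m);

// "Updating" produces a new instance; the cached original is untouched.
var discounted = original with { Price = 7.99m };

Console.WriteLine(original.Price);   // still 9.99
Console.WriteLine(discounted.Price); // 7.99

// Positional records get init-only properties, so instances
// can't be modified in place after construction.
public record Product(int Id, string Name, decimal Price);
```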
```csharp
// Cache a copy, not the original
var cachedValue = originalValue.Clone(); // Or serialize/deserialize
_cache.Set(key, cachedValue, TimeSpan.FromMinutes(5));
```
Performance Monitoring
Monitor cache performance to optimize configuration:
Key Metrics to Track
- Hit Rate: Percentage of requests served from cache. Formula: Hit Rate = Cache Hits / (Cache Hits + Cache Misses). Target: >80% for most scenarios.
- Miss Rate: Percentage of requests requiring a data fetch. Formula: Miss Rate = Cache Misses / Total Requests.
- Eviction Rate: How often items are removed before expiration. A high eviction rate may indicate memory pressure.
- Average Response Time: Compare cached vs. non-cached requests.
Implementation Example
```csharp
public class CacheMetrics
{
    private int _hits;
    private int _misses;

    public void RecordHit() => Interlocked.Increment(ref _hits);
    public void RecordMiss() => Interlocked.Increment(ref _misses);

    // Guard against 0/0 (which would yield NaN) before any requests are recorded.
    public double HitRate
    {
        get
        {
            var total = _hits + _misses;
            return total == 0 ? 0 : (double)_hits / total;
        }
    }
}
```
When to Use Memory Cache vs. Distributed Cache
Use Memory Cache When:
- Single instance application
- Data can be different across instances
- Fastest possible access is critical
- Data is specific to the instance
- Simple setup is preferred
Use Distributed Cache When:
- Multiple application instances
- Data must be consistent across instances
- Application may restart frequently
- Cache needs to survive application restarts
- Large cache size exceeds single instance memory
Hybrid Approach
Consider hybrid caching for best of both worlds:
```csharp
public async Task<string> GetDataAsync(string key)
{
    // Try the local memory cache first
    if (_memoryCache.TryGetValue(key, out string value))
    {
        return value;
    }

    // Then try the distributed cache
    var distributedValue = await _distributedCache.GetStringAsync(key);
    if (distributedValue != null)
    {
        // Populate the local cache
        _memoryCache.Set(key, distributedValue, TimeSpan.FromMinutes(5));
        return distributedValue;
    }

    // Fetch from the database
    value = await FetchDataFromDatabaseAsync(key);

    // Populate both caches
    _memoryCache.Set(key, value, TimeSpan.FromMinutes(5));
    await _distributedCache.SetStringAsync(key, value,
        new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30)
        });

    return value;
}
```
Summary
ASP.NET Core memory cache is powerful but has its limitations. When using memory cache, pay attention to:
- Setting reasonable expiration policies
- Limiting cache size
- Monitoring cache performance
- Implementing proper invalidation strategies
Through proper use of memory cache, you can significantly improve application performance and response speed.
Key Takeaways:
- Start Simple: Use GetOrCreateAsync pattern for clean code
- Set Expiration: Always configure expiration policies
- Monitor: Track hit rates and memory usage
- Plan for Growth: Consider SizeLimit for production
- Know When to Scale: Move to distributed cache when needed
Hope this article helps you better understand and use ASP.NET Core memory cache.
Further Reading:
- Microsoft Docs: IMemoryCache Interface
- ASP.NET Core Caching Best Practices
- Distributed Caching in ASP.NET Core