Five Ways To Add Caching To Your ASP.NET App
Caching has been a fundamental feature of ASP.NET since its inception. From response caching to tag caching and data source caching, there are many solutions available for each layer of your application. This article gives you an overview of these techniques and the challenges they present. We also propose solutions for cache invalidation and robust cache key generation, as well as tools to avoid boilerplate code and keep your code clean.
1. Caching the HTTP response
HTTP Response Caching has been part of ASP.NET since version 1.0. In ASP.NET Core, it is implemented by the Response Caching Middleware.
It’s an implementation of the RFC 9111 internet standard. The protocol specifies request and response headers (such as Cache-Control) for cache control and defines conditions under which a response can be cached by any node in the internet infrastructure, i.e., not only the web server but also the browser or intermediate nodes such as a CDN or a corporate proxy.
How to use Response Caching?
To use HTTP Response Caching in ASP.NET Core:
- Add the middleware service using the AddResponseCaching method:

  builder.Services.AddResponseCaching();

- Add the [ResponseCache] attribute to controller classes or methods:

  [ApiController]
  public class TimeController : ControllerBase
  {
      [Route( "api/[controller]" )]
      [HttpGet]
      // A Duration (in seconds) is required unless NoStore is true or Location is None.
      [ResponseCache( Duration = 10 )]
      public ContentResult GetTime() => Content( DateTime.Now.ToString() );
  }
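- Add the Response Caching Middleware to the request pipeline by calling UseResponseCaching; without this call, the [ResponseCache] attribute only emits cache headers and nothing is stored on the server:

  app.UseResponseCaching();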
These steps cause the HTTP response of the page or API call to be cached in memory. Additional settings allow the cached response to vary by specific HTTP headers or query string parameters. Please refer to the ASP.NET documentation for details.
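For example, the following sketch keeps a separate cache entry per value of a query string parameter. The ForecastController class and the city parameter are illustrative assumptions, not part of the original sample:

[ApiController]
[Route( "api/[controller]" )]
public class ForecastController : ControllerBase
{
    // Cache for 60 seconds and keep one cached copy per value of the "city" query string parameter.
    // VaryByQueryKeys requires the Response Caching Middleware to be enabled.
    [HttpGet]
    [ResponseCache( Duration = 60, VaryByQueryKeys = new[] { "city" } )]
    public ContentResult GetForecast( string city ) => Content( $"Forecast for {city}: sunny." );
}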
Limitations of Response Caching
The benefit of Response Caching is also its drawback: it respects the cache control HTTP headers sent by the client. However, giving the client control over the server’s performance can be dangerous. Additionally, some web frameworks like Blazor always set the Cache-Control header to no-cache, no-store, ultimately defeating this approach to caching.
Another limitation of Response Caching is that it only uses in-memory storage.
2. Output Caching
Output Caching is an alternative to response caching that does not depend on the HTTP headers set by the client. It offers storage options other than local memory, such as Redis. Additionally, Output Caching implements advanced features, such as protection against cache stampedes (the thundering herd problem) and cache revalidation, that go beyond the scope of this article.
- The first thing to do is to add the output caching service to the application builder. There are several approaches depending on the storage you want:

  - To use an in-memory store for your output cache, call the AddOutputCache() method:

    builder.Services.AddOutputCache();

  - Using a Redis cache is more challenging to configure, unless you let .NET Aspire do the wiring for you. In your app host, add a Redis component:

    builder.AddRedis("cache");

    Then, in your web app, call AddRedisOutputCache:

    builder.AddRedisOutputCache("cache");
- Then, call UseOutputCache on the application:

  var app = builder.Build();
  // ...
  app.UseOutputCache();
  // ...
- Enable output caching for your controller or page using one of the following techniques:

  - Call the CacheOutput() extension method after MapGet(...):

    app.MapGet(
            "/weatherforecast-cached",
            ( WeatherForecastService forecastService ) => forecastService.GetWeatherForecast() )
        .CacheOutput( policy => policy.Expire( TimeSpan.FromSeconds( 5 ) ) );

  - Add the [OutputCache] attribute to the controller class, controller method, or endpoint delegate:

    app.MapGet(
        "/weatherforecast-cached-attribute",
        [OutputCache( Duration = 5 )] ( WeatherForecastService forecastService ) => forecastService.GetWeatherForecast() );

  - Add the [OutputCache] attribute to the Razor page:

    @attribute [OutputCache( Duration = 5 )]
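Output Caching also supports invalidation through tags. The following is a minimal sketch of tag-based eviction; the /todos route, the TodoService class, the "todos" tag, and the invalidation endpoint are assumptions for illustration:

// Tag the cached output of this endpoint so that it can later be evicted as a group.
app.MapGet( "/todos", ( TodoService todoService ) => todoService.GetAllAsync() )
    .CacheOutput( policy => policy.Expire( TimeSpan.FromMinutes( 1 ) ).Tag( "todos" ) );

// Evict every cached entry tagged "todos", typically after a write operation.
app.MapPost( "/todos/invalidate", async ( IOutputCacheStore cacheStore, CancellationToken cancellationToken ) =>
    await cacheStore.EvictByTagAsync( "todos", cancellationToken ) );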
3. Caching Razor Tags
The first two approaches allowed you to cache the whole output of a page or API call. Instead of caching the whole page, you might want to cache just a part. In Razor, this is possible thanks to the <cache> tag helper.
For instance, consider a to-do list app displaying the weather forecast. This forecast varies only by the user’s hometown and is refreshed every 15 minutes. However, the content of the to-do list can change at any time upon user action. Therefore, we choose to cache the weather forecast component.
<cache vary-by="@Model.User.HomeTown" expires-after="@TimeSpan.FromMinutes(15)">
<!-- Weather rendering goes here. -->
</cache>
For more details regarding this technique, see cache tag helpers.
4. Caching the data source
The preceding techniques added caching to the very end of the server pipeline, by caching our response to the client. Suppose that several pages use the same data source and that every call to this data source is expensive. By adding caching to the calls to this data source, we would not only increase the performance of our own website but also save real money if we have to pay for that service per call.
.NET comes with two abstractions for caching:
- The IMemoryCache service allows for caching objects in local memory. The advantage of a local cache is the absence of serialization and deserialization.
- The IDistributedCache service lets your app share a cache across multiple instances of the app. Using this service requires serialization and out-of-process communication, usually over the network, so cache operations may be slower (see the registration sketch below).
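If you opt for IDistributedCache, you also need to register a backing store. Here is a minimal registration sketch for a Redis-backed distributed cache, assuming the Microsoft.Extensions.Caching.StackExchangeRedis package and a Redis instance at localhost:6379:

// Registers a Redis-backed implementation of IDistributedCache.
builder.Services.AddStackExchangeRedisCache( options =>
{
    options.Configuration = "localhost:6379"; // Connection string; an assumption for this sketch.
    options.InstanceName = "MyApp:";          // Optional key prefix isolating this app's entries.
} );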
Let’s see how we can cache an HTTP request using IMemoryCache:
- Add the IMemoryCache service to your app:

  builder.Services.AddMemoryCache();

- Where you need caching, call the GetOrCreateAsync extension method:

  public partial class WeatherApiClient( HttpClient httpClient, IMemoryCache cache )
  {
      private async Task<WeatherForecast[]> GetWeatherAsync(
          string endpoint,
          int maxItems,
          CancellationToken cancellationToken )
      {
          var forecast = await cache.GetOrCreateAsync(
              CacheKeyFactory.GetWeather( endpoint ),
              async entry =>
              {
                  // Expire the entry after 15 minutes so the forecast is eventually refreshed.
                  entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes( 15 );

                  return await httpClient.GetFromJsonAsync<WeatherForecast[]>( endpoint, cancellationToken );
              } );

          return forecast!.Take( maxItems ).ToArray();
      }
  }
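The distributed counterpart requires explicit serialization. Below is a minimal read-through sketch using IDistributedCache and System.Text.Json; the DistributedWeatherApiClient class and the 15-minute expiration are assumptions, not part of the original sample:

public class DistributedWeatherApiClient( HttpClient httpClient, IDistributedCache cache )
{
    public async Task<WeatherForecast[]?> GetWeatherAsync( string endpoint, CancellationToken cancellationToken )
    {
        var cacheKey = CacheKeyFactory.GetWeather( endpoint );

        // Try the distributed cache first; entries are stored as serialized strings.
        var cached = await cache.GetStringAsync( cacheKey, cancellationToken );

        if ( cached != null )
        {
            return JsonSerializer.Deserialize<WeatherForecast[]>( cached );
        }

        // On a cache miss, call the remote service and store the serialized result with an expiration.
        var forecast = await httpClient.GetFromJsonAsync<WeatherForecast[]>( endpoint, cancellationToken );

        if ( forecast != null )
        {
            await cache.SetStringAsync(
                cacheKey,
                JsonSerializer.Serialize( forecast ),
                new DistributedCacheEntryOptions { AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes( 15 ) },
                cancellationToken );
        }

        return forecast;
    }
}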
One of the challenges of caching is cache invalidation. For instance, if we cache the to-do list, we must not forget to remove the list from the cache whenever the list is modified. This raises the challenge of producing consistent cache keys: the cache key generated in the update method must exactly match the one used by the get method. This is why we moved the cache key generation logic to a CacheKeyFactory class, which both the update and the get methods call.
public static class CacheKeyFactory
{
public static string GetWeather(string endpoint) => $"{nameof(GetWeather)}({endpoint})";
public static string GetToDo(string endpoint, int id) => $"{nameof(GetToDo)}({endpoint}, {id})";
public static string GetToDoList(string endpoint) => $"{nameof(GetToDoList)}({endpoint})";
}
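With the keys centralized, manual invalidation amounts to calling Remove with the same factory methods. A minimal sketch of a hand-written update method follows; the TodoApiClient class, the endpoint layout, and the Todo.Id property are assumptions for illustration:

public class TodoApiClient( HttpClient httpClient, IMemoryCache cache )
{
    public async Task UpdateTodoAsync( string endpoint, Todo todo, CancellationToken cancellationToken )
    {
        await httpClient.PutAsJsonAsync( $"{endpoint}/{todo.Id}", todo, cancellationToken );

        // Evict the stale entries using exactly the same keys the get methods use.
        cache.Remove( CacheKeyFactory.GetToDo( endpoint, todo.Id ) );
        cache.Remove( CacheKeyFactory.GetToDoList( endpoint ) );
    }
}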
Still, implementing caching by hand results in boilerplate code that makes the business code harder to read. To avoid this repetitive work and keep your source clean, we can generate the caching code at build time.
5. Caching method results using aspects
Instead of writing caching code manually, you can use a special kind of custom attribute called an aspect to generate that code for you at build time. Tools that make this magic possible are called aspect-oriented frameworks. One of them, based on Roslyn, is Metalama.
Metalama comes with its own open-source caching library, which makes caching method return values a no-brainer.
To cache the return value of a method according to its parameters, just add the [Cache] attribute:
[Cache]
public async Task<IEnumerable<Todo>> GetTodosAsync(
[NotCacheKey] CancellationToken cancellationToken = default )
=> await db.Todos.ToListAsync( cancellationToken );
[Cache]
public async Task<Todo?> GetTodoAsync(
int id,
[NotCacheKey] CancellationToken cancellationToken = default )
=> await db.Todos.FindAsync( id );
To remove a method return value from the cache when an update method is executed, use the [InvalidateCache] attribute:
[InvalidateCache( nameof(this.GetTodosAsync) )]
public async Task<Todo> AddTodoAsync( Todo todo, CancellationToken cancellationToken = default )
{
var newEntry = db.Todos.Add( todo );
await db.SaveChangesAsync( cancellationToken );
return newEntry.Entity;
}
[InvalidateCache( nameof(this.GetTodosAsync), nameof(this.GetTodoAsync) )]
public async Task<bool> UpdateTodoAsync(
int id,
Todo todo,
CancellationToken cancellationToken = default )
{
var existingTodo = await this.GetTodoAsync( id, cancellationToken );
if ( existingTodo is null )
{
return false;
}
existingTodo.IsCompleted = todo.IsCompleted;
existingTodo.Title = todo.Title;
db.Todos.Update( existingTodo );
await db.SaveChangesAsync( cancellationToken );
return true;
}
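At runtime, the caching aspect needs the Metalama caching service to be registered in the dependency injection container. Here is a minimal sketch; the AddMetalamaCaching registration method is an assumption based on the Metalama Caching documentation, and the article linked below shows the full .NET Aspire wiring, including a Redis backend:

// Registers the Metalama caching service with its default in-memory backend.
// Note: the AddMetalamaCaching method name is an assumption; check the Metalama
// Caching documentation for the authoritative registration API.
builder.Services.AddMetalamaCaching();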
Metalama Caching offers the following benefits:
- Reduces the amount of repetitive code, saving you time and making your code cleaner and easier to read.
- Minimizes errors related to the consistent generation of cache keys.
- Provides robust support for cache invalidation.
- Offers several caching topologies, including in-memory, Redis, Redis with an in-memory L1, and more.
If you’re interested in learning more about Metalama Caching, read Simplify Your .NET Aspire Caching With Metalama.
Summary
Choosing the right level of caching can greatly improve the performance of your application and, at the same time, reduce its operating costs. However, implementing caching can have its own challenges. Doing cache invalidation properly is a notoriously hard problem. Open-source libraries like Metalama Caching can reduce the boilerplate involved with caching and improve its robustness.