
Caching Overview

RCommon provides a provider-agnostic caching abstraction built around a single interface: ICacheService. The same interface is implemented by in-process memory caching, distributed memory caching, and Redis. Application code that depends only on ICacheService can switch caching backends without any handler or service changes.

ICacheService

ICacheService exposes a get-or-create pattern. If the requested key exists in the cache the cached value is returned immediately. If the key is absent the factory delegate is invoked, the result is stored in the cache, and the value is returned.

public interface ICacheService
{
    TData GetOrCreate<TData>(object key, Func<TData> data);
    Task<TData> GetOrCreateAsync<TData>(object key, Func<TData> data);
}

All implementations — InMemoryCacheService, DistributedMemoryCacheService, and RedisCacheService — implement this interface. The provider is swapped by changing the builder call in startup; consuming code does not change.
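A minimal consumption sketch of the get-or-create pattern. `OrderService`, `GetOrdersAsync`, `LoadOrdersFromDatabase`, and the `Order` type are hypothetical names for illustration; only `ICacheService` and `CacheKey` come from RCommon:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using RCommon.Caching;

public class OrderService
{
    private readonly ICacheService _cache;

    public OrderService(ICacheService cache) => _cache = cache;

    // Returns the cached list if the key is present; otherwise the
    // factory runs, its result is stored, and that result is returned.
    public Task<IReadOnlyList<Order>> GetOrdersAsync(Guid tenantId) =>
        _cache.GetOrCreateAsync(
            CacheKey.With(typeof(OrderService), "orders", tenantId.ToString()),
            () => LoadOrdersFromDatabase(tenantId));

    // Hypothetical data access; stands in for your repository call.
    private IReadOnlyList<Order> LoadOrdersFromDatabase(Guid tenantId) =>
        Array.Empty<Order>();
}

public record Order(Guid Id);
```

Because the class depends only on `ICacheService`, this code runs unchanged against any of the three providers.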

CacheKey

CacheKey is a strongly-typed wrapper around a cache key string. It validates that the key is non-empty and does not exceed 256 characters. Use the factory methods to build composite keys:

using RCommon.Caching;

// Simple key from string parts
var orderKey = CacheKey.With("orders", orderId.ToString());

// Scoped to a type
var orderListKey = CacheKey.With(typeof(OrderService), "list", tenantId.ToString());
// Produces: "OrderService:list-<tenantId>"

Passing a CacheKey as the key argument to ICacheService methods ensures consistent key formatting across the application.

When to use caching

Use ICacheService when:

  • A query or computation is expensive and the result changes infrequently (reference data, configuration, aggregated reports).
  • You want to protect a downstream service or database from repeated identical requests.
  • You are caching dynamically compiled expressions or reflection results to improve performance (RCommon uses this internally for repository expression caching).

Do not use caching when:

  • The data must always be fresh and stale reads are unacceptable.
  • The data is user-specific and not shared across requests (unless you scope the cache key per user).
  • The cached value is trivially cheap to compute.
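The per-user caveat above can be handled by folding the user's identity into the key. A sketch, where `UserPreferencesService`, `GetPreferences`, `LoadPreferences`, and `UserPreferences` are hypothetical names:

```csharp
using System;
using RCommon.Caching;

public class UserPreferencesService
{
    private readonly ICacheService _cache;

    public UserPreferencesService(ICacheService cache) => _cache = cache;

    public UserPreferences GetPreferences(Guid currentUserId)
    {
        // Scoping the key per user keeps one user's cached entry from
        // being served to another user.
        var key = CacheKey.With(typeof(UserPreferencesService),
            "preferences", currentUserId.ToString());

        return _cache.GetOrCreate(key, () => LoadPreferences(currentUserId));
    }

    // Hypothetical loader; stands in for your repository call.
    private UserPreferences LoadPreferences(Guid userId) => new();
}

public record UserPreferences();
```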

Expression caching

RCommon uses caching internally to compile and cache LINQ expression trees that would otherwise be recompiled on every repository call. Enabling CacheDynamicallyCompiledExpressions on either the in-memory or Redis builder activates this optimization:

builder.Services.AddRCommon()
    .WithMemoryCaching<InMemoryCachingBuilder>(cache =>
    {
        cache.CacheDynamicallyCompiledExpressions();
    });

This is the recommended minimum caching configuration for applications that use the persistence layer.

Choosing a caching provider

| Provider | Package | Backing store | Cross-process |
| --- | --- | --- | --- |
| In-memory | RCommon.MemoryCache | IMemoryCache | No |
| Distributed memory | RCommon.MemoryCache | IDistributedCache (in-process) | No |
| Redis | RCommon.RedisCache | StackExchange.Redis | Yes |

Use the in-memory provider for single-process applications or development environments. Use Redis when you need a cache that is shared across multiple instances of the same service.
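Because consuming code depends only on ICacheService, switching providers is a startup-only change. A sketch of the two registrations; the Redis call composes `WithDistributedCaching<T>` with the `RedisCachingBuilder` named in the builder hierarchy, and connection configuration is omitted here (see the Redis Cache page):

```csharp
// Development / single instance: in-process memory cache
builder.Services.AddRCommon()
    .WithMemoryCaching<InMemoryCachingBuilder>(cache =>
    {
        cache.CacheDynamicallyCompiledExpressions();
    });

// Production / multiple instances: Redis-backed distributed cache.
// Redis connection settings are configured separately; this sketch
// assumes the expression-caching call is available on both builders,
// as described in the Expression caching section above.
builder.Services.AddRCommon()
    .WithDistributedCaching<RedisCachingBuilder>(cache =>
    {
        cache.CacheDynamicallyCompiledExpressions();
    });
```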

Builder hierarchy

IRCommonBuilder
    .WithMemoryCaching<T>()      → IMemoryCachingBuilder
    .WithDistributedCaching<T>() → IDistributedCachingBuilder

IMemoryCachingBuilder and IDistributedCachingBuilder are marker interfaces. Provider packages implement them with concrete builder types (InMemoryCachingBuilder, DistributedMemoryCacheBuilder, RedisCachingBuilder) that expose provider-specific extension methods.

Section contents

  • Memory Cache — In-process memory caching setup, configuration options, eviction, and expression caching
  • Redis Cache — Redis setup, connection string configuration, distributed caching, and expression caching