
Sean G. Wright


Kentico 12: Design Patterns Part 12 - Database Query Caching Patterns

Cash, not cache

Photo by Anthony Abruzzo on Unsplash

Sites built with Kentico's Portal Engine technology got much of their performance from caching, which is baked into that technology ⚡.

When building Kentico 12 MVC sites, the responsibility for effective caching lands on the shoulders of the developers - but we can still leverage Kentico's APIs to help us accomplish this goal 👍.

There are many layers at which an application can leverage caching to increase responsiveness, including client-side caching and Output Caching.

In this post we will be looking specifically at database query caching.


What Are We Caching? Data, Context, and Validity

The Data: ObjectQuery and DocumentQuery

The most common scenario for caching data in our application is when we are retrieving information from the database.

Any time we are calling methods on one of Kentico's *Provider classes or working with an instance of ObjectQuery<T> or DocumentQuery<T>, we can use caching to save ourselves from having to go to the database again the next time the data is needed.
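
For context, here is what a typical uncached call of that kind looks like. ArticleProvider is the sort of generated provider class Kentico creates for a custom "Article" page type, so treat the specific names as placeholders for your own types:

// An uncached DocumentQuery - every call like this goes to the database
IEnumerable<Article> articles = ArticleProvider.GetArticles()
    .OnSite("sandbox")
    .Culture("en-us")
    .TopN(10)
    .OrderByDescending("DocumentPublishFrom")
    .TypedResult;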

The Context: Cache Item Name & Query Parameters

When we cache this data we are not just caching the data itself, but also the context under which the data was queried 🤔.

  • What is the name we are giving this specific cached data?
  • What were the general query parameters (site name, database identifiers, locale)?
  • Was Preview-mode enabled for the request (is unpublished content being queried)?
  • Was the query made during an anonymous request or a request for a known user?
  • If the request was made for a known user, was the data queried something only they should have access to?

If we don't encode this context in the data we are caching then a subsequent request that needs similar data could mistakenly respond with the incorrect cached data 😨.

So, we need to somehow store the cached data with a unique name and encode the context under which the data was queried.

The Validity: Cache Dependencies And Lifetime

The cached results are not eternally valid since they are simply a fast-access snapshot of a limited portion of the database, and the data in the database regularly changes.

We need to encode into the cache what database changes should invalidate the cached items, and we call these changes Cache Dependencies.
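
These dependencies are expressed as plain "dummy key" strings in formats that Kentico documents, for example "nodes|{siteName}|{className}|all" (any page of a class on a site) or "nodeid|{nodeId}" (a specific page). As a small illustration, with placeholder site and page type code names:

// Invalidated whenever any page of the given class changes on the given site
// ("sandbox" and "my.article" are placeholder code names)
CMSCacheDependency dependency = CacheHelper.GetCacheDependency(
    "nodes|sandbox|my.article|all");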

We also want to be able to give a general lifetime to the cached data, because there's no need to hold onto it in memory if it hasn't been queried recently.

This lifetime is typically specified in minutes or seconds.


How Are We Caching? CacheHelper.Cache()

Kentico provides a generic, type-safe method, with two overloads, on the CacheHelper class to help us cache data:

public static TData Cache<TData>(Func<TData> loadMethod, CacheSettings settings);

and

public static TData Cache<TData>(Func<CacheSettings, TData> loadMethod, CacheSettings settings);

When using CacheHelper.Cache() we will need to always supply 3 values (either to the Cache method or the CacheSettings parameter):

  1. Data: The delegate that returns data we want to cache.
  2. Context: The name (key) we want the cached value to have which includes the context under which it was queried.
  3. Validity: The names (keys) of the values that our cached value depends on and the lifetime of the cache.

The simplest use of this method can be seen in Kentico's documentation on Caching on MVC sites.

Below, I've annotated the example from the documentation:

public IEnumerable<Article> GetArticles(
    int count, 
    string culture, 
    string siteName)
{
    // ✅ The Data - DocumentQuery<Article>
    Func<IEnumerable<Article>> dataLoadMethod = () =>
        ArticleProvider.GetArticles()
            .OnSite(siteName)
            .Culture(culture)
            .TopN(count)
            .OrderByDescending("DocumentPublishFrom")
            .TypedResult;

    // ✅ The Validity - Lifetime
    int cacheMinutes = 10;

    var cacheSettings = new CacheSettings(
        cacheMinutes, 

        // ✅ The Context - Cache Item Name
        "myapp|data|articles", 

        // ✅ The Context - Query Parameters
        siteName, culture, count)
    {
        // ✅ The Validity - Dependencies
        GetCacheDependency = () =>
        {
            string dependencyCacheKey = String.Format("nodes|{0}|{1}|all", siteName, Article.CLASS_NAME);
            return CacheHelper.GetCacheDependency(dependencyCacheKey);
        }
    };

    return CacheHelper.Cache(dataLoadMethod, cacheSettings);
}

We can see that all the required pieces, which we previously outlined, are present - Data, Context, and Validity.

So this should be all we need, right?


Where Do We Cache?

The above example works perfectly when we need to only cache a few database queries, but it scales poorly across any reasonably sized application.

Why? 😕

A Simple Approach: Explicit, Repeated, Procedural Caching Calls

An application will typically have many methods that access the database, all of which need to be wrapped in a call to CacheHelper.Cache() with the correct CacheSettings.

Including the call to CacheHelper.Cache() in the same method that performs our database query makes it hard to apply consistent caching, since it is going to be repeated many times (at each database query call site).

This goes against the rule-of-thumb, Don't Repeat Yourself (DRY), which is a recommendation to not repeat concepts in your application (lines of code can be repeated if they don't represent the same concept).

It also violates the Single Responsibility Principle (SRP) by performing both data access and caching in the same place.

The violation of SRP will make testing more difficult and the method harder to reason about 😦.

It's also possible to see the data access code become intertwined with caching code, and as the two intermix they will tend to become more dependent on each other, making refactoring difficult.

We can adhere to SRP by applying caching as a layer across all methods that perform data access by leveraging Aspect Oriented Programming (AOP).

The Very Advanced Approach: Caching through Interception based AOP

In the DancingGoat sample MVC application, we see an interesting approach to solving these issues with the CachingRepositoryDecorator by using Aspect Oriented Programming (AOP) through Interception.

Here is a sample of the main Intercept method of that class:

public void Intercept(IInvocation invocation)
{
    if (!mCacheEnabled || 
        !invocation.Method.Name.StartsWith("Get", StringComparison.Ordinal))
    {
        invocation.Proceed();

        return;
    }

    Type returnType = invocation.Method.ReturnType;

    List<CacheDependencyAttribute> cacheDependencyAttributes = invocation
        .MethodInvocationTarget
        .GetCustomAttributes<CacheDependencyAttribute>()
        .ToList();

    if (cacheDependencyAttributes.Count > 0)
    {
        invocation.ReturnValue = GetCachedResult(
            invocation,
            GetDependencyCacheKeyFromAttributes(
                cacheDependencyAttributes, 
                invocation.Arguments));
    }
    else if (typeof(TreeNode).IsAssignableFrom(returnType))
    {
        invocation.ReturnValue = GetCachedResult(
            invocation, 
            GetDependencyCacheKeyForPage(returnType));
    }

    // Continues with various use-cases
}

AOP is a pattern I really love - it helps centralize Cross-Cutting Concerns, like logging and caching, which keeps them out of our business logic. This in turn helps us ensure our code follows the SRP and many of the other SOLID principles 🤓.

However, I'm not a fan of the way it's applied in the DancingGoat code base 😒.

The CachingRepositoryDecorator relies on IInvocation, from the Castle.Core package:

public interface IInvocation
{
    object[] Arguments { get; }
    Type[] GenericArguments { get; }
    object InvocationTarget { get; }
    MethodInfo Method { get; }
    MethodInfo MethodInvocationTarget { get; }
    object Proxy { get; }
    object ReturnValue { get; set; }
    Type TargetType { get; }

    object GetArgumentValue(int index);
    MethodInfo GetConcreteMethod();
    MethodInfo GetConcreteMethodInvocationTarget();
    void Proceed();
    void SetArgumentValue(int index, object value);
}

This is an example of AOP through Interception.

We are applying our Aspect (Caching) not across common contracts (interfaces) and known types (generic type constraints), but by intercepting all calls and executing conditional logic based on matches against conventions or runtime parameters.

In the CachingRepositoryDecorator.Intercept() method we are only interested in methods with names that start with the characters "Get", case-sensitive, and have a special CacheDependencyAttribute applied or have a return type of one of several variations of TreeNode or BaseInfo.

The Inversion of Control Container registration for this decorator class makes sure we only intercept classes implementing IRepository, but since this is only a marker interface (it has no methods or properties), it can't help us know anything about our intercepted method call at runtime.
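
For reference, that marker interface is essentially just an empty contract, which is why it can't tell the interceptor anything useful:

// A marker interface along the lines of DancingGoat's IRepository - no members,
// so it only identifies which classes should be intercepted
public interface IRepository
{
}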

Using the IInvocation interface isn't all that enjoyable and requires digging through a lot of reflection APIs 🙁.

The way CachingRepositoryDecorator builds up the Context and Validity cache keys to store with the cached data is pretty difficult to understand and predict just by viewing the code.

Looking at a method that will be intercepted, it's hard to describe what the cache keys will be, both for the cache item name and dependencies.

So, where do we go to find out what cache keys are generated for our CacheHelper.Cache() API calls at runtime 😖?

When adding a new "Get" method to a repository, ensuring it is cached correctly is going to require some manual testing or maybe new tests of CachingRepositoryDecorator.

This seems... off to me, since the thing that should be tested when adding a new method to a Repository is the Repository itself, not a layer the Repository shouldn't know about (caching).

While (Dynamic) Interception, as a pattern, is extremely powerful and can enable things that would otherwise be impossible, it's a big hammer 🔨 to swing at a problem that C#, with its robust type system, and an Inversion of Control (IoC) Container like Autofac can solve for free.

My Preferred Approach: Caching through Dependency Injection (DI) Decoration based AOP

It turns out that we already have the tools in our application to enable a simpler and friendlier means of applying caching to our application through AOP 😯.

DI Decoration requires that a type be transparently wrapped, at runtime, in another type that implements one of its contracts (interfaces).

This allows the wrapping type to either handle a method call itself, forward the call to the wrapped type, or some combination of both.

Here is how we might implement DI Decoration, drawing on our earlier simple caching example:

// The Contract that both the wrapping type
// and the wrapped type must implement
public interface IArticleRepository
{
    IEnumerable<Article> GetArticles(int count, string culture, string siteName);
}

// The wrapped type that performs the data access logic
public class ArticleRepository : IArticleRepository
{
    public IEnumerable<Article> GetArticles(
        int count, 
        string culture, 
        string siteName)
    {
        // Call to ArticleProvider.GetArticles()
        // No caching is performed here (SRP!)

        return articles;
    }
}

// The wrapping type that performs the caching logic
public class ArticleRepositoryCacheDecorator : IArticleRepository
{
    // This is an instance of the above ArticleRepository at runtime
    private readonly IArticleRepository repo;

    public ArticleRepositoryCacheDecorator(
        IArticleRepository repo) => this.repo = repo;

    public IEnumerable<Article> GetArticles(
        int count, 
        string culture, 
        string siteName)
    {
        // Forward the call to the original ArticleRepository
        // No data access here (SRP!)
        Func<IEnumerable<Article>> dataLoadMethod = () =>
            repo.GetArticles(count, culture, siteName);

        var cacheSettings = new CacheSettings(
            10, "myapp|data|articles", siteName, culture, count)
        {
            GetCacheDependency = () =>
            {
                string dependencyCacheKey = String.Format(
                    "nodes|{0}|{1}|all", siteName, Article.CLASS_NAME);
                return CacheHelper.GetCacheDependency(dependencyCacheKey);
            }
        };

        return CacheHelper.Cache(dataLoadMethod, cacheSettings);
    }
}

Since we are performing DI Decoration instead of Convention-based Interception, we have access to all the original parameter names and types of the decorated class's methods.

  • ✅ This should be much easier to understand and test.
  • ✅ Changes to the ArticleRepository will be enforced on the ArticleRepositoryCacheDecorator by way of the C# type system.
  • ✅ The separation of Data Access and Caching is very pronounced.
  • ✅ Specifying which types should be decorated can be configured through our IoC Container.

In my opinion, this is a much more elegant and verifiable approach to AOP.
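
To make that last point concrete, here is a rough sketch of how the decoration could be registered with Autofac (assuming Autofac 4.9 or later, which has built-in decorator support; the type names mirror the example above):

var builder = new ContainerBuilder();

// Register the data access implementation against its contract
builder.RegisterType<ArticleRepository>()
    .As<IArticleRepository>()
    .InstancePerLifetimeScope();

// Every resolved IArticleRepository is transparently wrapped in the caching decorator
builder.RegisterDecorator<ArticleRepositoryCacheDecorator, IArticleRepository>();

var container = builder.Build();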

If you like the above approach more than digging into .NET's Reflection APIs, you might wonder why Kentico chose to do Interception in the DancingGoat application 🤔.

The answer comes from the data-access layer architecture, which is Repository based.

The Problem With Repositories and DI Decoration AOP

I mention in a previous post in this series that Repositories tend to be bags of methods.

These methods do not typically implement a contract that matches the patterns of any other Repository.

In fact, there is no good way, using the type system and generics, to require that the methods of one Repository match the names, parameters, and return types of the methods in another Repository, short of having them all implement a common interface.

Here is a quick example of the problem:

public interface IArticleRepository
{
    IEnumerable<Article> GetArticles(int count);
}

public interface ICoffeeRepository
{
    IEnumerable<Coffee> GetCoffees(IRepositoryFilter filter, int count);
}

public interface ICafeRepository
{
    IEnumerable<Cafe> GetPartnerCafes();
}

All of these Repository interfaces have different method names, different method signatures, and any number of other unrelated methods.

This freedom in method signatures, which is common in Repositories, forces us to use Interception based AOP.

DI based AOP requires consistent method signatures across all types that need to be decorated, and Repositories aren't the type of architectural pattern that results in consistent method signatures across different Repositories.

So, it appears we are at an impasse 😀.

Or are we ... ?

My recommendation: Get rid of the repositories 😅.

They don't work with our goals for caching and logging, and the abstraction they provide already exists in our application as the Object Relational Mapping (ORM) layer (DocumentHelper and the *InfoProvider classes in Kentico), so let's not create two layers of abstraction with the same pattern.

If you would like to read more about the differences between DI Decoration and Dynamic Interception, check out the blog post Why choose DI interception over aspect oriented programming?

The Solution: The Query, QueryHandler Pattern

What we effectively want is a single method for all database querying.

If we have only one method that processes all database queries, then we only have one method to decorate through DI to apply our caching.

But, if we only have a single method signature to work with, how can it handle all the various queries we want to perform on the database?

Well, we can leverage C# generics which provide a way to vary our method parameters and return types but keep the method signature consistent.

Let's look at the solution:

public interface IQuery<TResult>
{
}

public interface IQueryHandler<TQuery, TResult> 
    where TQuery : IQuery<TResult>
{
    TResult Execute(TQuery query);
}

Tada 👏🤘!

We now have an IQuery interface, which is generic on the type that is returned for a given query and where we will put all of our parameters we want to query the database with.

We also have an IQueryHandler interface, which is generic on both the query type it handles and the return type for that specific query.

Every data access method call will be a call to Execute() for some IQuery<TResult> with a TResult return type.

Check out this StackOverflow answer, from Steven van Deursen, detailing how this pattern works in practice.

There is a helpful piece in the Query, QueryHandler relationship called the Dispatcher. I would recommend using one if you choose to go down this architectural path, but it is not required.

You can take a look at the source code for MediatR, which is Jimmy Bogard's often-used implementation of the Mediator design pattern (but without the data-access specific Query semantics), if you want an example implementation of all 3 pieces (Query, QueryHandler, Dispatcher).
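
To give a flavor of that third piece, here is a minimal sketch of what a dispatcher could look like. The IQueryDispatcher name and the use of Autofac's IComponentContext here are my own illustration, not part of Kentico's API:

public interface IQueryDispatcher
{
    TResult Dispatch<TResult>(IQuery<TResult> query);
}

public class QueryDispatcher : IQueryDispatcher
{
    private readonly IComponentContext container;

    public QueryDispatcher(IComponentContext container) =>
        this.container = container;

    public TResult Dispatch<TResult>(IQuery<TResult> query)
    {
        // Close the open generic handler type over the runtime query type
        Type handlerType = typeof(IQueryHandler<,>)
            .MakeGenericType(query.GetType(), typeof(TResult));

        // Resolve the handler (plus any registered decorators) and execute it
        dynamic handler = container.Resolve(handlerType);

        return handler.Execute((dynamic)query);
    }
}

Controllers then depend only on IQueryDispatcher and never need to know which concrete handler (or decorator chain) services a given query.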

Let's look at an example implementation using our original ArticleRepository.GetArticles() method:

public class ArticlesQuery : IQuery<IEnumerable<Article>>
{
    public int Count { get; set; }
    public string SiteName { get; set; }
    public string Culture { get; set; }
}

public class ArticlesQueryHandler : IQueryHandler<ArticlesQuery, IEnumerable<Article>>
{
    public IEnumerable<Article> Execute(ArticlesQuery query)
    {
        return ArticleProvider.GetArticles()
            .OnSite(query.SiteName)
            .Culture(query.Culture)
            .TopN(query.Count)
            .OrderByDescending("DocumentPublishFrom")
            .TypedResult;
    }
}

Now, let's see how we might use these:

var query = new ArticlesQuery
{
    Count = 10,
    Culture = "en-us",
    SiteName = "sandbox"
};

// This should be supplied through Dependency Injection
// as IQueryHandler<ArticlesQuery, IEnumerable<Article>>

var handler = new ArticlesQueryHandler();

var articles = handler.Execute(query);

Ok, now we can apply caching - not just to this query handler, but to all query handlers in our application!

But wait, a quick digression before we get there 😋!

Generating Cache Keys With IQueryCacheKeysCreator

The DancingGoat code base generates cache item name keys through the following logic (edited for brevity):

    var builder = new StringBuilder(127);

    ...

    foreach (var value in invocation.Arguments)
    {
        builder.AppendFormat(
            CultureInfo.InvariantCulture, 
            "|{0}", 
            GetArgumentCacheKey(value));
    }

...

private string GetArgumentCacheKey(object argument)
{
    if (argument == null)
    {
        return string.Empty;
    }

    var keyArgument = argument as ICacheKey;
    if (keyArgument != null)
    {
        return keyArgument.GetCacheKey();
    }

    return argument.ToString();
}

And it generates the cache dependency keys by looking only at the intercepted method return type or CacheDependencyAttributes which might be on the method returning data to be cached.

This was the logic that felt really opaque to me initially 😣, so let's look at a different way to generate these keys, leveraging the C# type system again and our IQuery, IQueryHandler types.

public interface IQueryCacheKeysCreator<TQuery, TResult> 
    where TQuery : IQuery<TResult>
{
    string[] DependencyKeys(TQuery query, TResult result);
    object[] ItemNameParts(TQuery query);
}

This interface flows the types of our Query implementation through all the method calls and can be implemented for each Query type in a unique and type-safe way 👍.

Here is an example for our ArticlesQuery:

public class ArticlesQueryCacheKeysCreator :
    IQueryCacheKeysCreator<ArticlesQuery, IEnumerable<Article>>
{
    public string[] DependencyKeys(ArticlesQuery query, IEnumerable<Article> result) =>
        new[]
        {
            $"nodes|{query.SiteName}|{Article.CLASS_NAME}|all"
        };

        // or, if it's appropriate for the kind of query we are executing,
        // and since we have access to the query result:
        //
        // result.Select(a => $"nodeid|{a.NodeID}").ToArray();

    public object[] ItemNameParts(ArticlesQuery query) =>
        new [] 
        { 
            "myapp|data|articles", 
            query.SiteName, 
            query.Culture, 
            query.Count.ToString() 
        };
}

I normally define the IQuery implementation in the same file as the IQueryCacheKeysCreator. I like to think of the IQueryCacheKeysCreator as a more flexible alternative to annotating the IQuery implementation class with an attribute, so I don't feel a need to separate them.

If you don't want to supply a bunch of "Ambient Context" (HttpContext-based) runtime data in the ArticlesQuery (like siteName, culture, or "Is Preview Enabled"), you could supply these values through DI to the ArticlesQueryCacheKeysCreator.

This is what I do when using this pattern.
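
As a rough sketch, that variation might look like the following, where ISiteContext is a hypothetical abstraction over the ambient request data (not a Kentico type) that is registered with the container:

public interface ISiteContext
{
    string SiteName { get; }
    string Culture { get; }
    bool IsPreviewEnabled { get; }
}

public class ArticlesQueryCacheKeysCreator :
    IQueryCacheKeysCreator<ArticlesQuery, IEnumerable<Article>>
{
    private readonly ISiteContext siteContext;

    public ArticlesQueryCacheKeysCreator(ISiteContext siteContext) =>
        this.siteContext = siteContext;

    public string[] DependencyKeys(ArticlesQuery query, IEnumerable<Article> result) =>
        new[]
        {
            $"nodes|{siteContext.SiteName}|{Article.CLASS_NAME}|all"
        };

    // The ambient values still become part of the cache item name, so Preview mode
    // and non-default cultures never collide with published, default-culture data
    public object[] ItemNameParts(ArticlesQuery query) =>
        new object[]
        {
            "myapp|data|articles",
            siteContext.SiteName,
            siteContext.Culture,
            siteContext.IsPreviewEnabled,
            query.Count
        };
}

The ArticlesQuery itself then only needs to carry the values a caller actually varies (here, just Count).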

DI based Decoration With Query / QueryHandler

Now we can finally define our single, simple, and elegant caching decorator class for all IQueryHandler instances.

public class QueryHandlerCacheDecorator<TQuery, TResult> 
    : IQueryHandler<TQuery, TResult> 
    where TQuery : IQuery<TResult>
{
    private readonly IQueryHandler<TQuery, TResult> handler;
    private readonly IQueryCacheKeysCreator<TQuery, TResult> cacheKeysCreator;

    public QueryHandlerCacheDecorator(
        IQueryHandler<TQuery, TResult> handler,
        IQueryCacheKeysCreator<TQuery, TResult> cacheKeysCreator)
    {
        this.handler = handler;
        this.cacheKeysCreator = cacheKeysCreator;
    }

    public TResult Execute(TQuery query) =>
        CacheHelper.Cache(
            (cacheSettings) => 
            {
                TResult result = handler.Execute(query);

                if (cacheSettings.Cached)
                {
                    cacheSettings.GetCacheDependency = () =>
                        CacheHelper.GetCacheDependency(
                            cacheKeysCreator.DependencyKeys(query, result));
                }

                return result;
            },
            new CacheSettings(
               cacheMinutes: 10,
               useSlidingExpiration: true,
               cacheItemNameParts: cacheKeysCreator.ItemNameParts(query)));
}

I'd like to think this is much more readable, testable, simpler, and type-safe than the Interception based Caching AOP implementation 🤗.

The instances of IQueryCacheKeysCreator are extremely testable, and any changes to existing implementations or new implementations of IQuery only require tests for those types 😀.
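
For example, a test for the ArticlesQueryCacheKeysCreator defined earlier (the version without injected ambient context) needs nothing more than a query and a few assertions. Here is a sketch, assuming xUnit:

public class ArticlesQueryCacheKeysCreatorTests
{
    [Fact]
    public void ItemNameParts_Includes_The_Query_Parameters()
    {
        var query = new ArticlesQuery
        {
            Count = 10,
            Culture = "en-us",
            SiteName = "sandbox"
        };
        var creator = new ArticlesQueryCacheKeysCreator();

        object[] parts = creator.ItemNameParts(query);

        Assert.Contains("sandbox", parts);
        Assert.Contains("en-us", parts);
    }
}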

The implementation of QueryHandlerCacheDecorator is so simple that you can write tests for it once and know that there aren't corner cases that might appear only after building out additional queries later on.
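
Registration can also live in one place. Here is a rough sketch with Autofac (again assuming 4.9+ for generic decorator support, and that the handlers and cache keys creators live in the same assembly):

var builder = new ContainerBuilder();

var assembly = typeof(ArticlesQueryHandler).Assembly;

// Register every query handler by its closed IQueryHandler<,> interface
builder.RegisterAssemblyTypes(assembly)
    .AsClosedTypesOf(typeof(IQueryHandler<,>))
    .InstancePerLifetimeScope();

// Register every cache keys creator the decorator depends on
builder.RegisterAssemblyTypes(assembly)
    .AsClosedTypesOf(typeof(IQueryCacheKeysCreator<,>))
    .InstancePerLifetimeScope();

// Wrap every handler in the caching decorator
builder.RegisterGenericDecorator(
    typeof(QueryHandlerCacheDecorator<,>),
    typeof(IQueryHandler<,>));

var container = builder.Build();

Note that this sketch assumes every query has a matching IQueryCacheKeysCreator implementation; if some queries should never be cached, the decorator (or the registration) would need to treat the creator as optional.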


One caveat I feel I should mention is that my recommended approach results in more classes and, generally, more code.

Repositories: 1 interface, 1 implementation.

Query/QueryHandler: (1 query class, 1 query cache class, 1 handler class) x number of methods normally in a Repository.

But I see this happen pretty regularly when I move from a "bags of methods"/"god class" design to one emphasizing SRP and SOLID principles.

That said, I welcome feedback and recommendations!

Summary

Our ability, as Kentico 12 MVC developers, to effectively cache data returned from the database is a key component of building a well-performing and scalable content delivery application.

The Kentico caching APIs provide us with the core calls to supply the 3 parts required when caching data - the Data, Context, and Validity 🧐.

However, these same caching APIs are very low level and we need additional architecture to scale their use across our code base while maintaining the Single Responsibility Principle (SRP) and the mantra of Don't Repeat Yourself (DRY).

Kentico provides an example of this architecture in their DancingGoat code base using Aspect Oriented Programming (AOP), something we developers should emulate.

Unfortunately (in my opinion!) its reliance on the Repository pattern forces it to use Interception instead of Dependency Injection (DI) Decoration 😔.

By replacing the Repository pattern with the Query and QueryHandler types, we can apply Caching to our database queries through DI Decoration.

The resulting code is clearer and more testable, allowing more flexibility for each query to apply its own means of providing Context and Validity to the caching API calls ⚡.

I haven't covered every call to the IoC Container (Autofac in my case) needed to register these types beyond the rough sketches above, but I'm happy to give more complete code examples if anyone is interested.

I'd love to hear your thoughts on these ideas, so leave a comment below if you'd like to share them.

Thanks for reading! 🙏


If you are looking for additional Kentico content, check out the Kentico tag here on DEV:

#kentico

Or my Kentico blog series:

Top comments (4)

peterotmar

Hi Sean
I've been following your articles as we are moving to K13. Great stuff, thank you.
I did try github.com/wiredviews/xperience-co..., however, the sample app crashes.
Also, you would not have a version for .net core 3.1 of the cqrs lib handy by any chance, would you?

Thanks
Peter

Sean G. Wright

Hey Peter!

I'm glad my articles are helpful!

That specific library is still a work-in-progress. The pattern absolutely works (we are using it in production on many sites), however our implementation is using internal libraries which I've been working on translating to the CQRS library over time.

I plan on making some more updates soon and writing a blog post about the library, so keep an eye out for that. I will publish preview NuGet packages when it's ready to test.

I don't plan on making a .NET Core 3.1 version of the library since Kentico Xperience 13 runs fine on .NET 5 and I would like to encourage developers to start their sites running on .NET 5. There's also some language features in C#9 that I've come to find very helpful and C#9 isn't supported on .NET Core 3.1 (though some have gotten it to work).

manjusaharan

Hi,
This is really great article. Do you have any sample code in git repo that we can refer to implement it.
Thanks in advance.

Sean G. Wright

Hey, thanks for the compliment!

I've been working on an open source implementation here for Kentico Xperience 13

github.com/wiredviews/xperience-co...