How Data Protection APIs (DPAPI) Enhance Security in Your CMS

18/03/2025

Securing sensitive data is essential in modern CMS platforms. ASP.NET Core provides the Data Protection API (DPAPI) to help simplify encryption, decryption, and key management. DPAPI is used in many features, including ASP.NET authentication, so even if you're not using it directly, it is important to understand the basics.

What actually is DPAPI? 🤔

DPAPI is a set of built-in APIs that allow developers to protect sensitive short-lived data without having to implement their own cryptographic algorithms. It follows best practices, offers cross-platform support, and provides flexible storage options (which we will cover later).

I used the term short-lived because the default key expiration timeframe is 90 days. Although this number is configurable, I wouldn't recommend extending it: the limit exists to enforce regular key rotation and maintain security.

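If you do need a different lifetime, it can be adjusted when registering the services. A minimal sketch, assuming the standard service setup from the sample below; note that the Data Protection system rejects lifetimes shorter than 7 days:

```csharp
using System;
using Microsoft.AspNetCore.DataProtection;
using Microsoft.Extensions.DependencyInjection;

var services = new ServiceCollection();

services
    .AddDataProtection()
    // shorten the key lifetime from the 90-day default;
    // values below 7 days are rejected by the key management options
    .SetDefaultKeyLifetime(TimeSpan.FromDays(30));
```
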
Microsoft's own documentation provides a basic example taking user input from a console application, encrypting it, and finally decrypting the value again.

using System;
using Microsoft.AspNetCore.DataProtection;
using Microsoft.Extensions.DependencyInjection;

public class Program
{
    public static void Main(string[] args)
    {
        // add data protection services
        var serviceCollection = new ServiceCollection();
        serviceCollection.AddDataProtection();
        var services = serviceCollection.BuildServiceProvider();

        // create an instance of MyClass using the service provider
        var instance = ActivatorUtilities.CreateInstance<MyClass>(services);
        instance.RunSample();
    }

    public class MyClass
    {
        IDataProtector _protector;

        // the 'provider' parameter is provided by DI
        public MyClass(IDataProtectionProvider provider)
        {
            _protector = provider.CreateProtector("Contoso.MyClass.v1");
        }

        public void RunSample()
        {
            Console.Write("Enter input: ");
            string input = Console.ReadLine();

            // protect the payload
            string protectedPayload = _protector.Protect(input);
            Console.WriteLine($"Protect returned: {protectedPayload}");

            // unprotect the payload
            string unprotectedPayload = _protector.Unprotect(protectedPayload);
            Console.WriteLine($"Unprotect returned: {unprotectedPayload}");
        }
    }
}

/*
 * SAMPLE OUTPUT
 *
 * Enter input: Hello world!
 * Protect returned: CfDJ8ICcgQwZZhlAlTZT...OdfH66i1PnGmpCR5e441xQ
 * Unprotect returned: Hello world!
 */

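One detail worth calling out in the sample above is the purpose string passed to CreateProtector. Protectors created with different purpose strings are cryptographically isolated from each other, so a payload protected for one purpose cannot be unprotected by another. A quick sketch (the purpose names are illustrative):

```csharp
// 'provider' is an IDataProtectionProvider resolved from DI, as in the sample above
var protectorA = provider.CreateProtector("Contoso.MyClass.v1");
var protectorB = provider.CreateProtector("Contoso.AnotherPurpose.v1");

string payload = protectorA.Protect("secret");

protectorA.Unprotect(payload); // round-trips to "secret"
protectorB.Unprotect(payload); // throws CryptographicException
```

This is why the purpose string is usually versioned ("v1"): bumping it invalidates all previously protected payloads for that component.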
Who remembers MachineKey? 👴

If you've worked with older .NET Framework applications, you may remember MachineKey. DPAPI is a direct replacement for it.

In older .NET Framework systems, MachineKey was configured in web.config to manage encryption keys. Keys had to be manually synced across servers, and there was no automatic key rotation. MachineKey also relied on older encryption algorithms, making it less secure than DPAPI.

Why is it important to know the basics?

While working on a recent project in Xperience by Kentico, I encountered an issue where email validation links for a registration process were sometimes failing.

The majority of users successfully submitted the registration form, received the validation email, and clicked the link to confirm their accounts. However, a small subset of users encountered an error when clicking the link. How frustrating! 😢

What went wrong?

The site had been running perfectly on a single server setup, but the problems began after it was moved to a load-balanced environment. The keys used to encrypt the email validation tokens were not shared across the two servers.

If a user registered on Server 1 but then clicked the email link and was redirected to Server 2, Server 2 didn't have access to the same encryption keys, causing the validation to fail.

Each server (Server 1 and Server 2) manages its own encryption keys locally.

Flexible storage options for keys

The solution is to configure a persistent shared storage for encryption keys across all servers.

Both servers (Server 1 and Server 2) share encryption keys from a shared storage.

By default, ASP.NET Core stores keys in a folder on the machine, or in memory. This is fine for local development, but for production you will want a more robust storage solution.

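If you want to see which keys an application currently holds (and when they expire), the key ring can be inspected through IKeyManager. A small sketch, assuming the same ServiceCollection setup as the earlier sample:

```csharp
using System;
using Microsoft.AspNetCore.DataProtection.KeyManagement;
using Microsoft.Extensions.DependencyInjection;

// 'services' is the built ServiceProvider from the earlier sample
var keyManager = services.GetRequiredService<IKeyManager>();

foreach (var key in keyManager.GetAllKeys())
{
    // each key carries its own creation, activation, and expiration dates
    Console.WriteLine($"{key.KeyId}: created {key.CreationDate:d}, expires {key.ExpirationDate:d}");
}
```
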
Here are the most common options:

Option 1 - File System

The first option is file system storage, which gives you full control over where keys are stored. I recommend configuring this as an absolute minimum, so you always know where your keys live. It is also useful when servers share a network drive.

services
    .AddDataProtection()
    .SetApplicationName("MyCMS")
    .PersistKeysToFileSystem(new DirectoryInfo(@"/keys"));

Option 2 - Azure Key Vault/Blob Storage

If you're using Azure, you can store encryption keys in Azure Blob Storage using the Azure.Extensions.AspNetCore.DataProtection.Blobs package, and additionally encrypt the keys at rest with Azure Key Vault.

services
    .AddDataProtection()
    .SetApplicationName("MyCMS")
    .PersistKeysToAzureBlobStorage(new Uri(<BlobStorageUri>), new DefaultAzureCredential())
    .ProtectKeysWithAzureKeyVault(new Uri(<KeyVaultUri>), new DefaultAzureCredential());

Option 3 - Redis Cache

If you're already using Redis for a distributed cache, you can store encryption keys there as well, using the Microsoft.AspNetCore.DataProtection.StackExchangeRedis package.

var redis = ConnectionMultiplexer.Connect("<RedisUri>");
services
    .AddDataProtection()
    .SetApplicationName("MyCMS")
    .PersistKeysToStackExchangeRedis(redis, "DataProtection-Keys");

Xperience by Kentico

If you're using Kentico's SaaS platform, you don't have to worry about key storage! Kentico automatically handles storage for you. 😎

In February 2025's refresh, Kentico added a new Data Protection extension method which does all of the configuration for you. 👏

If you're updating an existing project, make sure you include the following to enable it:

using Kentico.Xperience.Cloud;

// ...

WebApplicationBuilder builder = WebApplication.CreateBuilder(args);

// ...

// Enables the DataProtection API in all SaaS environments
if (builder.Environment.IsQa() || builder.Environment.IsUat() || builder.Environment.IsProduction() || builder.Environment.IsEnvironment(CloudEnvironments.Custom))
{
    // ...

    builder.Services.AddXperienceCloudDataProtection(builder.Configuration);
}

Umbraco

If you're using Umbraco, you will need to consider configuring key storage. Out of the box, Umbraco enables DPAPI but doesn't configure it at all. If you don't configure storage, you may experience problems similar to those I described above in load-balanced scenarios.

If you don't want to stand up external storage, there is a community package available that persists keys to the Umbraco database instead.

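For any CMS backed by a database, persisting keys there is also possible with the first-party Microsoft.AspNetCore.DataProtection.EntityFrameworkCore package; the Umbraco community package works along similar lines. A sketch, where MyKeysContext is a hypothetical DbContext and the connection string is a placeholder:

```csharp
using Microsoft.AspNetCore.DataProtection;
using Microsoft.AspNetCore.DataProtection.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore;

// Hypothetical DbContext exposing the key table the package requires
public class MyKeysContext : DbContext, IDataProtectionKeyContext
{
    public MyKeysContext(DbContextOptions<MyKeysContext> options) : base(options) { }

    public DbSet<DataProtectionKey> DataProtectionKeys { get; set; }
}

// Registration (e.g. in Program.cs)
services.AddDbContext<MyKeysContext>(options => options.UseSqlServer("<ConnectionString>"));
services
    .AddDataProtection()
    .SetApplicationName("MyCMS")
    .PersistKeysToDbContext<MyKeysContext>();
```
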
Conclusion

Using the Data Protection API (DPAPI) is essential for securing short-lived data in your CMS. Unlike the old MachineKey, DPAPI provides automatic key rotation, modern encryption algorithms, and flexible key storage that can be shared across servers.

For load-balanced environments, always configure persistent key storage using File System, Azure, or Redis.

If you're using Xperience by Kentico, key storage is already managed for you, but in Umbraco you must configure it yourself.