
cacheHandlers

The cacheHandlers configuration allows you to define custom cache storage implementations for 'use cache' and 'use cache: remote'. This enables you to store cached components and functions in external services or customize the caching behavior. 'use cache: private' is not configurable.

When to use custom cache handlers

Most applications don't need custom cache handlers. The default in-memory cache works well in the typical use case.

Custom cache handlers are for advanced scenarios where you need to either share cache across multiple instances or change where the cache is stored. For example, you can configure a custom remote handler for external storage (like a key-value store), then use 'use cache' in your code for in-memory caching and 'use cache: remote' for the external storage, allowing different caching strategies within the same application.
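As a rough sketch of that split (the data-fetching helpers here are hypothetical placeholders, not Next.js APIs):

app/products/page.tsx
async function getNavigation() {
  'use cache' // handled by the default handler (in-memory unless overridden)
  return await fetchNavigationLinks() // hypothetical data source
}

async function getProducts() {
  'use cache: remote' // handled by the configured remote handler
  return await fetchProductsFromCms() // hypothetical external data source
}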

Sharing cache across instances

The default in-memory cache is isolated to each Next.js process. If you're running multiple servers or containers, each instance will have its own cache that isn't shared with others and is lost on restart.

Custom handlers let you integrate with shared storage systems (like Redis, Memcached, or DynamoDB) that all your Next.js instances can access.

Changing storage type

You might want to store cache differently than the default in-memory approach. You can implement a custom handler to store cache on disk, in a database, or in an external caching service. Reasons include: persistence across restarts, reducing memory usage, or integrating with existing infrastructure.

Usage

To configure custom cache handlers:

  1. Define your cache handler in a separate file (see the examples below for implementation details).
  2. Reference the file path in your Next.js config file:
next.config.ts
import type { NextConfig } from 'next'
 
const nextConfig: NextConfig = {
  cacheHandlers: {
    default: './cache-handlers/default-handler.js',
    remote: './cache-handlers/remote-handler.js',
  },
}
 
export default nextConfig

Handler types

  • default: Used by the 'use cache' directive
  • remote: Used by the 'use cache: remote' directive

If you don't configure cacheHandlers, Next.js uses an in-memory LRU (Least Recently Used) cache for both default and remote. You can view the default implementation as a reference.

You can also define additional named handlers (e.g., sessions, analytics) and reference them with 'use cache: <name>'.
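For example, a config along these lines could register a sessions handler alongside default and remote (the sessions handler file path is illustrative):

next.config.ts
import type { NextConfig } from 'next'

const nextConfig: NextConfig = {
  cacheHandlers: {
    default: './cache-handlers/default-handler.js',
    remote: './cache-handlers/remote-handler.js',
    sessions: './cache-handlers/sessions-handler.js', // custom named handler
  },
}

export default nextConfig

Functions then opt into it with 'use cache: sessions' as the first statement in their body.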

Note that 'use cache: private' does not use cache handlers and cannot be customized.

API Reference

A cache handler must implement the CacheHandler interface with the following methods:

get()

Retrieve a cache entry for the given cache key.

get(cacheKey: string, softTags: string[]): Promise<CacheEntry | undefined>
Parameter | Type     | Description
cacheKey  | string   | The unique key for the cache entry.
softTags  | string[] | Tags to check for staleness (used in some cache strategies).

Returns a CacheEntry object if found, or undefined if not found or expired.

Your get method should retrieve the cache entry from storage, check if it has expired based on the revalidate time, and return undefined for missing or expired entries.

class CacheHandler {
  async get(cacheKey, softTags) {
    const entry = cache.get(cacheKey)
    if (!entry) return undefined
 
    // Check if expired
    const now = Date.now()
    if (now > entry.timestamp + entry.revalidate * 1000) {
      return undefined
    }
 
    return entry
  }
}

set()

Store a cache entry for the given cache key.

set(cacheKey: string, pendingEntry: Promise<CacheEntry>): Promise<void>
Parameter    | Type                | Description
cacheKey     | string              | The unique key to store the entry under.
pendingEntry | Promise<CacheEntry> | A promise that resolves to the cache entry.

The entry may still be pending when this is called (i.e., its value stream may still be receiving data). Your handler should await the promise before processing the entry.

Returns Promise<void>.

Your set method must await the pendingEntry promise before storing it, since the cache entry may still be generating when this method is called. Once resolved, store the entry in your cache system.

class CacheHandler {
  async set(cacheKey, pendingEntry) {
    // Wait for the entry to be ready
    const entry = await pendingEntry
 
    // Store in your cache system
    cache.set(cacheKey, entry)
  }
}

refreshTags()

Called periodically before starting a new request to sync with external tag services.

refreshTags(): Promise<void>

This is useful if you're coordinating cache invalidation across multiple instances or services.

Returns Promise<void>.

For in-memory caches, this can be a no-op. For distributed caches, use this to sync tag state from an external service or database before processing requests.

class CacheHandler {
  async refreshTags() {
    // For in-memory cache, no action needed
    // For distributed cache, sync tag state from external service
  }
}
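For the distributed case, one possible approach (a sketch only; the redis client usage and the 'cache-tags' hash key are assumptions, not part of the cache handler API) is to pull tag revalidation timestamps written by other instances into a local map before each request:

cache-handlers/distributed-handler.js
const { createClient } = require('redis')

const redisClient = createClient({ url: process.env.REDIS_URL })
const connected = redisClient.connect() // assumed shared Redis instance

// Local view of tag revalidation timestamps, refreshed before each request
const tagTimestamps = new Map()

module.exports = class DistributedCacheHandler {
  async refreshTags() {
    await connected

    // Assumes updateTags() on other instances writes timestamps into
    // a shared hash named 'cache-tags' (tag -> revalidation time in ms).
    const stored = await redisClient.hGetAll('cache-tags')
    for (const [tag, timestamp] of Object.entries(stored)) {
      tagTimestamps.set(tag, Number(timestamp))
    }
  }

  // get(), set(), getExpiration(), and updateTags() omitted for brevity
}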

getExpiration()

Get the maximum revalidation timestamp for a set of tags.

getExpiration(tags: string[]): Promise<number>
Parameter | Type     | Description
tags      | string[] | Array of tags to check expiration for.

Returns:

  • 0 if none of the tags were ever revalidated
  • A timestamp (in milliseconds) representing the most recent revalidation
  • Infinity to indicate soft tags should be checked in the get method instead

If you're not tracking tag revalidation timestamps, return 0. Otherwise, find the most recent revalidation timestamp across all the provided tags. Return Infinity if you prefer to handle soft tag checking in the get method.

class CacheHandler {
  async getExpiration(tags) {
    // Return 0 if not tracking tag revalidation
    return 0
 
    // Or return the most recent revalidation timestamp
    // return Math.max(...tags.map(tag => tagTimestamps.get(tag) || 0));
  }
}

updateTags()

Called when tags are revalidated or expired.

updateTags(tags: string[], durations?: { expire?: number }): Promise<void>
Parameter | Type                | Description
tags      | string[]            | Array of tags to update.
durations | { expire?: number } | Optional expiration duration in seconds.

Your handler should update its internal state to mark these tags as invalidated.

Returns Promise<void>.

When tags are revalidated, your handler should invalidate all cache entries that have any of those tags. Iterate through your cache and remove entries whose tags match the provided list.

class CacheHandler {
  async updateTags(tags, durations) {
    // Invalidate all cache entries with matching tags
    for (const [key, entry] of cache.entries()) {
      if (entry.tags.some((tag) => tags.includes(tag))) {
        cache.delete(key)
      }
    }
  }
}

CacheEntry Type

The CacheEntry object has the following structure:

interface CacheEntry {
  value: ReadableStream<Uint8Array>
  tags: string[]
  stale: number
  timestamp: number
  expire: number
  revalidate: number
}
Property   | Type                       | Description
value      | ReadableStream<Uint8Array> | The cached data as a stream.
tags       | string[]                   | Cache tags (excluding soft tags).
stale      | number                     | Duration in seconds for client-side staleness.
timestamp  | number                     | When the entry was created (timestamp in milliseconds).
expire     | number                     | How long the entry is allowed to be used (in seconds).
revalidate | number                     | How long until the entry should be revalidated (in seconds).

Good to know:

  • The value is a ReadableStream. Use .tee() if you need to both read and store the stream data (see the sketch below).
  • If the stream errors with partial data, your handler must decide whether to keep the partial cache or discard it.
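A minimal sketch of the .tee() pattern inside set(), assuming a module-level cache Map like the one in the examples below:

class CacheHandler {
  async set(cacheKey, pendingEntry) {
    const entry = await pendingEntry

    // Split the stream into two branches: one to read now, one to keep.
    const [forReading, forCaching] = entry.value.tee()

    // Buffer one branch (e.g., to serialize it for external storage).
    const chunks = []
    const reader = forReading.getReader()
    while (true) {
      const { done, value } = await reader.read()
      if (done) break
      chunks.push(value)
    }
    // `chunks` now holds the full payload and could be persisted here.

    // Keep the unread branch on the stored entry. A stream can only be
    // consumed once, so a production handler typically stores the buffered
    // chunks and rebuilds a fresh stream per get(), as in the Redis example.
    cache.set(cacheKey, { ...entry, value: forCaching })
  }
}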

Examples

Basic in-memory cache handler

Here's a minimal implementation using a Map for storage. This example demonstrates the core concepts, but for a production-ready implementation with LRU eviction, error handling, and tag management, see the default cache handler.

cache-handlers/memory-handler.js
const cache = new Map()
const pendingSets = new Map()
 
module.exports = class MemoryCacheHandler {
  async get(cacheKey, softTags) {
    // Wait for any pending set operation to complete
    const pendingPromise = pendingSets.get(cacheKey)
    if (pendingPromise) {
      await pendingPromise
    }
 
    const entry = cache.get(cacheKey)
    if (!entry) {
      return undefined
    }
 
    // Check if entry has expired
    const now = Date.now()
    if (now > entry.timestamp + entry.revalidate * 1000) {
      return undefined
    }
 
    return entry
  }
 
  async set(cacheKey, pendingEntry) {
    // Create a promise to track this set operation
    let resolvePending
    const pendingPromise = new Promise((resolve) => {
      resolvePending = resolve
    })
    pendingSets.set(cacheKey, pendingPromise)
 
    try {
      // Wait for the entry to be ready
      const entry = await pendingEntry
 
      // Store the entry in the cache
      cache.set(cacheKey, entry)
    } finally {
      resolvePending()
      pendingSets.delete(cacheKey)
    }
  }
 
  async refreshTags() {
    // No-op for in-memory cache
  }
 
  async getExpiration(tags) {
    // Return 0 to indicate no tags have been revalidated
    return 0
  }
 
  async updateTags(tags, durations) {
    // Implement tag-based invalidation
    for (const [key, entry] of cache.entries()) {
      if (entry.tags.some((tag) => tags.includes(tag))) {
        cache.delete(key)
      }
    }
  }
}

External storage pattern

For durable storage like Redis or a database, you'll need to serialize the cache entries. Here's a simple Redis example:

cache-handlers/redis-handler.js
const { createClient } = require('redis')
 
module.exports = class RedisCacheHandler {
  constructor() {
    this.client = createClient({ url: process.env.REDIS_URL })
    this.client.connect()
  }
 
  async get(cacheKey, softTags) {
    // Retrieve from Redis
    const stored = await this.client.get(cacheKey)
    if (!stored) return undefined
 
    // Deserialize the entry
    const data = JSON.parse(stored)
 
    // Reconstruct the ReadableStream from stored data
    return {
      value: new ReadableStream({
        start(controller) {
          controller.enqueue(Buffer.from(data.value, 'base64'))
          controller.close()
        },
      }),
      tags: data.tags,
      stale: data.stale,
      timestamp: data.timestamp,
      expire: data.expire,
      revalidate: data.revalidate,
    }
  }
 
  async set(cacheKey, pendingEntry) {
    const entry = await pendingEntry
 
    // Read the stream to get the data
    const reader = entry.value.getReader()
    const chunks = []
 
    try {
      while (true) {
        const { done, value } = await reader.read()
        if (done) break
        chunks.push(value)
      }
    } finally {
      reader.releaseLock()
    }
 
    // Combine chunks and serialize for Redis storage
    const data = Buffer.concat(chunks.map((chunk) => Buffer.from(chunk)))
 
    await this.client.set(
      cacheKey,
      JSON.stringify({
        value: data.toString('base64'),
        tags: entry.tags,
        stale: entry.stale,
        timestamp: entry.timestamp,
        expire: entry.expire,
        revalidate: entry.revalidate,
      }),
      { EX: entry.expire } // Use Redis TTL for automatic expiration
    )
  }
 
  async refreshTags() {
    // No-op for basic Redis implementation
    // Could sync with external tag service if needed
  }
 
  async getExpiration(tags) {
    // Return 0 to indicate no tags have been revalidated
    // Could query Redis for tag expiration timestamps if tracking them
    return 0
  }
 
  async updateTags(tags, durations) {
    // Implement tag-based invalidation if needed
    // Could iterate over keys with matching tags and delete them
  }
}

Platform Support

Deployment Option | Supported
Node.js server    | Yes
Docker container  | Yes
Static export     | No
Adapters          | Platform-specific

Version History

Version | Changes
v16.0.0 | cacheHandlers introduced.
