caching-strategies

verified

Backend caching patterns with Redis including write-through, write-behind, cache-aside, and invalidation strategies. Use when implementing Redis cache, managing TTL/expiration, preventing cache stampede, or optimizing cache hit rates.

Marketplace

orchestkit

yonatangross/orchestkit

Plugin

orchestkit-complete

development

Repository

yonatangross/orchestkit
33 stars

./skills/caching-strategies/SKILL.md

Last Verified

January 24, 2026

Install Skill

npx add-skill https://github.com/yonatangross/orchestkit/blob/main/./skills/caching-strategies/SKILL.md -a claude-code --skill caching-strategies

Installation paths:

Claude
.claude/skills/caching-strategies/

Instructions

# Backend Caching Strategies

Optimize performance with Redis caching patterns and smart invalidation.

## Pattern Selection

| Pattern | Write | Read | Consistency | Use Case |
|---------|-------|------|-------------|----------|
| Cache-Aside | DB first | Cache → DB | Eventual | General purpose |
| Write-Through | Cache + DB | Cache | Strong | Critical data |
| Write-Behind | Cache, async DB | Cache | Eventual | High write load |
| Read-Through | Cache handles | Cache → DB | Eventual | Simplified reads |
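
Read-through (last row above) is usually provided by a caching library or proxy that wraps the data source. A minimal hand-rolled sketch follows; the `loader` callback and the constructor shape are illustrative assumptions, not part of redis-py:

```python
import asyncio
import json
from typing import Awaitable, Callable

class ReadThroughCache:
    """The cache layer owns the fetch path; callers never query the DB directly."""

    def __init__(
        self,
        redis_client,
        loader: Callable[[str], Awaitable[dict]],
        ttl: int = 3600,
    ):
        self.redis = redis_client
        self.loader = loader  # illustrative stand-in for the real data source
        self.ttl = ttl

    async def get(self, key: str) -> dict:
        cached = await self.redis.get(key)
        if cached is not None:
            return json.loads(cached)
        # Miss: the cache itself calls the loader, then stores the result
        value = await self.loader(key)
        await self.redis.setex(key, self.ttl, json.dumps(value))
        return value
```

The difference from cache-aside is who does the fetching: here callers only ever talk to the cache, which simplifies call sites at the cost of a less flexible fetch path.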

## Cache-Aside (Lazy Loading)

```python
import redis.asyncio as redis
from typing import Awaitable, Callable, TypeVar
import json

T = TypeVar("T")

class CacheAside:
    def __init__(self, redis_client: redis.Redis, default_ttl: int = 3600):
        self.redis = redis_client
        self.ttl = default_ttl

    async def get_or_set(
        self,
        key: str,
        fetch_fn: Callable[[], Awaitable[T]],
        ttl: int | None = None,
        serialize: Callable[[T], str] = json.dumps,
        deserialize: Callable[[str], T] = json.loads,
    ) -> T:
        """Get from cache, or fetch and cache."""
        # Try cache first
        cached = await self.redis.get(key)
        if cached is not None:  # truthiness check would treat empty payloads as misses
            return deserialize(cached)

        # Cache miss - fetch from source
        value = await fetch_fn()

        # Store in cache
        await self.redis.setex(
            key,
            ttl or self.ttl,
            serialize(value),
        )
        return value

# Usage
cache = CacheAside(redis_client)

async def get_analysis(analysis_id: str) -> Analysis:
    return await cache.get_or_set(
        key=f"analysis:{analysis_id}",
        fetch_fn=lambda: repo.get_by_id(analysis_id),
        ttl=1800,  # 30 minutes
    )
```
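
Cache-aside is vulnerable to stampedes: when a hot key expires, every concurrent caller misses and hits the database at once. A common mitigation is a short-lived per-key lock using Redis `SET` with the `NX` and `EX` options so only one caller rebuilds the key. This is a sketch; `get_with_lock`, the `lock:` prefix, and the retry timings are illustrative choices:

```python
import asyncio
import json

async def get_with_lock(redis_client, key, fetch_fn, ttl=3600, lock_ttl=10):
    """Cache-aside with a per-key lock so only one caller rebuilds on a miss."""
    cached = await redis_client.get(key)
    if cached is not None:
        return json.loads(cached)

    lock_key = f"lock:{key}"  # illustrative naming convention
    # NX + EX: only one concurrent caller acquires the lock; it auto-expires
    got_lock = await redis_client.set(lock_key, "1", nx=True, ex=lock_ttl)
    if got_lock:
        try:
            value = await fetch_fn()
            await redis_client.setex(key, ttl, json.dumps(value))
            return value
        finally:
            await redis_client.delete(lock_key)

    # Lost the race: poll the cache briefly while the winner rebuilds it
    for _ in range(50):
        await asyncio.sleep(0.1)
        cached = await redis_client.get(key)
        if cached is not None:
            return json.loads(cached)
    # Fall back to the source if the winner never populated the cache
    return await fetch_fn()
```

The lock's own TTL matters: it must outlive the slowest expected rebuild, or a second caller can acquire it mid-rebuild and duplicate the work.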

## Write-Through Cache

```python
class WriteThroughCache:
    def __init__(self, redis_client: redis.Redis, ttl: int = 3600):
        self.redis = redis_client
        self.ttl = ttl

    async def write(
        self,
        key: str,
        value: T,
        persist_fn: Callable[[T], Awaitable[None]],
        serialize: Callable[[T], str] = json.dumps,
    ) -> None:
        """Write through: DB first, then cache. persist_fn is an illustrative hook."""
        # Write the source of truth first so the cache never leads the DB
        await persist_fn(value)
        # Then refresh the cache so subsequent reads are hits
        await self.redis.setex(key, self.ttl, serialize(value))
```