HybridCache

HybridCache is a 3-tier caching system that speeds up your Flowfull backend by serving frequently accessed data from in-process memory and Redis instead of hitting the database on every request.

The Problem It Solves

Without caching, every request hits your database:

Request → Database → Response  (20-50ms per request)

With HybridCache:

Request → Memory Cache → Response  (1-2ms per request) ⚡

Result: 97% cache hit rate, 50x faster responses!

The 3-Tier Architecture

┌──────────────────────────────────────────────────────────┐
│                       REQUEST FLOW                       │
├──────────────────────────────────────────────────────────┤
│                                                          │
│  1️⃣ Check LRU Cache (In-Memory)                          │
│     ├─ Hit? Return immediately (1-2ms)                   │
│     └─ Miss? Go to step 2                                │
│                                                          │
│  2️⃣ Check Redis Cache (Shared)                           │
│     ├─ Hit? Backfill LRU + Return (5-10ms)               │
│     └─ Miss? Go to step 3                                │
│                                                          │
│  3️⃣ Query Database (Source of Truth)                     │
│     └─ Backfill Redis + LRU + Return (20-50ms)           │
│                                                          │
└──────────────────────────────────────────────────────────┘

Why 3 Tiers?

| Tier | Speed | Shared? | Use Case |
| --- | --- | --- | --- |
| LRU (Memory) | ⚡⚡⚡ 1-2ms | ❌ Per instance | Hot data (frequently accessed) |
| Redis | ⚡⚡ 5-10ms | ✅ All instances | Warm data (shared across servers) |
| Database | ⚡ 20-50ms | ✅ Source of truth | Cold data (rarely accessed) |
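
Conceptually, the flow above is a read-through chain with backfill. The sketch below is only illustrative (the real logic lives inside HybridCache); it assumes an lru-cache instance, an ioredis client, and a caller-supplied loadFromDb function.

typescript
import { LRUCache } from 'lru-cache';
import { Redis } from 'ioredis';

// Illustrative read-through chain (not the actual HybridCache internals):
// try the in-process LRU, then Redis, then the database, backfilling on the way up.
async function readThrough<T extends {}>(
  key: string,
  lru: LRUCache<string, T>,
  redis: Redis,
  loadFromDb: () => Promise<T | undefined>,
  ttlSeconds: number
): Promise<T | undefined> {
  // Tier 1: in-memory LRU (~1-2ms), local to this instance
  const local = lru.get(key);
  if (local !== undefined) return local;

  // Tier 2: Redis (~5-10ms), shared by all instances; backfill the LRU on a hit
  const cached = await redis.get(key);
  if (cached !== null) {
    const value = JSON.parse(cached) as T;
    lru.set(key, value);
    return value;
  }

  // Tier 3: database (~20-50ms), the source of truth; backfill both tiers
  const value = await loadFromDb();
  if (value !== undefined) {
    lru.set(key, value);
    await redis.set(key, JSON.stringify(value), 'EX', ttlSeconds);
  }
  return value;
}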

Quick Start

1. Install Dependencies

bash
npm install ioredis lru-cache
bash
yarn add ioredis lru-cache
bash
bun add ioredis lru-cache

2. Configure Environment

env
# Enable caching
CACHE_ENABLED=true

# Redis connection
REDIS_URL=redis://localhost:6379

# Optional: Redis password
REDIS_PASSWORD=your-password

3. Basic Usage

typescript
import { HybridCache } from './lib/cache/hybrid-cache';

// Create cache instance
const userCache = new HybridCache<UserData>({
  cacheType: 'userContext',
  ttl: 300,           // 5 minutes
  maxSize: 10000,     // 10k items in LRU
  keyPrefix: 'user'
});

// Get from cache
const user = await userCache.get(userId);

// Set in cache
await userCache.set(userId, userData, 300);

// Delete from cache
await userCache.delete(userId);

// Clear all cache
await userCache.clear();

Real-World Examples

Example 1: User Profile Cache

typescript
const profileCache = new HybridCache<UserProfile>({
  cacheType: 'userProfile',
  ttl: 600,        // 10 minutes
  maxSize: 5000,
  keyPrefix: 'profile'
});

async function getUserProfile(userId: string) {
  // Try cache first
  let profile = await profileCache.get(userId);
  
  if (profile) {
    console.log('✅ Cache hit!');
    return profile;
  }
  
  // Cache miss - query database
  console.log('❌ Cache miss - querying DB');
  profile = await db.selectFrom('users')
    .where('id', '=', userId)
    .selectAll()
    .executeTakeFirst();
  
  // Store in cache for next time
  if (profile) {
    await profileCache.set(userId, profile, 600);
  }
  
  return profile;
}
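
Because this cache-aside pattern (try the cache, fall back to the database, write back on a miss) repeats for every cached lookup, it can be worth wrapping once. The helper below is only a sketch built on the get/set methods shown above, not part of the HybridCache API; getOrLoad is a hypothetical name.

typescript
// Sketch: generic cache-aside wrapper over the documented get/set methods.
// Not part of the HybridCache API; the helper name is hypothetical.
async function getOrLoad<T>(
  cache: HybridCache<T>,
  key: string,
  loader: () => Promise<T | undefined>,
  ttlSeconds: number
): Promise<T | undefined> {
  const hit = await cache.get(key);
  if (hit != null) return hit;

  const value = await loader();
  if (value != null) {
    await cache.set(key, value, ttlSeconds);
  }
  return value;
}

// Same behavior as getUserProfile above, expressed with the wrapper
const profile = await getOrLoad(
  profileCache,
  userId,
  () => db.selectFrom('users').where('id', '=', userId).selectAll().executeTakeFirst(),
  600
);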

Example 2: Session Cache

typescript
const sessionCache = new HybridCache<SessionData>({
  cacheType: 'session',
  ttl: 3600,       // 1 hour
  maxSize: 50000,  // 50k sessions
  keyPrefix: 'sess'
});

async function validateSession(sessionId: string) {
  // Check cache first (1-2ms)
  let session = await sessionCache.get(sessionId);
  
  if (!session) {
    // Validate with Flowless (20-50ms)
    session = await flowless.validateSession(sessionId);
    
    // Cache for next request
    if (session) {
      await sessionCache.set(sessionId, session, 3600);
    }
  }
  
  return session;
}

Example 3: API Response Cache

typescript
const apiCache = new HybridCache<ApiResponse>({
  cacheType: 'apiResponse',
  ttl: 60,         // 1 minute
  maxSize: 1000,
  keyPrefix: 'api'
});

app.get('/api/stats', async (c) => {
  const cacheKey = 'global-stats';
  
  // Try cache
  let stats = await apiCache.get(cacheKey);
  
  if (!stats) {
    // Expensive query
    stats = await db.selectFrom('stats')
      .select([
        db.fn.count('id').as('total_users'),
        db.fn.sum('revenue').as('total_revenue')
      ])
      .executeTakeFirst();
    
    // Cache for 1 minute
    await apiCache.set(cacheKey, stats, 60);
  }
  
  return c.json(stats);
});

Performance Metrics

Real-world performance from production systems:

| Metric | Value |
| --- | --- |
| Cache Hit Rate | 97% |
| LRU Hit Time | 1-2ms |
| Redis Hit Time | 5-10ms |
| DB Query Time | 20-50ms |
| Speedup | 50x faster |

Configuration Options

typescript
interface HybridCacheOptions {
  cacheType: string;      // Cache identifier
  ttl: number;            // Time to live (seconds)
  maxSize: number;        // Max items in LRU
  keyPrefix: string;      // Redis key prefix
  redisUrl?: string;      // Redis connection
  enabled?: boolean;      // Enable/disable cache
}
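
For reference, here is one way these options might be filled in, wiring the optional redisUrl and enabled fields to the environment variables from the Quick Start. The Order type and the concrete values are placeholders.

typescript
import { HybridCache } from './lib/cache/hybrid-cache';

// Sketch: every option spelled out, with the optional fields driven by env vars.
// The Order type and the chosen values are placeholders.
const orderCache = new HybridCache<Order>({
  cacheType: 'order',                             // cache identifier
  ttl: 120,                                       // 2 minutes
  maxSize: 2000,                                  // cap on LRU entries
  keyPrefix: 'order',                             // namespaces keys in Redis
  redisUrl: process.env.REDIS_URL,                // optional, e.g. redis://localhost:6379
  enabled: process.env.CACHE_ENABLED === 'true'   // optional, turn off in tests
});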

Cache Invalidation

Manual Invalidation

typescript
// Delete specific key
await cache.delete(userId);

// Clear all cache
await cache.clear();

Automatic Invalidation

typescript
// After user update
await db.updateTable('users')
  .set({ name: 'New Name' })
  .where('id', '=', userId)
  .execute();

// Invalidate cache
await userCache.delete(userId);
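
To keep the write and its invalidation from drifting apart, you can pair them in a single helper. This is only a sketch of the pattern (updateUserName is a hypothetical function, reusing the userCache from the Quick Start):

typescript
// Sketch: pair the write with its cache invalidation so neither is forgotten.
// updateUserName is a hypothetical helper; userCache comes from the Quick Start.
async function updateUserName(userId: string, name: string) {
  await db.updateTable('users')
    .set({ name })
    .where('id', '=', userId)
    .execute();

  // Invalidate right after the write succeeds
  await userCache.delete(userId);
}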

Time-Based Invalidation

typescript
// Short TTL for frequently changing data
const priceCache = new HybridCache({
  cacheType: 'price',
  ttl: 60,            // 1 minute
  maxSize: 1000,
  keyPrefix: 'price'
});

// Long TTL for static data
const configCache = new HybridCache({
  cacheType: 'config',
  ttl: 86400,         // 24 hours
  maxSize: 100,
  keyPrefix: 'config'
});

Best Practices

✅ Do

  • Use short TTL for frequently changing data
  • Use long TTL for static data
  • Invalidate cache after updates
  • Monitor cache hit rates
  • Use appropriate maxSize

โŒ Don't โ€‹

  • Cache sensitive data without encryption
  • Set TTL too high (stale data)
  • Forget to invalidate after updates
  • Cache everything (memory limits)
  • Ignore cache errors (degrade gracefully instead; see the sketch below)
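
On the last point: if Redis or the cache layer fails, requests should still be served from the database. A minimal sketch of that fallback, reusing the profileCache from Example 1:

typescript
// Sketch: treat cache failures as misses so a Redis outage degrades gracefully
// instead of breaking requests. Reuses profileCache from Example 1.
async function getUserProfileSafely(userId: string) {
  let cached;
  try {
    cached = await profileCache.get(userId);
  } catch (err) {
    console.error('Cache read failed, falling back to DB:', err);
  }
  if (cached) return cached;

  const profile = await db.selectFrom('users')
    .where('id', '=', userId)
    .selectAll()
    .executeTakeFirst();

  if (profile) {
    // Best-effort write-back; a failure here should not fail the request
    await profileCache.set(userId, profile, 600).catch(() => {});
  }
  return profile;
}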

Monitoring

typescript
// Get cache stats
const stats = cache.getStats();

console.log('Hits:', stats.hits);
console.log('Misses:', stats.misses);
console.log('Hit Rate:', stats.hitRate);
console.log('Size:', stats.size);
