
HybridCache Guide

Complete guide to implementing the 3-tier caching system in Flowfull for lightning-fast performance.

What is HybridCache?

HybridCache is a 3-tier caching system that makes your Flowfull backend blazingly fast by storing frequently accessed data in memory and Redis.

The Problem It Solves

Without caching, every request hits your database:

Request → Database → Response  (20-50ms per request)

With HybridCache:

Request → Memory Cache → Response  (1-2ms per request) ⚡

Result: 97% cache hit rate, 50x faster responses!

The 3-Tier Architecture

┌──────────────────────────────────────────────────────────┐
│                    REQUEST FLOW                          │
├──────────────────────────────────────────────────────────┤
│                                                          │
│  1️⃣ Check LRU Cache (In-Memory)                         │
│     ├─ Hit? Return immediately (1-2ms)                  │
│     └─ Miss? Go to step 2                               │
│                                                          │
│  2️⃣ Check Redis Cache (Shared)                          │
│     ├─ Hit? Backfill LRU + Return (5-10ms)              │
│     └─ Miss? Go to step 3                               │
│                                                          │
│  3️⃣ Query Database (Source of Truth)                    │
│     └─ Backfill Redis + LRU + Return (20-50ms)          │
│                                                          │
└──────────────────────────────────────────────────────────┘

Why 3 Tiers?

| Tier | Speed | Shared? | Use Case |
|---|---|---|---|
| LRU (Memory) | ⚡⚡⚡ 1-2ms | ❌ Per instance | Hot data (frequently accessed) |
| Redis | ⚡⚡ 5-10ms | ✅ All instances | Warm data (shared across servers) |
| Database | ⚡ 20-50ms | ✅ Source of truth | Cold data (rarely accessed) |
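
Under the hood, a read walks the tiers in order and backfills the faster tiers on the way back up. Below is a rough sketch of that lookup using the ioredis and lru-cache packages (installed in the next section); the actual internals of HybridCache may differ:

typescript
import Redis from 'ioredis';
import { LRUCache } from 'lru-cache';

const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');
const lru = new LRUCache<string, string>({ max: 10_000, ttl: 300_000 }); // 5 minutes in ms

// Illustrative tiered lookup: LRU -> Redis -> database. `loadFromDb` is a placeholder
// for whatever query produces the value from the source of truth.
async function tieredGet(
  key: string,
  loadFromDb: (key: string) => Promise<string | null>
): Promise<string | null> {
  const fromLru = lru.get(key);
  if (fromLru !== undefined) return fromLru;        // 1️⃣ memory hit (1-2ms)

  const fromRedis = await redis.get(key);
  if (fromRedis !== null) {
    lru.set(key, fromRedis);                        // 2️⃣ backfill LRU, then return (5-10ms)
    return fromRedis;
  }

  const fromDb = await loadFromDb(key);             // 3️⃣ source of truth (20-50ms)
  if (fromDb !== null) {
    await redis.set(key, fromDb, 'EX', 300);        // backfill Redis with a 5-minute TTL
    lru.set(key, fromDb);                           // backfill LRU
  }
  return fromDb;
}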

Installation

bash
npm install ioredis lru-cache
bash
yarn add ioredis lru-cache
bash
bun add ioredis lru-cache

Configuration

Environment Variables

env
# Enable caching
CACHE_ENABLED=true

# Redis connection
REDIS_URL=redis://localhost:6379

# Optional: Redis password
REDIS_PASSWORD=your-password

Basic Setup

typescript
import { HybridCache } from './lib/cache/hybrid-cache';

// Create cache instance
const userCache = new HybridCache<UserData>({
  cacheType: 'userContext',
  ttl: 300,           // 5 minutes
  maxSize: 10000,     // 10k items in LRU
  keyPrefix: 'user'
});

Basic Usage

Get from Cache

typescript
const user = await userCache.get(userId);

if (user) {
  console.log('✅ Cache hit!');
  return user;
}

console.log('❌ Cache miss - querying database');

Set in Cache

typescript
await userCache.set(userId, userData, 300);  // TTL: 5 minutes

Delete from Cache

typescript
await userCache.delete(userId);

Clear All Cache

typescript
await userCache.clear();
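
These operations compose into a common read-through pattern: try the cache, fall back to a loader, then populate the cache for next time. A small helper along these lines keeps call sites tidy (the getOrSet name is ours, not part of the HybridCache API):

typescript
// Generic read-through helper built on the get/set operations above.
async function getOrSet<T>(
  cache: HybridCache<T>,
  key: string,
  loader: () => Promise<T | undefined>,
  ttlSeconds: number
): Promise<T | undefined> {
  const cached = await cache.get(key);
  if (cached != null) return cached;          // hit in LRU or Redis

  const fresh = await loader();               // miss: hit the source of truth
  if (fresh != null) {
    await cache.set(key, fresh, ttlSeconds);  // cache for subsequent requests
  }
  return fresh;
}

// Usage (loadUserFromDb is a placeholder for your own query):
// const user = await getOrSet(userCache, userId, () => loadUserFromDb(userId), 300);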

Real-World Examples

Example 1: User Profile Cache

typescript
import { HybridCache } from './lib/cache/hybrid-cache';

const profileCache = new HybridCache<UserProfile>({
  cacheType: 'userProfile',
  ttl: 600,        // 10 minutes
  maxSize: 5000,
  keyPrefix: 'profile'
});

async function getUserProfile(userId: string) {
  // Try cache first
  let profile = await profileCache.get(userId);
  
  if (profile) {
    console.log('✅ Cache hit!');
    return profile;
  }
  
  // Cache miss - query database
  console.log('❌ Cache miss - querying DB');
  profile = await db.selectFrom('users')
    .where('id', '=', userId)
    .selectAll()
    .executeTakeFirst();
  
  // Store in cache for next time
  if (profile) {
    await profileCache.set(userId, profile, 600);
  }
  
  return profile;
}

Example 2: Session Cache

typescript
const sessionCache = new HybridCache<SessionData>({
  cacheType: 'session',
  ttl: 3600,       // 1 hour
  maxSize: 50000,  // 50k sessions
  keyPrefix: 'sess'
});

async function validateSession(sessionId: string) {
  // Check cache first (1-2ms)
  let session = await sessionCache.get(sessionId);
  
  if (!session) {
    // Validate with Flowless (20-50ms)
    session = await flowless.validateSession(sessionId);
    
    // Cache for next request
    if (session) {
      await sessionCache.set(sessionId, session, 3600);
    }
  }
  
  return session;
}
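
Wired into a Hono-style middleware (matching the route handler style used in the next example), the cached validation keeps auth overhead to a couple of milliseconds on repeat requests. The header name and error shape here are illustrative:

typescript
app.use('/api/*', async (c, next) => {
  const sessionId = c.req.header('x-session-id');   // illustrative: read the session id from a header
  const session = sessionId ? await validateSession(sessionId) : null;

  if (!session) {
    return c.json({ error: 'Unauthorized' }, 401);
  }

  await next();   // session is valid; continue to the route handler
});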

Example 3: API Response Cache

typescript
const apiCache = new HybridCache<ApiResponse>({
  cacheType: 'apiResponse',
  ttl: 60,         // 1 minute
  maxSize: 1000,
  keyPrefix: 'api'
});

app.get('/api/stats', async (c) => {
  const cacheKey = 'global-stats';
  
  // Try cache
  let stats = await apiCache.get(cacheKey);
  
  if (!stats) {
    // Expensive query
    stats = await db.selectFrom('stats')
      .select([
        db.fn.count('id').as('total_users'),
        db.fn.sum('revenue').as('total_revenue')
      ])
      .executeTakeFirst();
    
    // Cache for 1 minute
    await apiCache.set(cacheKey, stats, 60);
  }
  
  return c.json(stats);
});

Performance Metrics

Typical Performance

| Operation | Latency | Cache Tier |
|---|---|---|
| LRU Hit | 1-2ms | ⚡⚡⚡ Memory |
| Redis Hit | 5-10ms | ⚡⚡ Redis |
| Database Query | 20-50ms | ⚡ Database |

Real-World Results

This caching pattern is used in many production systems worldwide with excellent results; representative numbers look like this:

Cache Statistics:
├─ Total Requests: 1,000,000+
├─ Cache Hits: 97%
├─ Cache Misses: 3%
├─ Avg Response Time: 2.3ms
└─ Performance Improvement: 50x faster

Configuration Options

typescript
interface HybridCacheOptions {
  cacheType: string;      // Cache identifier
  ttl: number;            // Time to live (seconds)
  maxSize: number;        // Max items in LRU
  keyPrefix: string;      // Redis key prefix
  redisUrl?: string;      // Redis connection
  enabled?: boolean;      // Enable/disable cache
}
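
For example, a cache that can be toggled per environment might use the optional fields like this (FeatureFlags is a placeholder type for illustration):

typescript
interface FeatureFlags { [flag: string]: boolean }   // placeholder type

const flagCache = new HybridCache<FeatureFlags>({
  cacheType: 'featureFlags',
  ttl: 120,                                          // 2 minutes
  maxSize: 2000,
  keyPrefix: 'flags',
  redisUrl: process.env.REDIS_URL,                   // point at a specific Redis instance
  enabled: process.env.CACHE_ENABLED === 'true'      // disable caching entirely (e.g. in tests)
});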

Advanced Configuration

Custom Cache Instance

typescript
const customCache = new HybridCache<MyData>({
  cacheType: 'custom',
  ttl: 300,
  maxSize: 5000,
  keyPrefix: 'custom',

  // Advanced Redis options
  redisOptions: {
    host: 'redis.example.com',
    port: 6380,
    password: 'secret',
    db: 2,
    tls: {
      rejectUnauthorized: false
    }
  },

  // Advanced LRU options
  lruOptions: {
    max: 5000,
    ttl: 300000,  // 5 minutes in ms
    updateAgeOnGet: true,
    updateAgeOnHas: false
  }
});

Cache Invalidation

Pattern 1: Invalidate on Update

typescript
async function updateUser(userId: string, data: Partial<User>) {
  // Update database
  await db.updateTable('users')
    .set(data)
    .where('id', '=', userId)
    .execute();

  // Invalidate cache
  await userCache.delete(userId);
}

Pattern 2: Refresh Cache After Update

typescript
async function updateUserAndRefresh(userId: string, data: Partial<User>) {
  // Update database
  const updated = await db.updateTable('users')
    .set(data)
    .where('id', '=', userId)
    .returningAll()
    .executeTakeFirst();

  // Refresh cache with new data
  if (updated) {
    await userCache.set(userId, updated, 600);
  }
}

Pattern 3: Invalidate Related Caches on Delete

typescript
async function deleteUser(userId: string) {
  // Delete from database
  await db.deleteFrom('users')
    .where('id', '=', userId)
    .execute();

  // Clear all related caches
  await userCache.delete(userId);
  await profileCache.delete(userId);
  await sessionCache.clear(); // Clear all sessions
}

Best Practices

✅ Do

  • Use short TTLs for frequently changing data (1-5 minutes)
  • Use longer TTLs for static data (10-60 minutes)
  • Invalidate cache after updates (create, update, delete)
  • Monitor hit rates to optimize cache size
  • Use key prefixes to organize cache namespaces

โŒ Don't โ€‹

  • Cache sensitive data without encryption
  • Set TTL too high for dynamic data
  • Forget to invalidate after updates
  • Cache everything - only cache hot data
  • Ignore cache metrics - monitor and optimize

Monitoring

typescript
// Get cache stats
const stats = cache.getStats();

console.log('Hits:', stats.hits);
console.log('Misses:', stats.misses);
console.log('Hit Rate:', stats.hitRate);
console.log('Size:', stats.size);
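
For an ongoing picture of cache effectiveness, these stats can be logged on an interval; the interval and log format below are illustrative:

typescript
// Periodically report hit rate so undersized or misconfigured caches show up in the logs.
setInterval(() => {
  const stats = userCache.getStats();
  console.log(
    `[cache] hits=${stats.hits} misses=${stats.misses} hitRate=${stats.hitRate} size=${stats.size}`
  );
}, 5 * 60 * 1000); // every 5 minutes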

Troubleshooting

Cache Not Working

Problem: Cache always misses

Solutions:

bash
# Check Redis connection
redis-cli ping
# Should return: PONG

# Check environment variables
echo $CACHE_ENABLED  # Should be: true
echo $REDIS_URL      # Should be: redis://...

# Check logs
LOG_LEVEL=debug bun run dev

Redis Connection Failed

Problem: Error: Redis connection failed

Solutions:

env
# Option 1: Use local Redis
REDIS_URL=redis://localhost:6379

# Option 2: Use Redis Cloud
REDIS_URL=redis://username:password@host:port

# Option 3: Disable Redis (LRU only)
CACHE_ENABLED=true
REDIS_URL=  # Leave empty
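
If the URL looks correct but the connection still fails, a quick check from Node with ioredis confirms whether the process itself can reach Redis:

typescript
import Redis from 'ioredis';

// Throwaway connectivity check; prints "PONG" if Redis is reachable.
const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');

redis.ping()
  .then((pong) => console.log('Redis says:', pong))               // expected: PONG
  .catch((err) => console.error('Redis unreachable:', err.message))
  .finally(() => redis.disconnect());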

High Memory Usage

Problem: Flowfull is using too much memory

Solutions:

typescript
// Reduce the LRU cache size
const smallCache = new HybridCache({
  cacheType: 'userContext',
  keyPrefix: 'user',
  maxSize: 1000,  // Reduce from 10000
  ttl: 60         // Shorter TTL
});

// Or disable LRU and use Redis only
const redisOnlyCache = new HybridCache({
  cacheType: 'userContext',
  keyPrefix: 'user',
  maxSize: 0,     // Disable LRU
  ttl: 300
});

Summary

HybridCache gives you:

✅ 97% cache hit rate - Most requests served from memory
✅ 50x faster responses - 1-2ms instead of 50ms
✅ Horizontal scaling - Redis shared across instances
✅ Automatic fallback - LRU → Redis → Database
✅ Production ready - Used in many production systems

Next Steps

Need Help?

  • 🌐 Notside.com - Professional caching optimization
  • 📧 Email: contact@notside.com
  • 💼 Services: Performance optimization, architecture consulting
