Cache API Reference

Complete API reference for the HybridCache system.

HybridCache Class

The HybridCache class provides a 3-tier caching system: Redis → LRU → Database.

Constructor

typescript
class HybridCache<T = any> {
  constructor(config: CacheConfig)
}

Parameters:

typescript
interface CacheConfig {
  cacheType: string;        // Cache identifier (e.g., 'userContext')
  ttl: number;              // Time-to-live in seconds
  maxSize: number;          // Max items in LRU cache
  keyPrefix: string;        // Redis key prefix
  redisOptions?: {          // Optional Redis config
    host?: string;
    port?: number;
    password?: string;
    db?: number;
    tls?: object;
  };
  lruOptions?: {            // Optional LRU config
    max?: number;
    ttl?: number;
    updateAgeOnGet?: boolean;
    updateAgeOnHas?: boolean;
  };
}

Example:

typescript
import { HybridCache } from './lib/cache/hybrid-cache';

const userCache = new HybridCache<UserData>({
  cacheType: 'userContext',
  ttl: 300,           // 5 minutes
  maxSize: 10000,     // 10k items in LRU
  keyPrefix: 'user'
});
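
The constructor example above relies on default Redis and LRU behaviour. If you need to tune either tier, pass redisOptions and lruOptions. The values below are illustrative only (SessionData is a placeholder type), and if lruOptions is forwarded to the lru-cache package, its ttl is expected in milliseconds rather than seconds.

typescript
import { HybridCache } from './lib/cache/hybrid-cache';

// Illustrative values only - adjust to your environment
const sessionCache = new HybridCache<SessionData>({
  cacheType: 'session',
  ttl: 300,                  // Redis TTL in seconds
  maxSize: 5000,
  keyPrefix: 'session',
  redisOptions: {
    host: 'localhost',       // assumed local Redis instance
    port: 6379,
    db: 0
  },
  lruOptions: {
    max: 5000,
    ttl: 300 * 1000,         // milliseconds, if backed by the lru-cache package
    updateAgeOnGet: true     // refresh entry age on reads
  }
});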

Methods

get()

Get a value from the cache (Redis → LRU → null). On a miss it returns null, and the caller is expected to fall back to the database.

typescript
async get(key: string): Promise<T | null>

Parameters:

| Parameter | Type   | Required | Description |
| --------- | ------ | -------- | ----------- |
| key       | string | ✅ Yes   | Cache key   |

Returns: Promise<T | null>

Example:

typescript
let user = await userCache.get('user_123');

if (user) {
  console.log('✅ Cache hit!', user);
} else {
  console.log('❌ Cache miss - query database');
  user = await db.selectFrom('users')
    .where('id', '=', 'user_123')
    .selectAll()
    .executeTakeFirst() ?? null;

  // Store in cache for subsequent reads (skip empty results)
  if (user) {
    await userCache.set('user_123', user);
  }
}

set()

Set value in cache (both Redis and LRU).

typescript
async set(key: string, value: T, customTtl?: number): Promise<void>

Parameters:

| Parameter | Type   | Required | Description          |
| --------- | ------ | -------- | -------------------- |
| key       | string | ✅ Yes   | Cache key            |
| value     | T      | ✅ Yes   | Value to cache       |
| customTtl | number | ❌ No    | Custom TTL (seconds) |

Returns: Promise<void>

Example:

typescript
// Use default TTL (from config)
await userCache.set('user_123', userData);

// Use custom TTL (10 minutes)
await userCache.set('user_123', userData, 600);

delete()

Delete value from cache (both Redis and LRU).

typescript
async delete(key: string): Promise<void>

Parameters:

| Parameter | Type   | Required | Description         |
| --------- | ------ | -------- | ------------------- |
| key       | string | ✅ Yes   | Cache key to delete |

Returns: Promise<void>

Example:

typescript
// After user update
await db.updateTable('users')
  .set({ name: 'New Name' })
  .where('id', '=', 'user_123')
  .execute();

// Invalidate cache
await userCache.delete('user_123');

deletePattern()

Delete multiple keys matching a pattern.

typescript
async deletePattern(pattern: string): Promise<void>

Parameters:

| Parameter | Type   | Required | Description                 |
| --------- | ------ | -------- | --------------------------- |
| pattern   | string | ✅ Yes   | Regex pattern to match keys |

Returns: Promise<void>

Example:

typescript
// Delete all user caches
await userCache.deletePattern('user_.*');

// Delete specific organization caches
await userCache.deletePattern('org_123_.*');

clear()

Clear the entire cache (both Redis and LRU).

typescript
async clear(): Promise<void>

Returns: Promise<void>

Example:

typescript
// Clear all user caches
await userCache.clear();

getMetrics()

Get cache performance metrics.

typescript
getMetrics(): CacheMetrics

Returns: CacheMetrics

typescript
interface CacheMetrics {
  totalRequests: number;
  cacheHits: number;
  cacheMisses: number;
  redisHits: number;
  lruHits: number;
  dbHits: number;
  errors: number;
  hitRate: number;          // Calculated: hits / total
  lastError?: string;
  lastErrorAt?: Date;
}

Example:

typescript
const metrics = userCache.getMetrics();

console.log('Cache Performance:');
console.log(`  Total Requests: ${metrics.totalRequests}`);
console.log(`  Hit Rate: ${(metrics.hitRate * 100).toFixed(2)}%`);
console.log(`  Redis Hits: ${metrics.redisHits}`);
console.log(`  LRU Hits: ${metrics.lruHits}`);
console.log(`  DB Hits: ${metrics.dbHits}`);
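
Since getMetrics() returns a point-in-time snapshot, one simple way to watch cache health is to poll it periodically and warn when the hit rate drops. The 60-second interval and 80% threshold below are illustrative, not part of the API:

typescript
// Illustrative monitoring loop - interval and thresholds are arbitrary
setInterval(() => {
  const { totalRequests, hitRate, errors, lastError } = userCache.getMetrics();

  if (totalRequests > 0 && hitRate < 0.8) {
    console.warn(`⚠️ Low cache hit rate: ${(hitRate * 100).toFixed(2)}%`);
  }
  if (errors > 0) {
    console.warn(`⚠️ Cache errors: ${errors} (last: ${lastError ?? 'n/a'})`);
  }
}, 60_000);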

Cache Tiers

Tier 1: Redis (Distributed)

  • Speed: 2-5ms
  • Scope: Shared across all instances
  • Persistence: Survives restarts
  • Use: Production with multiple instances

Tier 2: LRU (In-Memory)

  • Speed: <1ms
  • Scope: Local to instance
  • Persistence: Lost on restart
  • Use: Hot data, frequently accessed

Tier 3: Database

  • Speed: 10-50ms
  • Scope: Source of truth
  • Persistence: Permanent
  • Use: Fallback when cache misses
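
Together, the tiers form a cache-aside read path: HybridCache checks Redis and the LRU internally, and the database is only queried on a miss. A minimal sketch of that pattern, reusing the userCache and db objects from the earlier examples (the getUser helper itself is hypothetical):

typescript
// Hypothetical read-through helper combining all three tiers
async function getUser(id: string): Promise<UserData | null> {
  // Tiers 1 & 2: Redis, then in-memory LRU (handled inside HybridCache)
  const cached = await userCache.get(id);
  if (cached) return cached;

  // Tier 3: database as the source of truth
  const user = await db.selectFrom('users')
    .where('id', '=', id)
    .selectAll()
    .executeTakeFirst();

  // Re-populate the cache so subsequent reads stay fast
  if (user) await userCache.set(id, user);
  return user ?? null;
}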

Configuration

Environment Variables

env
# Enable caching
CACHE_ENABLED=true

# Redis connection
REDIS_URL=redis://localhost:6379

# Optional: Redis password
REDIS_PASSWORD=your-password

# Optional: Redis DB
REDIS_DB=0
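
How these variables reach the cache depends on your application bootstrap; if HybridCache does not read them itself, one option is to map them into redisOptions when constructing a cache. A sketch, assuming REDIS_URL uses the standard redis:// form:

typescript
// Illustrative wiring of environment variables into CacheConfig
const redisUrl = new URL(process.env.REDIS_URL ?? 'redis://localhost:6379');

const userCache = new HybridCache<UserData>({
  cacheType: 'userContext',
  ttl: 300,
  maxSize: 10000,
  keyPrefix: 'user',
  redisOptions: {
    host: redisUrl.hostname,
    port: Number(redisUrl.port || 6379),
    password: process.env.REDIS_PASSWORD,
    db: Number(process.env.REDIS_DB ?? 0)
  }
});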

Multiple Cache Instances

typescript
// User cache (long TTL)
const userCache = new HybridCache<User>({
  cacheType: 'user',
  ttl: 600,        // 10 minutes
  maxSize: 10000,
  keyPrefix: 'user'
});

// Session cache (short TTL)
const sessionCache = new HybridCache<Session>({
  cacheType: 'session',
  ttl: 300,        // 5 minutes
  maxSize: 5000,
  keyPrefix: 'session'
});

// API response cache (very short TTL)
const apiCache = new HybridCache<ApiResponse>({
  cacheType: 'api',
  ttl: 60,         // 1 minute
  maxSize: 1000,
  keyPrefix: 'api'
});

Best Practices

  1. Use appropriate TTLs - Balance freshness and performance
  2. Invalidate on updates - Keep cache consistent
  3. Monitor metrics - Track hit rates and errors
  4. Use Redis in production - For multi-instance deployments
  5. Set reasonable max sizes - Prevent memory issues
  6. Handle cache failures gracefully - Always have database fallback
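
For point 6, a common approach is to treat the cache as optional: HybridCache already tracks an errors metric, but wrapping cache calls in try/catch keeps request handling independent of Redis availability either way. A minimal sketch (getUserFromDb is a hypothetical database helper):

typescript
// Illustrative fallback wrapper - getUserFromDb is hypothetical
async function getUserSafely(id: string): Promise<UserData | null> {
  try {
    const cached = await userCache.get(id);
    if (cached) return cached;
  } catch (err) {
    console.error('Cache read failed, falling back to database', err);
  }

  const user = await getUserFromDb(id);

  try {
    if (user) await userCache.set(id, user);
  } catch (err) {
    console.error('Cache write failed (non-fatal)', err);
  }

  return user;
}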
