# ADR-0040: LRU Cache Eviction Policy

## Status

Accepted - 2025-01-26
## Context

The Redis cache has a hard memory limit (256MB on the Upstash free tier), so an eviction policy is required once memory is full.
## Decision

Use LRU (Least Recently Used) eviction with the `volatile-lru` policy, which only considers keys that have a TTL.
## Rationale

- Keeps hot data: the least recently accessed keys are evicted first
- Respects TTL: only keys with an expiry are eligible for eviction
- Predictable: popular keys stay cached
- Industry standard: the same approach used by CDNs and browser caches
## Alternatives Considered

### Alternative 1: LFU (Least Frequently Used)

Rejected - more complex to tune, with similar hit rates to LRU for this workload.

### Alternative 2: FIFO (First In, First Out)

Rejected - ignores access patterns entirely.

### Alternative 3: Random Eviction

Rejected - may evict hot keys.
## Configuration

### Redis maxmemory Policy

```conf
# redis.conf (Upstash default)
maxmemory 256mb
maxmemory-policy volatile-lru  # Only evict keys with a TTL
```
Policy options:

| Policy | Behavior |
|---|---|
| `volatile-lru` | Evict least recently used keys that have a TTL (recommended) |
| `allkeys-lru` | Evict any key in LRU order (can drop keys that were never meant to expire) |
| `volatile-ttl` | Evict keys with the shortest remaining TTL |
| `noeviction` | Return errors on writes when memory is full |
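
To catch configuration drift, the application can verify the policy at startup. A minimal sketch, assuming an ioredis client; the module path and function name are illustrative, and managed providers such as Upstash may restrict `CONFIG`, in which case the check degrades to a warning:

```typescript
// src/cache/verifyEvictionPolicy.ts (illustrative)
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');

// Warn at startup if the instance is not running volatile-lru.
export async function verifyEvictionPolicy(): Promise<void> {
  try {
    // CONFIG GET returns a flat [name, value] array.
    const [, policy] = (await redis.config('GET', 'maxmemory-policy')) as string[];
    if (policy !== 'volatile-lru') {
      console.warn(`Unexpected maxmemory-policy "${policy}" (expected volatile-lru)`);
    }
  } catch {
    // Managed instances often disable CONFIG; treat as non-fatal.
    console.warn('CONFIG GET unavailable; skipping eviction-policy check');
  }
}
```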
## Memory Management

### Set Memory Limits

The limit itself comes from the Upstash plan (see the config above); confirm it and the current usage from the CLI:

```bash
# Check memory usage and the configured limit
redis-cli INFO memory
# Example output (raw values are in bytes):
# used_memory:134217728
# used_memory_human:128.00M
# used_memory_peak:209715200
# maxmemory:268435456
# maxmemory_human:256.00M
```
### Monitor Memory Usage

```typescript
// src/monitoring/redisMemory.ts
// Assumes a shared ioredis client (`redis`) and a pino-style `logger` are in scope.
export async function checkMemoryUsage() {
  const info = await redis.info('memory');
  const usedMemory = parseInt(info.match(/used_memory:(\d+)/)?.[1] || '0', 10);
  const maxMemory = parseInt(info.match(/maxmemory:(\d+)/)?.[1] || '0', 10);
  // maxmemory:0 means "no limit"; avoid dividing by zero in that case.
  const usagePercent = maxMemory > 0 ? (usedMemory / maxMemory) * 100 : 0;
  if (usagePercent > 80) {
    logger.warn({ usedMemory, maxMemory, usagePercent }, 'Redis memory high');
  }
  return { usedMemory, maxMemory, usagePercent };
}

// Run every 5 minutes
setInterval(checkMemoryUsage, 300000);
```
## TTL Strategy

### Set Appropriate TTLs

All cached keys MUST have a TTL; `volatile-lru` never evicts keys without one:

```typescript
// Good: with TTL (eligible for volatile-lru eviction)
await redis.setex('org:abc:properties', 3600, JSON.stringify(properties));

// Bad: no TTL (never evicted under volatile-lru!)
await redis.set('org:abc:properties', JSON.stringify(properties));
```
TTL guidelines:

| Data Type | TTL | Eviction Priority |
|---|---|---|
| Properties | 1 hour | Low (rarely changes) |
| Availability | 5 minutes | High (frequent updates) |
| Pricing | 15 minutes | Medium |
| Sessions | 24 hours | Low (long-lived) |
| Rate limits | 1 minute | High (short window) |
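
One way to keep these TTLs consistent across the codebase is a shared constants map plus a single write helper that always sets an expiry. A sketch assuming an ioredis client; the module path, `TTL` map, and `cacheSet` helper are illustrative names, not existing code:

```typescript
// src/cache/ttl.ts (illustrative)
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');

// TTLs in seconds, mirroring the guidelines table above.
export const TTL = {
  properties: 3600,    // 1 hour
  availability: 300,   // 5 minutes
  pricing: 900,        // 15 minutes
  sessions: 86400,     // 24 hours
  rateLimit: 60,       // 1 minute
} as const;

// Single write path that always sets a TTL, so every key stays
// eligible for volatile-lru eviction.
export async function cacheSet(key: string, value: unknown, ttlSeconds: number): Promise<void> {
  await redis.setex(key, ttlSeconds, JSON.stringify(value));
}

// Usage:
// await cacheSet('org:abc:properties', properties, TTL.properties);
```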
## Eviction Monitoring

### Track Evictions

```typescript
// Assumes a shared ioredis client (`redis`) and a pino-style `logger` are in scope.
export async function getEvictionStats() {
  const info = await redis.info('stats');
  // evicted_keys is cumulative since the server started.
  const evictedKeys = parseInt(info.match(/evicted_keys:(\d+)/)?.[1] || '0', 10);
  return { evictedKeys };
}

// Alert if evictions spike: compare against the previous sample,
// since the counter itself only ever grows.
let lastEvictedKeys = 0;
setInterval(async () => {
  const { evictedKeys } = await getEvictionStats();
  const evictedSinceLastCheck = evictedKeys - lastEvictedKeys;
  lastEvictedKeys = evictedKeys;
  if (evictedSinceLastCheck > 1000) {
    logger.error(
      { evictedSinceLastCheck },
      'High eviction rate - consider upgrading memory'
    );
  }
}, 60000); // Every minute
```
## Memory Optimization

### Compress Large Values

```typescript
// src/cache/compression.ts
// Assumes a shared ioredis client (`redis`) is in scope.
import zlib from 'zlib';
import { promisify } from 'util';

const gzip = promisify(zlib.gzip);
const gunzip = promisify(zlib.gunzip);

export async function cacheCompressed(key: string, value: any, ttl: number) {
  const json = JSON.stringify(value);
  const compressed = await gzip(json);
  // Store as base64 so the value survives string-only transports.
  await redis.setex(`${key}:gz`, ttl, compressed.toString('base64'));
}

export async function getCompressed<T>(key: string): Promise<T | null> {
  const compressed = await redis.get(`${key}:gz`);
  if (!compressed) return null;
  const buffer = Buffer.from(compressed, 'base64');
  const decompressed = await gunzip(buffer);
  return JSON.parse(decompressed.toString());
}

// Usage (for large objects >1KB)
await cacheCompressed('org:abc:properties', properties, 3600);
```
### Use Hash Data Structures

Instead of storing whole objects as JSON strings, use Redis hashes; small hashes are stored in Redis's compact listpack encoding, which reduces per-field memory overhead:

```typescript
// Less memory-efficient: store the entire object as a JSON string
await redis.setex(
  'org:abc:property:123',
  3600,
  JSON.stringify({ id: '123', name: 'Villa', price: 100 })
);

// More memory-efficient: use a hash (field values are stored as strings)
await redis.hset('org:abc:property:123', {
  id: '123',
  name: 'Villa',
  price: '100',
});
await redis.expire('org:abc:property:123', 3600); // hset does not set a TTL
```
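
Reading the hash back requires converting field values, since HGETALL returns every field as a string. A minimal sketch assuming the same key layout; `getProperty` is an illustrative helper, not existing code:

```typescript
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');

// HGETALL returns {} for a missing key and string values for every field.
export async function getProperty(
  key: string
): Promise<{ id: string; name: string; price: number } | null> {
  const raw = await redis.hgetall(key);
  if (Object.keys(raw).length === 0) return null;
  return { id: raw.id, name: raw.name, price: Number(raw.price) };
}

// Usage:
// const property = await getProperty('org:abc:property:123');
```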
## Scaling Strategy

### Tier Upgrade Path (Upstash)

| Tier | Memory | Price | When to Upgrade |
|---|---|---|---|
| Free | 256MB | $0 | MVP.0 (100 properties) |
| Pay-as-you-go | Unlimited | $0.20/100K commands | MVP.1 (1,000 properties) |
| Pro | 1GB+ | $60+/month | V1.0 (10,000 properties) |

Upgrade triggers:

- Eviction rate >1,000 keys/hour
- Memory usage >80% for 24 hours
- Cache hit rate drops below 70% (see the sketch after this list)
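
For the hit-rate trigger, Redis exposes cumulative `keyspace_hits` and `keyspace_misses` counters in `INFO stats`. A sketch of computing the rate, assuming an ioredis client; the module path and function name are illustrative. Note the counters are lifetime totals, so sampling deltas over time gives a more recent picture:

```typescript
// src/monitoring/cacheHitRate.ts (illustrative)
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');

// Lifetime hit rate since the server started, as a percentage.
export async function getCacheHitRate(): Promise<number> {
  const info = await redis.info('stats');
  const hits = parseInt(info.match(/keyspace_hits:(\d+)/)?.[1] ?? '0', 10);
  const misses = parseInt(info.match(/keyspace_misses:(\d+)/)?.[1] ?? '0', 10);
  const total = hits + misses;
  return total === 0 ? 100 : (hits / total) * 100;
}

// Usage: check against the upgrade trigger.
// if ((await getCacheHitRate()) < 70) {
//   // consider upgrading the Upstash tier
// }
```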
### Cache Warming Strategy

Prevent eviction of critical keys by periodically refreshing their TTL:

```typescript
// src/cache/criticalCaches.ts
// Assumes a shared ioredis client (`redis`) is in scope.
const CRITICAL_KEYS = [
  'system:config',
  'system:feature-flags',
];

export async function refreshCriticalCaches() {
  for (const key of CRITICAL_KEYS) {
    // Touch the key to reset its TTL and mark it as recently used.
    await redis.expire(key, 86400); // 24h TTL
  }
}

// Refresh every 6 hours
setInterval(refreshCriticalCaches, 21600000);
```
## Testing Eviction Behavior

```typescript
// tests/cache/eviction.test.ts
// Assumes a disposable test Redis instance and a shared ioredis client (`redis`).
describe('Cache Eviction', () => {
  it('should evict LRU keys when memory is full', async () => {
    // Set a small memory limit (test instance only!)
    await redis.config('SET', 'maxmemory', '10mb');
    await redis.config('SET', 'maxmemory-policy', 'volatile-lru');

    // Fill the cache with 50 keys (~5MB) - comfortably under the limit,
    // so nothing is evicted during the initial fill.
    for (let i = 0; i < 50; i++) {
      await redis.setex(`test:key:${i}`, 3600, 'x'.repeat(100000)); // ~100KB each
    }

    // Access the first 10 keys to mark them as recently used.
    for (let i = 0; i < 10; i++) {
      await redis.get(`test:key:${i}`);
    }

    // Add enough new keys (~8MB) to exceed the limit and trigger eviction.
    for (let i = 50; i < 130; i++) {
      await redis.setex(`test:key:${i}`, 3600, 'x'.repeat(100000));
    }

    // The recently used keys should survive. Note: Redis LRU is approximate
    // (it samples candidates), so this assertion is probabilistic.
    for (let i = 0; i < 10; i++) {
      const exists = await redis.exists(`test:key:${i}`);
      expect(exists).toBe(1);
    }

    // Some of the untouched keys should have been evicted.
    let evictedCount = 0;
    for (let i = 10; i < 50; i++) {
      const exists = await redis.exists(`test:key:${i}`);
      if (exists === 0) evictedCount++;
    }
    expect(evictedCount).toBeGreaterThan(0);
  });
});
```
## Consequences

### Positive

- ✅ Automatic: no manual cleanup needed
- ✅ Smart: keeps recently accessed data cached
- ✅ Safe: only evicts keys with a TTL
- ✅ Predictable: industry-standard behavior

### Negative

- ❌ Memory limits: the free tier is only 256MB
- ❌ Eviction overhead: evicting keys adds latency when memory is full

### Mitigations

- Monitor memory usage and upgrade the tier proactively
- Compress large values (>1KB)
- Use short TTLs for transient data
## Validation Checklist

- `volatile-lru` eviction policy configured
- All cached keys have a TTL
- Memory usage monitoring enabled
- Eviction rate alerts configured
- Compression enabled for large objects
- Tier upgrade plan documented