ADR-0028: Webhook Idempotency with Payload Hashing
Status
Accepted - 2025-01-26
Context
Webhook partners (Hostaway, Airbnb, VRBO) may send duplicate events due to retries, requiring idempotent processing.
Decision
Use SHA-256 payload hashing for deduplication, with a Redis-backed idempotency cache.
Rationale
- Deterministic: Same payload = same hash (illustrated in the sketch after this list)
- Fast: O(1) lookup in Redis
- TTL-Based: Auto-expiry after 24 hours
- Memory Efficient: 32-byte hash vs full payload storage
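For instance, a minimal sketch using the hashPayload helper defined under Implementation below (event field names are illustrative): two deliveries of the same event with keys in a different order produce the identical 32-byte digest.
// Illustrative only - hashPayload is defined in src/webhooks/idempotency.ts below
import { hashPayload } from './webhooks/idempotency';
const a = hashPayload({ eventId: 'evt_1', eventType: 'reservation.updated', resourceId: 'res_42' });
const b = hashPayload({ resourceId: 'res_42', eventId: 'evt_1', eventType: 'reservation.updated' });
console.log(a === b);  // true - key order does not change the hash
console.log(a.length); // 64 hex characters encoding the 32-byte SHA-256 digest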
Alternatives Considered
Alternative 1: Request ID Only
Rejected - Partners may reuse IDs, doesn't detect payload changes
Alternative 2: Database Deduplication
Rejected - Slower (disk I/O), requires cleanup job
Alternative 3: In-Memory Cache (Node.js)
Rejected - Lost on restart, doesn't work across replicas
Implementation
// src/webhooks/idempotency.ts
import crypto from 'crypto';
import Redis from 'ioredis';
const redis = new Redis(process.env.REDIS_URL);
// Recursively sort object keys so semantically equal payloads serialize identically.
// (JSON.stringify's array replacer only sorts/whitelists top-level keys and would drop nested fields.)
function canonicalize(value: unknown): unknown {
  if (Array.isArray(value)) return value.map(canonicalize);
  if (value !== null && typeof value === 'object') {
    const obj = value as Record<string, unknown>;
    return Object.keys(obj).sort().reduce<Record<string, unknown>>((acc, key) => {
      acc[key] = canonicalize(obj[key]);
      return acc;
    }, {});
  }
  return value;
}
export function hashPayload(payload: object): string {
  const canonical = JSON.stringify(canonicalize(payload));
  return crypto.createHash('sha256').update(canonical).digest('hex');
}
export async function isProcessed(hash: string): Promise<boolean> {
  const exists = await redis.exists(`webhook:${hash}`);
  return exists === 1;
}
export async function markProcessed(hash: string): Promise<void> {
  await redis.setex(`webhook:${hash}`, 86400, '1'); // 24-hour TTL
}
// src/routes/webhooks/hostaway.ts
import { hashPayload, isProcessed, markProcessed } from '../../webhooks/idempotency';
app.post('/webhooks/hostaway', async (req, reply) => {
  const payloadHash = hashPayload(req.body);
  // Check if already processed
  if (await isProcessed(payloadHash)) {
    logger.info({ payloadHash }, 'Duplicate webhook ignored');
    return reply.status(200).send({ received: true });
  }
  // Process webhook
  await processHostawayWebhook(req.body);
  // Mark as processed
  await markProcessed(payloadHash);
  return reply.status(200).send({ received: true });
});
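A quick end-to-end sketch of the duplicate path (illustrative; assumes the module above and a reachable Redis instance, and the payload fields are placeholders):
import { hashPayload, isProcessed, markProcessed } from './webhooks/idempotency';
async function demo(): Promise<void> {
  const payload = { eventId: 'evt_1', resourceId: 'res_42', eventType: 'reservation.updated' };
  const hash = hashPayload(payload);
  await markProcessed(hash);            // first delivery handled and recorded
  console.log(await isProcessed(hash)); // true - a retried, identical payload would be skipped
}
demo().catch(console.error);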
Deduplication Strategy
1. Canonical JSON Serialization
// Recursively sort keys (via canonicalize) so the hash is deterministic regardless of key order or nesting
const canonical = JSON.stringify(canonicalize(payload));
2. Redis Key Pattern
webhook:{sha256_hash} → "1" (TTL: 24h)
3. TTL Configuration
- 24 hours: Prevents unbounded key growth in Redis
- Partner retry window ≤ 24 hours: Covers all retry attempts (see the configurable-TTL sketch below)
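If a partner's retry window ever changes, the TTL can be lifted into configuration. A sketch, where redis is the ioredis client from the Implementation section and WEBHOOK_DEDUPE_TTL_SECONDS is a hypothetical environment variable:
// Sketch: hypothetical env override, defaulting to this ADR's 24-hour TTL
const DEDUPE_TTL_SECONDS = Number(process.env.WEBHOOK_DEDUPE_TTL_SECONDS ?? 86400);
export async function markProcessed(hash: string): Promise<void> {
  await redis.setex(`webhook:${hash}`, DEDUPE_TTL_SECONDS, '1');
}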
Edge Cases
Case 1: Same Event, Different Timestamps
// Include event ID + resource ID, exclude timestamp
const dedupeKey = {
  eventId: webhook.eventId,
  resourceId: webhook.resourceId,
  eventType: webhook.eventType,
  // Exclude: timestamp, retryCount
};
const hash = hashPayload(dedupeKey);
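To keep the stable-field extraction in one place, a small typed helper could wrap hashPayload. This is a sketch; the field names match the example above but should be verified against each partner's payload schema:
interface WebhookEvent {
  eventId: string;
  resourceId: string;
  eventType: string;
  timestamp?: string;   // volatile - deliberately excluded from the hash
  retryCount?: number;  // volatile - deliberately excluded from the hash
}
export function dedupeHash(webhook: WebhookEvent): string {
  return hashPayload({
    eventId: webhook.eventId,
    resourceId: webhook.resourceId,
    eventType: webhook.eventType,
  });
}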
Case 2: Redis Failure
try {
  if (await isProcessed(hash)) {
    return reply.status(200).send({ received: true });
  }
} catch (error) {
  logger.error({ error }, 'Redis idempotency check failed');
  // Fail open: process webhook anyway (trade-off: potential duplicate)
}
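Putting the fail-open behaviour into the full Hostaway route could look like the sketch below; both the check and the mark are treated as best-effort so a Redis outage never blocks webhook processing (app, logger, and processHostawayWebhook are assumed from the Implementation section):
app.post('/webhooks/hostaway', async (req, reply) => {
  const body = req.body as Record<string, unknown>;
  const payloadHash = hashPayload(body);
  // Best-effort duplicate check: a Redis outage must not block processing
  try {
    if (await isProcessed(payloadHash)) {
      logger.info({ payloadHash }, 'Duplicate webhook ignored');
      return reply.status(200).send({ received: true });
    }
  } catch (error) {
    logger.error({ error }, 'Redis idempotency check failed, failing open');
  }
  await processHostawayWebhook(body);
  // Best-effort mark: failure here only risks reprocessing on a later retry
  try {
    await markProcessed(payloadHash);
  } catch (error) {
    logger.error({ error }, 'Failed to mark webhook as processed');
  }
  return reply.status(200).send({ received: true });
});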
Consequences
Positive
- ✅ Exact Deduplication: Payload-level verification
- ✅ Fast Lookups: O(1) Redis performance
- ✅ Auto Cleanup: TTL-based expiry
Negative
- ❌ Redis Dependency: Single point of failure
- ❌ Volatile Fields: Retried payloads that embed changing timestamps or retry counters hash differently and bypass deduplication
Mitigations
- Use Redis with replication (Upstash)
- Extract deterministic fields (eventId, resourceId) for hashing
- Monitor duplicate processing rate (see the metrics sketch below)
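For the monitoring mitigation, one option is a counter incremented on the duplicate branch of the route. A sketch assuming prom-client; any metrics client would work, and the metric name is illustrative:
import { Counter } from 'prom-client';
const duplicateWebhooks = new Counter({
  name: 'webhook_duplicates_total',
  help: 'Webhook deliveries skipped as duplicates',
  labelNames: ['partner'],
});
// In the route, when isProcessed() returns true:
duplicateWebhooks.inc({ partner: 'hostaway' });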
Validation Checklist
- SHA-256 payload hashing implemented
- Redis idempotency cache with 24h TTL
- Canonical JSON serialization (sorted keys)
- Redis failure handling (fail open)
- Duplicate webhook logging