TVL Platform - PII Handling & Data Security

Summary

This document defines PII classification, encryption strategies, data retention policies, and GDPR compliance measures for the TVL platform's PostgreSQL database.

PII Classification

PII Categories

The TVL platform handles the following categories of Personally Identifiable Information (PII):

High Sensitivity PII (Red)

  • Payment Information: Credit card numbers, CVV, bank account numbers (handled by Stripe, NOT stored in TVL database)
  • Government IDs: Passport numbers, driver's license numbers (not collected in MVP)
  • Authentication Credentials: API keys, webhook secrets, OAuth tokens

Storage: Payment data is never stored in the TVL database (delegated to Stripe); connector API keys and webhook secrets are stored only in encrypted form (see connector_configs in the field inventory below)

Medium Sensitivity PII (Orange)

  • User Identity: Full name, email address, phone number
  • Guest Information: Booking guest names, emails, phones
  • Location Data: Property addresses, GPS coordinates
  • User Activity: Audit logs with IP addresses, user agents

Storage: Encrypted at rest (application-level encryption recommended)

Low Sensitivity PII (Yellow)

  • Organization Data: Organization name, subdomain
  • Property Metadata: Property descriptions, amenity lists
  • Booking Metadata: Booking numbers, check-in/out dates (without guest info)

Storage: Standard database security (RLS, TLS in transit)


PII Field Inventory

| Table | Column | Sensitivity | Encryption Required | Notes |
|---|---|---|---|---|
| users | email | Medium | Recommended | Used for authentication |
| users | full_name | Medium | Recommended | Display name |
| users | avatar_url | Low | No | Public URL |
| users | auth_provider_id | Medium | Recommended | External auth system ID |
| bookings | guest_name | Medium | Required | Primary guest identity |
| bookings | guest_email | Medium | Required | Guest contact |
| bookings | guest_phone | Medium | Required | Guest contact |
| booking_guests | guest_name | Medium | Required | Additional guest identity |
| booking_guests | guest_email | Medium | Required | Additional guest contact |
| booking_guests | guest_phone | Medium | Required | Additional guest contact |
| properties | address_line1 | Medium | Recommended | Physical location |
| properties | address_line2 | Medium | Recommended | Apartment/unit number |
| properties | latitude | Medium | No | GPS coordinates (public) |
| properties | longitude | Medium | No | GPS coordinates (public) |
| connector_configs | api_key_encrypted | High | Required | External API credentials |
| connector_configs | api_secret_encrypted | High | Required | External API credentials |
| connector_configs | webhook_secret | High | Required | Webhook verification |
| audit_logs | ip_address | Medium | Recommended | User activity tracking |
| audit_logs | user_agent | Low | No | Browser/client info |

Encryption Strategy

Encryption at Rest

Database-Level Encryption (Supabase Default)

  • Provider: Supabase (AWS RDS encryption)
  • Method: AES-256 block-level encryption
  • Key Management: AWS KMS (managed by Supabase)
  • Coverage: All tables, indexes, backups
  • Status: Enabled by default (no configuration required)

Limitations: Database-level encryption protects against disk theft but does NOT protect against:

  • Database compromise (attacker with database access can read plaintext)
  • SQL injection attacks
  • Insider threats (DBAs can read data)

Application-Level Encryption (Sensitive PII Fields)

  • Fields: guest_name, guest_email, guest_phone, api_key_encrypted, api_secret_encrypted
  • Method: AES-256-GCM (authenticated encryption)
  • Key Storage: Environment variables (NOT in database)
  • Key Rotation: Annual rotation with re-encryption

Implementation Example:

import { createCipheriv, createDecipheriv, randomBytes } from 'crypto';

// Encryption configuration
const ENCRYPTION_KEY = process.env.ENCRYPTION_KEY ?? ''; // 32-byte key as a 64-character hex string
const ALGORITHM = 'aes-256-gcm';

if (ENCRYPTION_KEY.length !== 64) {
  throw new Error('ENCRYPTION_KEY must be set to a 32-byte hex string');
}

export function encrypt(plaintext: string): string {
  const iv = randomBytes(16);
  const cipher = createCipheriv(ALGORITHM, Buffer.from(ENCRYPTION_KEY, 'hex'), iv);

  let encrypted = cipher.update(plaintext, 'utf8', 'hex');
  encrypted += cipher.final('hex');

  const authTag = cipher.getAuthTag();

  // Format: iv:authTag:ciphertext (hex encoded)
  return `${iv.toString('hex')}:${authTag.toString('hex')}:${encrypted}`;
}

export function decrypt(ciphertext: string): string {
  const [ivHex, authTagHex, encrypted] = ciphertext.split(':');

  const decipher = createDecipheriv(
    ALGORITHM,
    Buffer.from(ENCRYPTION_KEY, 'hex'),
    Buffer.from(ivHex, 'hex')
  );

  decipher.setAuthTag(Buffer.from(authTagHex, 'hex'));

  let decrypted = decipher.update(encrypted, 'hex', 'utf8');
  decrypted += decipher.final('utf8');

  return decrypted;
}

Usage in Application:

// Before INSERT
const encryptedName = encrypt(booking.guest_name);
await db.bookings.insert({
  guest_name: encryptedName,
  guest_email: encrypt(booking.guest_email),
  // ...
});

// After SELECT
const booking = await db.bookings.findById(bookingId);
const decryptedName = decrypt(booking.guest_name);

Encryption in Transit

TLS/SSL Configuration

  • Supabase Connection: TLS 1.2+ enforced
  • API Endpoints: HTTPS only (HTTP redirects to HTTPS)
  • Certificate: Let's Encrypt (auto-renewed via Vercel/Railway)
  • Cipher Suites: Modern ciphers only (no TLS 1.0/1.1)

Connection String Example:

postgresql://user:password@db.supabase.co:5432/postgres?sslmode=require

API Security Headers:

Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Content-Type-Options: nosniff
X-Frame-Options: DENY
Content-Security-Policy: default-src 'self'
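
These headers can be set at the API layer. A minimal sketch, assuming an Express-style backend (helmet or platform-level header configuration are equally valid alternatives):

import express from 'express';

const app = express();

// Apply the security headers listed above to every response
app.use((req, res, next) => {
  res.setHeader('Strict-Transport-Security', 'max-age=31536000; includeSubDomains');
  res.setHeader('X-Content-Type-Options', 'nosniff');
  res.setHeader('X-Frame-Options', 'DENY');
  res.setHeader('Content-Security-Policy', "default-src 'self'");
  next();
});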

Key Management

Encryption Key Lifecycle

  1. Generation:

    # Generate 256-bit key (32 bytes)
    openssl rand -hex 32
  2. Storage:

    • Development: .env.local (excluded from git)
    • Staging: Railway environment variables
    • Production: Railway environment variables (team access only)
  3. Rotation (Annual):

    • Generate new key
    • Re-encrypt all PII fields with new key (background job)
    • Deprecate old key after 30-day overlap period

Key Rotation Script:

async function rotateEncryptionKey(oldKey: string, newKey: string) {
  const tables = [
    { table: 'bookings', columns: ['guest_name', 'guest_email', 'guest_phone'] },
    { table: 'booking_guests', columns: ['guest_name', 'guest_email', 'guest_phone'] },
    { table: 'connector_configs', columns: ['api_key_encrypted', 'api_secret_encrypted'] },
  ];

  for (const { table, columns } of tables) {
    const records = await db[table].findAll();

    for (const record of records) {
      const updates: any = {};
      for (const col of columns) {
        if (record[col]) {
          const decrypted = decryptWithKey(record[col], oldKey);
          updates[col] = encryptWithKey(decrypted, newKey);
        }
      }
      await db[table].update(record.id, updates);
    }
  }
}

Data Retention Policies

Retention Periods

| Data Category | Retention Period | Deletion Method | Rationale |
|---|---|---|---|
| Bookings (active) | Until check-out + 30 days | Soft delete | Guest support window |
| Bookings (completed) | 7 years | Archive to cold storage | Tax/legal compliance |
| Payments | 7 years | Archive to cold storage | Financial compliance |
| Transactions | 7 years | Archive to cold storage | Audit trail |
| Audit Logs | 3 years | Hard delete | Compliance/debugging |
| Analytics Events | 2 years | Hard delete | Product analytics |
| Webhook Logs | 90 days | Hard delete | Debugging only |
| Sync Logs | 90 days | Hard delete | Debugging only |
| Quotes | 30 days | Hard delete | Expired quotes |
| Holds | 7 days after expiration | Hard delete | Expired holds |
| User Accounts (deleted) | 30 days soft delete, then purge | Hard delete after 30 days | Right to be forgotten |

Automated Retention Jobs

Daily Cleanup Job (runs at 2 AM UTC):

-- Delete expired quotes (older than 30 days)
DELETE FROM quotes
WHERE created_at < NOW() - INTERVAL '30 days';

-- Delete expired holds (older than 7 days after expiration)
DELETE FROM holds
WHERE expires_at < NOW() - INTERVAL '7 days';

-- Delete old webhook logs (older than 90 days)
DELETE FROM webhook_logs
WHERE received_at < NOW() - INTERVAL '90 days';

-- Delete old sync logs (older than 90 days)
DELETE FROM sync_logs
WHERE started_at < NOW() - INTERVAL '90 days';

-- Delete old analytics events (older than 2 years)
DELETE FROM analytics_events
WHERE timestamp < NOW() - INTERVAL '2 years';

-- Delete old audit logs (older than 3 years)
DELETE FROM audit_logs
WHERE created_at < NOW() - INTERVAL '3 years';

-- Purge soft-deleted users (deleted > 30 days ago)
DELETE FROM users
WHERE deleted_at IS NOT NULL AND deleted_at < NOW() - INTERVAL '30 days';
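
The statements above need a scheduler. One hedged option is a node-cron job in the API process (pg_cron on the database side is an alternative); runRetentionCleanup is an illustrative name for a helper that runs the DELETE statements above, not an existing function:

import cron from 'node-cron';

// Run the retention cleanup every day at 02:00 UTC
cron.schedule(
  '0 2 * * *',
  async () => {
    await runRetentionCleanup(); // illustrative: executes the DELETE statements above
    console.log('Daily retention cleanup completed');
  },
  { timezone: 'Etc/UTC' }
);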

Annual Archival Job (runs January 1st):

async function archiveOldBookings() {
  // Archive bookings older than 7 years to S3 cold storage
  const cutoffDate = new Date();
  cutoffDate.setFullYear(cutoffDate.getFullYear() - 7);

  const oldBookings = await db.bookings
    .where('created_at', '<', cutoffDate)
    .select('*');

  // Export to JSON and upload to S3
  const archive = {
    archived_at: new Date().toISOString(),
    count: oldBookings.length,
    data: oldBookings,
  };

  await s3.putObject({
    Bucket: 'tvl-archives',
    Key: `bookings/archive-${cutoffDate.getFullYear()}.json.gz`,
    Body: gzip(JSON.stringify(archive)),
    StorageClass: 'GLACIER',
  });

  // Delete from database after successful archive
  await db.bookings.deleteMany({ id: { in: oldBookings.map(b => b.id) } });
}

GDPR Compliance

Data Subject Rights

Right to Access (GDPR Article 15)

Implementation: API endpoint /api/v1/gdpr/data-export

Response Format:

{
  "user_id": "uuid",
  "email": "user@example.com",
  "data_collected_at": "2025-01-24T10:30:00Z",
  "data": {
    "profile": { "full_name": "...", "email": "...", "phone": "..." },
    "bookings": [ { "booking_number": "...", "check_in": "...", ... } ],
    "payments": [ { "amount": 150.00, "status": "succeeded", ... } ],
    "audit_logs": [ { "action": "login", "timestamp": "...", ... } ]
  }
}

Delivery: Secure download link (expires after 24 hours)
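
A minimal sketch of the export endpoint, assuming the Express-style app and db helpers used elsewhere in this document; decryptBookingPII is an illustrative helper for the encrypted guest fields:

app.get('/api/v1/gdpr/data-export', async (req, res) => {
  // Gather everything stored about the requesting user
  const user = await db.users.findById(req.user.id);
  const bookings = await db.bookings.where('user_id', '=', req.user.id).select('*');
  const payments = await db.payments.where('user_id', '=', req.user.id).select('*');
  const auditLogs = await db.audit_logs.where('user_id', '=', req.user.id).select('*');

  // In production the payload would be uploaded and returned as a signed link that expires after 24 hours
  res.json({
    user_id: user.id,
    email: user.email,
    data_collected_at: new Date().toISOString(),
    data: {
      profile: { full_name: user.full_name, email: user.email },
      bookings: bookings.map(decryptBookingPII), // illustrative: decrypts the guest_* fields
      payments,
      audit_logs: auditLogs,
    },
  });
});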


Right to Erasure / Right to be Forgotten (GDPR Article 17)

Implementation: API endpoint /api/v1/gdpr/delete-account

Process:

  1. Soft delete user account (users.deleted_at = NOW())
  2. Anonymize PII in bookings:
    UPDATE bookings
    SET guest_name = 'DELETED',
        guest_email = 'deleted@privacy.tvl',
        guest_phone = NULL
    WHERE user_id = ?;
  3. Remove from active sessions
  4. Schedule hard delete after 30-day grace period
  5. Send confirmation email (a sketch of the full flow follows the exceptions list below)

Exceptions (GDPR Article 17(3)):

  • Bookings with active payments (retain for financial compliance)
  • Transactions (retain for 7 years per tax law)
  • Audit logs (retain for 3 years for legal compliance)
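
A hedged sketch of the delete-account endpoint tying these steps together; sessionStore, jobQueue, and emailService.send are illustrative helpers, not existing APIs:

app.post('/api/v1/gdpr/delete-account', async (req, res) => {
  const userId = req.user.id;

  // 1. Soft delete the account
  await db.users.update(userId, { deleted_at: new Date() });

  // 2. Anonymize PII on the user's bookings (same UPDATE as above)
  await db.raw(
    `UPDATE bookings
        SET guest_name = 'DELETED', guest_email = 'deleted@privacy.tvl', guest_phone = NULL
      WHERE user_id = $1`,
    [userId]
  );

  // 3. Revoke active sessions
  await sessionStore.revokeAll(userId);

  // 4. Schedule the hard delete after the 30-day grace period
  await jobQueue.enqueue('purge-user', { userId }, { delayDays: 30 });

  // 5. Confirm to the user
  await emailService.send(req.user.email, 'account-deletion-confirmation');

  res.json({ success: true });
});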

Right to Rectification (GDPR Article 16)

Implementation: Standard update endpoints with audit logging

Example:

app.patch('/api/v1/users/me', async (req, res) => {
  const { email, full_name } = req.body;

  await db.users.update(req.user.id, { email, full_name });

  // Log change for audit trail
  await db.audit_logs.insert({
    org_id: req.user.org_id,
    user_id: req.user.id,
    action: 'updated',
    resource_type: 'users',
    resource_id: req.user.id,
    changes: { email: { old: req.user.email, new: email } },
  });

  res.json({ success: true });
});

Right to Data Portability (GDPR Article 20)

Implementation: Export to machine-readable format (JSON, CSV)

Supported Formats:

  • JSON (structured data)
  • CSV (tabular exports for bookings, payments)
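
A minimal CSV export sketch for bookings; the column list and bookingsToCsv name are illustrative, and a real implementation would use a CSV library for proper escaping:

function bookingsToCsv(bookings: Array<Record<string, unknown>>): string {
  const columns = ['booking_number', 'check_in', 'check_out', 'status'];
  const header = columns.join(',');
  // Naive quoting via JSON.stringify; sufficient for a sketch only
  const rows = bookings.map((b) =>
    columns.map((c) => JSON.stringify(b[c] ?? '')).join(',')
  );
  return [header, ...rows].join('\n');
}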

Consent Management

Consent is collected and tracked for:

  • Email marketing (opt-in checkbox at registration)
  • SMS notifications (opt-in checkbox at registration)
  • Analytics tracking (opt-out banner)

Consent Storage:

ALTER TABLE users ADD COLUMN consent_email_marketing BOOLEAN DEFAULT false;
ALTER TABLE users ADD COLUMN consent_sms_notifications BOOLEAN DEFAULT false;
ALTER TABLE users ADD COLUMN consent_analytics_tracking BOOLEAN DEFAULT true;
ALTER TABLE users ADD COLUMN consent_updated_at TIMESTAMPTZ;

Consent Withdrawal:

app.post('/api/v1/users/me/consent/withdraw', async (req, res) => {
  const { consent_type } = req.body; // 'email_marketing', 'sms_notifications', 'analytics_tracking'

  await db.users.update(req.user.id, {
    [`consent_${consent_type}`]: false,
    consent_updated_at: new Date(),
  });

  // Stop processing immediately (remove from mailing lists, etc.)
  await emailService.unsubscribe(req.user.email);

  res.json({ success: true });
});

Data Processing Agreements (DPA)

Third-Party Data Processors

| Processor | Purpose | Data Shared | DPA Status |
|---|---|---|---|
| Supabase | Database hosting | All platform data | ✅ GDPR-compliant (AWS Frankfurt region option) |
| Stripe | Payment processing | Payment info, guest email | ✅ GDPR-compliant (PCI-DSS certified) |
| Hostaway | PMS integration | Property data, bookings | ✅ DPA signed |
| SendGrid/AWS SES | Email delivery | Email addresses, names | ✅ GDPR-compliant |
| Twilio | SMS delivery | Phone numbers | ✅ GDPR-compliant |
| Upstash Redis | Caching/queues | Session data (no PII) | ✅ GDPR-compliant (EU region) |
| Railway | Backend hosting | All platform data | ✅ GDPR-compliant |
| Vercel | Frontend hosting | Session tokens (no PII) | ✅ GDPR-compliant |

Data Transfer Mechanisms:

  • EU Region Selection: Supabase (AWS Frankfurt), Upstash (EU region)
  • Standard Contractual Clauses (SCCs): For US-based services (Stripe, Twilio)

Data Access Controls

Role-Based Access Control (RBAC)

| Role | Access Level | PII Access | Notes |
|---|---|---|---|
| Platform Admin | Full access | Yes (all orgs) | TVL internal staff only |
| Owner | Org-wide | Yes (own org) | Organization owner |
| Manager | Org-wide (read/write) | Yes (own org) | Property manager |
| Staff | Limited (assigned properties) | Partial (booking details only) | Operations staff |
| Finance | Financial data only | No (anonymized bookings) | Accounting team |
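
RLS (below) enforces org scoping, while the "No (anonymized bookings)" rule for the Finance role can be applied at the application layer. A hedged sketch; maskBookingPII is an illustrative helper, not an existing function:

type Role = 'platform_admin' | 'owner' | 'manager' | 'staff' | 'finance';

// Strip guest PII from booking rows before returning them to roles
// that should not see it (per the RBAC table above)
function maskBookingPII<T extends { guest_name?: string; guest_email?: string; guest_phone?: string | null }>(
  booking: T,
  role: Role
): T {
  if (role === 'finance') {
    return { ...booking, guest_name: 'REDACTED', guest_email: 'REDACTED', guest_phone: null };
  }
  return booking;
}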

RLS Policy Example:

-- Users can only access bookings in their organization
CREATE POLICY "Users can access bookings in their organization"
ON bookings FOR ALL
USING (org_id = current_setting('app.current_org_id')::uuid);

-- Staff can only access bookings for properties they manage
CREATE POLICY "Staff can access assigned property bookings"
ON bookings FOR SELECT
USING (
  unit_id IN (
    SELECT u.id FROM units u
    JOIN properties p ON u.property_id = p.id
    JOIN property_managers pm ON p.id = pm.property_id
    WHERE pm.user_id = current_setting('app.current_user_id')::uuid
  )
);
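
These policies rely on app.current_org_id and app.current_user_id being set per request. A hedged sketch using node-postgres and set_config; withOrgContext is an illustrative wrapper, not an existing helper:

import { Pool, PoolClient } from 'pg';

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Set the RLS session variables for the current request, then run queries
// inside the same transaction so the settings apply to them.
async function withOrgContext<T>(
  orgId: string,
  userId: string,
  fn: (client: PoolClient) => Promise<T>
): Promise<T> {
  const client = await pool.connect();
  try {
    await client.query('BEGIN');
    // `true` scopes the setting to this transaction only
    await client.query("SELECT set_config('app.current_org_id', $1, true)", [orgId]);
    await client.query("SELECT set_config('app.current_user_id', $1, true)", [userId]);
    const result = await fn(client);
    await client.query('COMMIT');
    return result;
  } catch (err) {
    await client.query('ROLLBACK');
    throw err;
  } finally {
    client.release();
  }
}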

Database Access Logging

Enable PostgreSQL Audit Logging (Supabase Dashboard):

  • Log all SELECT, UPDATE, DELETE on PII tables
  • Retention: 90 days
  • Review: Weekly automated reports to security team

Example Audit Log Entry:

{
  "timestamp": "2025-01-24T10:30:00Z",
  "user": "postgres",
  "database": "tvl_production",
  "query": "SELECT guest_name, guest_email FROM bookings WHERE id = ?",
  "parameters": ["uuid-123"],
  "source_ip": "10.0.1.50"
}

Incident Response

Data Breach Response Plan

Phase 1: Detection & Containment (0-1 hour)

  1. Detect: Automated alerts (Sentry, Supabase logs)
  2. Isolate: Revoke compromised credentials immediately
  3. Assess: Determine scope (which tables/records affected)
  4. Notify: Alert incident response team via PagerDuty

Phase 2: Investigation (1-24 hours)

  1. Root Cause Analysis: Review logs, identify attack vector
  2. Data Impact Assessment: Count affected users/records
  3. Regulatory Review: Determine if notification required (GDPR Article 33: 72-hour deadline)

Phase 3: Notification (24-72 hours)

  1. Data Protection Authority (within 72 hours per GDPR Article 33, unless the breach is unlikely to result in a risk to individuals)
  2. Affected Users (email notification with mitigation steps)
  3. Public Disclosure (if legally required)

Phase 4: Remediation

  1. Patch Vulnerability: Deploy security fixes
  2. Key Rotation: Rotate all encryption keys, API keys
  3. Password Reset: Force password reset for affected users
  4. Post-Mortem: Document incident, update security policies

Notification Template:

Subject: Important Security Notice - TVL Platform Data Incident

Dear [User],

We are writing to inform you of a security incident that may have affected your account.

What Happened:
On [date], we discovered unauthorized access to [affected system]. We immediately took action to secure our systems and investigate the incident.

What Information Was Involved:
The following information may have been accessed:
- [List of PII fields]

What We Are Doing:
- We have secured the vulnerability
- We are working with security experts to investigate
- We have notified the relevant authorities

What You Should Do:
- Change your password immediately
- Monitor your account for suspicious activity
- Contact us at security@tvl.com with any concerns

We sincerely apologize for this incident and are committed to protecting your data.

Sincerely,
The TVL Security Team

Validation & Alternatives

Security Architecture Decisions

Agree: Application-level encryption for high-sensitivity PII

  • Alternative: Database-level encryption only (Supabase default)
  • Trade-off: Application-level adds complexity but provides defense in depth

Agree: 7-year retention for financial records

  • Alternative: 10 years (some jurisdictions require)
  • Trade-off: 7 years sufficient for US/EU, can extend per jurisdiction

Agree: Soft delete with 30-day grace period

  • Alternative: Immediate hard delete on user request
  • Trade-off: Grace period allows recovery of accidental deletions, compliant with GDPR Article 17

⚠️ Consider: Field-level encryption vs. full row encryption

  • Current: Individual field encryption (guest_name, guest_email, etc.)
  • Alternative: Encrypt entire booking JSON and store in single column
  • Recommendation: Field-level allows granular access control and indexing (see the blind-index sketch below)
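
One caveat on lookups: AES-GCM ciphertexts are non-deterministic, so an encrypted column cannot be matched directly in a WHERE clause. A common pattern, noted here as an assumption rather than part of the current schema, is a separate HMAC "blind index" column for equality lookups:

import { createHmac } from 'crypto';

// Deterministic HMAC over the normalized value, stored alongside the ciphertext
// (assumes a hypothetical guest_email_idx column and a key separate from ENCRYPTION_KEY)
const BLIND_INDEX_KEY = process.env.BLIND_INDEX_KEY ?? '';

export function blindIndex(value: string): string {
  return createHmac('sha256', BLIND_INDEX_KEY)
    .update(value.trim().toLowerCase())
    .digest('hex');
}

// Lookup example: find bookings by guest email without decrypting every row
// await db.bookings.where('guest_email_idx', '=', blindIndex('guest@example.com'));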

⚠️ Consider: Tokenization for guest emails

  • Current: Encrypted storage
  • Alternative: Store hashed token, keep plaintext in external vault (e.g., HashiCorp Vault)
  • Recommendation: Tokenization adds complexity; encryption sufficient for MVP

Known Gaps & Assumptions

Assumptions

  1. Supabase default encryption (AES-256) sufficient for database-level protection
  2. US/EU operations only (no APAC-specific regulations like China PIPL)
  3. Annual key rotation acceptable (not quarterly)
  4. 7-year retention compliant with all target markets

Gaps

  1. No Hardware Security Module (HSM) for key storage
    • Risk: Encryption keys in environment variables (accessible to admins)
    • Mitigation: Upgrade to AWS KMS or HashiCorp Vault for production
  2. No Data Loss Prevention (DLP) tooling
    • Risk: Accidental PII leakage in logs, error messages
    • Mitigation: Implement log sanitization (redact PII before logging; see the sketch after this list)
  3. No Automated PII Discovery in codebase
    • Risk: Developers inadvertently log PII
    • Mitigation: Add pre-commit hooks to scan for PII patterns (regex for emails, phones)
  4. No Customer-Managed Encryption Keys (CMEK)
    • Risk: Supabase manages encryption keys (not customer-controlled)
    • Mitigation: Enterprise Supabase plan supports CMEK if needed
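
A minimal log-sanitization sketch for gap 2; the regex patterns and redactPII name are illustrative and would need tuning for real traffic:

// Redact obvious PII patterns before a message reaches the logs
const EMAIL_PATTERN = /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g;
const PHONE_PATTERN = /\+?\d[\d\s().-]{7,}\d/g;

export function redactPII(message: string): string {
  return message
    .replace(EMAIL_PATTERN, '[REDACTED_EMAIL]')
    .replace(PHONE_PATTERN, '[REDACTED_PHONE]');
}

// Usage: wrap the logger so every message is sanitized first
// logger.info(redactPII(`Booking created for ${guestEmail}`));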

Compliance Checklist

GDPR Compliance

  • Data inventory (PII fields documented)
  • Encryption at rest (Supabase default)
  • Encryption in transit (TLS 1.2+)
  • Right to access (data export API)
  • Right to erasure (account deletion API)
  • Right to rectification (update endpoints)
  • Right to data portability (JSON/CSV export)
  • Consent management (opt-in/opt-out)
  • Data retention policies (automated cleanup)
  • Breach notification plan (72-hour SLA)
  • Data Processing Agreements (all vendors)

PCI-DSS Compliance (Delegated to Stripe)

  • No card data stored in TVL database
  • Stripe-hosted payment forms
  • Tokenized payment methods only
  • TLS for all payment transactions

SOC 2 Type II (Future)

  • Formal security policies documented
  • Annual penetration testing
  • Security awareness training for team
  • Incident response drills (tabletop exercises)
  • Third-party audit (deferred until Series A)

Sources

  • /mnt/c/GitHub/claude-test/docs/04-data/schema-overview.md
  • /mnt/c/GitHub/claude-test/docs/04-data/data-dictionary.md
  • GDPR Official Text (Regulation EU 2016/679)
  • Supabase Security Documentation
  • OWASP Application Security Guidelines