██████╗ ██╗ ██╗ ██╗ ███████╗ ███████╗ ██╗ ██╗ ███╗ ███╗ █████╗ █████╗
██╔══██╗ ██║ ██║ ██╔╝ ██╔════╝ ██╔════╝ ██║ ██║ ████╗ ████║ ██╔══██╗ ██╔══██╗
██████╔╝ ██║ █████╔╝ ███████╗ █████╗ ██║ ██║ ██╔████╔██║ ███████║ ███████║
██╔═══╝ ██║ ██╔═██╗ ╚════██║ ██╔══╝ ██║ ██║ ██║╚██╔╝██║ ██╔══██║ ██╔══██║
██║ ██║ ██║ ██╗ ███████║ ███████╗ ███████╗ ██║ ██║ ╚═╝ ██║ ██║ ██║ ██║ ██║
╚═╝ ╚═╝ ╚═╝ ╚═╝ ╚══════╝ ╚══════╝ ╚══════╝ ╚═╝ ╚═╝ ╚═╝ ╚═╝ ╚═╝ ╚═╝ ╚═╝
Technical work I've shipped — architecture, trade-offs, and honest reflections. Click any entry to expand the full case study.
Running a self-hosted AI chat service (Open WebUI + Ollama) for personal use requires GDPR compliance — users must give informed, documented consent before their data is processed. No lightweight, off-the-shelf solution exists for adding a consent layer to arbitrary self-hosted services. The service also needed a verifiable audit trail to demonstrate compliance without depending on any third-party cloud provider.
Built a TypeScript reverse proxy that sits in front of the upstream service and intercepts every request. Before forwarding traffic, it validates a signed consent cookie against a local database. Users who have not consented are served a privacy policy page; those who accept receive a signed cookie that grants access for one year. All consent events are written to an append-only audit log. The proxy is service-agnostic and designed to front multiple services, each with its own independently versioned privacy policy.
```mermaid
sequenceDiagram
    actor User as User (Browser)
    participant NX as Nginx (TLS)
    participant CP as Consent Proxy
    participant DB as SQLite
    participant OW as Open WebUI
    User->>NX: HTTPS request
    NX->>CP: HTTP (internal)
    CP->>DB: Validate consent cookie
    alt No valid consent
        DB-->>CP: Invalid / not found
        CP-->>User: 200 Privacy Policy page
        User->>CP: POST /consent (accept)
        CP->>DB: Write consent record + audit log
        CP-->>User: Set signed cookie, redirect
    else Valid consent
        DB-->>CP: Valid
        CP->>OW: Forward request
        OW-->>CP: Response (HTTP or WebSocket)
        CP-->>User: Forward response
    end
```
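The signed-cookie check at the heart of this flow can be sketched as a minimal HMAC scheme. This is an illustrative sketch, not the project's actual code: `signConsent`, `verifyConsent`, and the `CONSENT_SECRET` variable are assumed names, and the real proxy also validates the record against the database rather than trusting the signature alone.

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Assumed secret source; the real deployment would load this securely.
const SECRET = process.env.CONSENT_SECRET ?? "dev-only-secret";

interface ConsentPayload {
  userId: string;
  policyVersion: number;
  expiresAt: number; // epoch milliseconds
}

// Encode the consent payload and append an HMAC-SHA256 signature.
function signConsent(userId: string, policyVersion: number, expiresAt: number): string {
  const payload = Buffer.from(
    JSON.stringify({ userId, policyVersion, expiresAt }),
  ).toString("base64url");
  const mac = createHmac("sha256", SECRET).update(payload).digest("base64url");
  return `${payload}.${mac}`;
}

// Verify signature and expiry; return the payload on success, null otherwise.
function verifyConsent(cookie: string): ConsentPayload | null {
  const [payload, mac] = cookie.split(".");
  if (!payload || !mac) return null;
  const expected = createHmac("sha256", SECRET).update(payload).digest("base64url");
  const a = Buffer.from(mac);
  const b = Buffer.from(expected);
  // Constant-time comparison to avoid leaking signature bytes via timing.
  if (a.length !== b.length || !timingSafeEqual(a, b)) return null;
  const data: ConsentPayload = JSON.parse(Buffer.from(payload, "base64url").toString());
  if (data.expiresAt < Date.now()) return null;
  return data;
}
```

Embedding the policy version in the signed payload is what makes re-consent enforceable: bumping a service's policy version invalidates existing cookies at the database check without needing to revoke anything client-side.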
WebSocket proxying through a Fastify HTTP pipeline: Open WebUI uses Socket.IO for real-time streaming. Routing WebSocket upgrades through Fastify's request pipeline left the Node.js HTTP parser attached to the client socket as a competing data listener, silently consuming all client-to-upstream bytes. Diagnosed via systematic debug logging and fixed by handling the upgrade event directly on the raw Node.js HTTP server, bypassing the framework entirely.
Bidirectional socket piping without data loss: After the WebSocket handshake, both sockets must exchange raw bytes in both directions simultaneously. The solution uses Node.js stream.pipe() on clean sockets obtained from the native server.on('upgrade') event, ensuring no buffered bytes are lost and no competing listeners interfere.
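The fix described above can be sketched with Node's built-in modules. This is a simplified reconstruction under stated assumptions — the upstream host/port (`open-webui:8080`) and helper name `buildUpgradeRequest` are illustrative, and a bare `node:http` server stands in for Fastify's underlying server (`app.server` in the project) to keep the sketch self-contained:

```typescript
import { createServer } from "node:http";
import { connect } from "node:net";

// Rebuild the client's upgrade request verbatim so the upstream sees
// exactly what the browser sent. rawHeaders alternates [name, value, ...]
// as Node's HTTP parser delivers it.
function buildUpgradeRequest(url: string, rawHeaders: string[]): string {
  let head = `GET ${url} HTTP/1.1\r\n`;
  for (let i = 0; i < rawHeaders.length; i += 2) {
    head += `${rawHeaders[i]}: ${rawHeaders[i + 1]}\r\n`;
  }
  return head + "\r\n";
}

const server = createServer(/* normal HTTP requests flow through the framework */);

// Handling 'upgrade' on the raw server means the framework's HTTP parser
// never attaches to this socket, so no competing 'data' listener can
// swallow client-to-upstream bytes.
server.on("upgrade", (req, clientSocket, head) => {
  const upstream = connect(8080, "open-webui", () => {
    upstream.write(buildUpgradeRequest(req.url ?? "/", req.rawHeaders));
    // `head` holds any bytes the parser already read past the headers;
    // forwarding it first means nothing buffered is lost.
    if (head.length > 0) upstream.write(head);
    // Raw bidirectional piping from here on.
    clientSocket.pipe(upstream);
    upstream.pipe(clientSocket);
  });
  upstream.on("error", () => clientSocket.destroy());
  clientSocket.on("error", () => upstream.destroy());
});
```

The crucial detail is the `head` buffer: bytes consumed by the HTTP parser beyond the handshake would otherwise vanish, since `pipe()` only forwards data arriving after it is attached.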
Empty-body JSON requests breaking the proxy: Open WebUI sends DELETE requests with Content-Type: application/json but no body — a valid HTTP pattern that Fastify's default parser rejects with a 400 error. Fixed with a custom content-type parser that accepts and passes through empty bodies, making the proxy transparent to all request shapes.
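A lenient parser for this case is small. The function name below is illustrative, and the Fastify registration is shown in a comment since it assumes a live Fastify instance (`app`); `addContentTypeParser` with `parseAs: "string"` is the real Fastify hook for replacing the default JSON parser:

```typescript
// An empty body with Content-Type: application/json is legal HTTP;
// treat it as "no payload" instead of raising a parse error.
function parseJsonAllowingEmpty(body: string): unknown {
  return body.trim() === "" ? undefined : JSON.parse(body);
}

// Registered with Fastify roughly like this (requires the fastify package):
//
// app.addContentTypeParser(
//   "application/json",
//   { parseAs: "string" },
//   (req, body, done) => {
//     try { done(null, parseJsonAllowingEmpty(body as string)); }
//     catch (err) { done(err as Error, undefined); }
//   },
// );
```

Non-empty bodies still go through `JSON.parse`, so malformed JSON is rejected exactly as before; only the empty-body case changes.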
| Category | Technologies |
|---|---|
| Language | TypeScript (Node.js 20) |
| Framework | Fastify |
| Database | SQLite (via better-sqlite3) |
| Infrastructure | Docker, Alpine Linux, Nginx |
| Frontend | EJS templates, HTMX |
What I'd do differently: Start from an established API gateway (such as Kong) and extend it with a consent plugin, rather than implementing raw HTTP and WebSocket proxying from scratch — the lower-level socket handling surfaces edge cases that mature gateways have already solved.
What worked well: The decision to store all routing and policy configuration in a database rather than code made the proxy genuinely service-agnostic from day one, and kept the blast radius of any single configuration change small.
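The database-driven configuration could look roughly like the following — a hypothetical SQLite schema sketching the idea, not the project's actual tables:

```sql
-- Each fronted service is a row, and each policy version is a row,
-- so adding a service or bumping a policy is an INSERT, not a deploy.
CREATE TABLE services (
  id         INTEGER PRIMARY KEY,
  slug       TEXT NOT NULL UNIQUE,  -- e.g. 'open-webui'
  upstream   TEXT NOT NULL          -- e.g. 'http://open-webui:8080'
);

CREATE TABLE policies (
  id         INTEGER PRIMARY KEY,
  service_id INTEGER NOT NULL REFERENCES services(id),
  version    INTEGER NOT NULL,
  body_html  TEXT NOT NULL,
  UNIQUE (service_id, version)
);

CREATE TABLE consents (
  id         INTEGER PRIMARY KEY,
  policy_id  INTEGER NOT NULL REFERENCES policies(id),
  subject    TEXT NOT NULL,         -- opaque user identifier
  granted_at TEXT NOT NULL DEFAULT (datetime('now'))
);
```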