# Next.js 15 + Supabase: I Accidentally Blew Past My Quota by 1000% (and How “Local‑First” Saved It)

**TL;DR:** My “perfectly reasonable” real-time architecture for an online party game turned into a billing horror story — first with Supabase Realtime broadcasts, then with Redis + polling. The fix wasn’t “optimize the server.” It was to stop needing the server for the common use case.

## 1. The morning my dashboard tried to jump-scare me

One morning I woke up to warning emails from Supabase and Vercel: “Your project has significantly exceeded its Realtime message quota.”

I opened the dashboard and had to re-check that I was looking at the right project. In just a 10-day span, Supabase Realtime (Broadcast + Presence) had processed roughly 50,000,000 messages (the bill line item showed 54,557,731 Realtime Messages). That wasn’t “a little over.” It was 1000%+ over the included quota.

For context: I run Imposter Game — a browser-based party game (think “Liar Game” / social deduction) that works with 3 to 99 players. No installs, no logins — just open a URL and play. User growth is great… until your side project starts throwing punches at your wallet.

## 2. Technical post-mortem: two failures back-to-back

### The stack

- Framework: Next.js 15 (App Router)
- Database: Supabase (Postgres)
- Realtime: Supabase Realtime (Broadcast + Presence)
- State: Upstash Redis (Vercel KV)

### Failure 1: Broadcast everything (Supabase Realtime)

My first approach was the classic “real-time multiplayer” instinct:

- Any state change? Broadcast it immediately.
- Timer? Send updates every second.
- Presence? Track joins/leaves live.

Here’s the core math that bit me. Supabase charges on egress messages — effectively:

1 event × number of subscribers in the room (N)

So with N = 50 players:

- Every second: 1 timer tick × 50 recipients = 50 messages/sec
- One 15-minute round (900 sec): 50 × 900 = 45,000 messages
- Add votes, reactions, and Presence traffic… and the number explodes.

Result: ~50M messages in about 10 days, quota obliterated.

### Failure 2: “Fine, I’ll use Redis + polling”

My next thought was also extremely common: “Realtime is expensive. Let’s store state in Redis and have clients poll.” So I turned off the broadcast approach and switched to:

- State stored in Upstash Redis
- Clients polling `GET /api/game-state` once per second

This looked “cheaper” in my head. It wasn’t. If each poll triggers ~3 Redis commands (Room/Round/Player):

10 concurrent users × 1 poll/sec × 3 commands = 1,800 commands/min

And Upstash’s free monthly quota (500k commands)? It evaporated in less than half a day. I ended up adding a credit card for Pay As You Go just to keep the app alive.

At that point I had to admit it: “Congrats. I just wrote my own DDoS script.”

(Also: Vercel and Upstash being in different regions increased RTT and made the whole thing feel even worse.)

## 3. The real realization: I was solving the wrong problem

My initial “solutions” were all server-side optimizations:

- batch Redis reads (MGET)
- reduce timer update frequency (1s → 5s)
- compress payloads

Then I paused and pictured the real-world usage. Most people play party games… in the same room, around the same table. So why were 10 friends at a campsite burning LTE data and battery life, constantly syncing with a server across the planet?

The problem wasn’t “how do I scale my server cheaper?” It was: “how do I remove the server from the default experience?”

## 4. The pivot: Local‑First, client‑only (“no server” mode)

For in-person play, don’t use the network at all. Not “serverless.” Not “edge.” Just 0 API calls.

### New architecture: client-only pass-and-play

One phone acts as the host. Players pass the device around to confirm roles (“pass and play”), then play together locally. The Local Mode component is a `"use client"` Next.js client component, but internally it behaves like a little state machine.

### Local timer (no drift, no server)

Instead of a server `setInterval`, I use `requestAnimationFrame` + `Date.now()` to compute time-left deterministically. (Yes, background tabs have constraints — but this is optimized for in-person local play, where the app stays in the foreground.)

```tsx
// useGameTimer.ts (simplified)
useEffect(() => {
  const startTime = Date.now();
  let animationFrameId: number;

  const tick = () => {
    const elapsed = Math.floor((Date.now() - startTime) / 1000);
    const newTimeLeft = Math.max(duration - elapsed, 0);
    setTimeLeft(newTimeLeft);
    if (newTimeLeft > 0) {
      animationFrameId = requestAnimationFrame(tick);
    }
  };

  animationFrameId = requestAnimationFrame(tick);
  return () => cancelAnimationFrame(animationFrameId);
}, [duration]);
```

### State transitions happen in memory

Role assignment (3–99 players), voting, win conditions — everything runs in browser memory. No network round trip means phase transitions feel instant.

```tsx
const nextPhase = () => {
  setGame(prev => {
    if (prev.phase === "voting") {
      const result = calculateWinner(prev.votes); // local compute
      return { ...prev, phase: "result", winner: result };
    }
    // ...
  });
};
```

### INP optimization for 99 players

Rendering and updating a 99-player list can get janky fast. React 18’s `useTransition` helped keep heavy updates non-blocking:

```tsx
const addPlayer = () => {
  startTransition(() => {
    setGame(prev => {
      const newPlayers = [...prev.players, createNewPlayer()];
      return balanceRoles(newPlayers);
    });
  });
};
```

### Security note (because someone will ask)

Online mode still requires server-side validation. But in Local Mode, the person holding the phone is effectively authenticated by physics. Your friends’ eyeballs are the anti-cheat.

## Result: costs down, UX up

I restructured the site:

- Local Game (Offline Mode) became the main CTA on the homepage.
- Online Game stayed as a backup feature (“Remote Mode”).

The best part: users preferred the version with no installs, no logins, and no dependency on good internet.

### What changed

- Realtime messages: ~50M → near zero (because most sessions moved to Local Mode)
- Redis usage: easily kept within free/low tiers
- Reliability: no more games dying due to disconnects (even in basements, mountains, and spotty areas)

### Takeaway

As developers, we’re often drawn to “real-time,” “websockets,” and “edge everything.” But the best scaling strategy I’ve learned recently is:

Don’t optimize the server — make the server unnecessary. Sometimes an `Array.map` beats a Redis cluster.

### Try it

👉 Play Imposter Game

It’s a real demo of a smooth 99-player local party game built with React — no app install required.
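If you want to sanity-check the billing math from the two failures above, it fits in a few lines. This is a sketch using the post’s numbers; the function names and the flat per-event fan-out model are my simplification:

```typescript
// Broadcast model: every event is fanned out to all N subscribers in the room,
// so egress messages = subscribers × events/sec × duration.
function broadcastMessages(
  subscribers: number,
  eventsPerSecond: number,
  seconds: number
): number {
  return subscribers * eventsPerSecond * seconds;
}

// Polling model: each client polls once per interval, and each poll
// costs a few Redis commands (Room/Round/Player reads).
function redisCommandsPerMinute(
  users: number,
  pollsPerSecond: number,
  commandsPerPoll: number
): number {
  return users * pollsPerSecond * commandsPerPoll * 60;
}

// One 15-minute round, 50 players, 1 timer tick per second:
console.log(broadcastMessages(50, 1, 900)); // 45000 messages

// 10 users polling every second, ~3 Redis commands per poll:
console.log(redisCommandsPerMinute(10, 1, 3)); // 1800 commands/min
```

Neither number looks scary on its own; it’s the multiplication by rooms, rounds, and days that does the damage.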
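The `balanceRoles` helper that appears in the `addPlayer` snippet isn’t shown in the post. Here’s a hypothetical version of what a role balancer for 3–99 players could look like — the `Player` shape, the one-imposter-per-5-players ratio, and the shuffle are all my assumptions, not the game’s actual rules:

```typescript
type Role = "imposter" | "citizen";

interface Player {
  id: number;
  name: string;
  role: Role;
}

// Hypothetical balancer: roughly one imposter per 5 players, minimum one.
function balanceRoles(players: Player[]): Player[] {
  const imposterCount = Math.max(1, Math.floor(players.length / 5));

  // Fisher–Yates shuffle over indices so imposters are picked at random.
  const indices = players.map((_, i) => i);
  for (let i = indices.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [indices[i], indices[j]] = [indices[j], indices[i]];
  }

  const imposters = new Set(indices.slice(0, imposterCount));
  return players.map((p, i) => ({
    ...p,
    role: imposters.has(i) ? "imposter" : "citizen",
  }));
}
```

Because everything is plain in-memory data, re-balancing after every `addPlayer` is just an array pass — which is exactly why the 99-player case stays cheap without a server.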