Tools: Part 1: Building a Game in 30 Hours Using AI — Here's the Actual Timeline

Source: Dev.to

What happens when you treat AI as a development partner instead of an autocomplete tool? I wanted to find out, so I ran an experiment: build a game as fast as possible, prioritizing speed and iteration over polish. No extensive planning phase. No detailed specs. Just prototyping at velocity to see what's actually possible.

The result is MagLava, a magnetic platformer where you swing between nodes to escape rising lava. Six development sessions. Roughly 30 hours of work. A published game with 25 levels, 7 language translations, and a desktop app.

To be clear: this approach was the point of the experiment. For production software, or anything beyond rapid prototyping, proper planning and testing remain essential (I plan to write a post about that later). But I wanted to stress-test the other extreme. Here's exactly how it broke down.

## The Commit Log

I commit frequently, using an AI tool that generates summaries after each task, so commit clusters map well to actual work sessions. I went through my git history and reconstructed the timeline.

Total: ~25-35 hours across 6 sessions.

Notice the gaps. Nine days between Dec 21 and Dec 30. Sixteen days before the final session. The calendar says "one month of development." The reality is about a week of actual work, scattered across that month.

## What "AI-Assisted" Actually Meant

The first day alone produced 6,555 lines of code: project structure, game config, menu systems, routing, storage services, and the event bridge between Vue and Phaser. That's scaffolding I would have spent days on manually.

But the surprising part wasn't the code generation. It was what else came out of the process. The promotional website with SEO, sitemaps, and structured data? AI-built. The full Electron desktop app with the level editor? Same story. The internationalization system supporting 7 languages? Implemented in a single session.
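The "event bridge between Vue and Phaser" mentioned above is the kind of scaffolding that pattern usually implies: a shared, typed emitter that lets the Phaser game loop and the Vue UI talk without importing each other. Here's a minimal sketch under that assumption; every name and event here is illustrative, not taken from the actual MagLava codebase.

```typescript
// Hypothetical sketch of a Vue <-> Phaser event bridge.
// Event names and payload shapes are illustrative, not from MagLava.

type GameEvents = {
  'level:complete': { level: number; timeMs: number };
  'ui:pause': Record<string, never>;
};

class EventBridge {
  private handlers = new Map<keyof GameEvents, Array<(payload: unknown) => void>>();

  // One side (e.g. a Vue HUD component) subscribes to an event.
  on<K extends keyof GameEvents>(event: K, fn: (payload: GameEvents[K]) => void): void {
    const list = this.handlers.get(event) ?? [];
    list.push(fn as (payload: unknown) => void);
    this.handlers.set(event, list);
  }

  // The other side (e.g. a Phaser scene) emits; subscribers run synchronously.
  emit<K extends keyof GameEvents>(event: K, payload: GameEvents[K]): void {
    for (const fn of this.handlers.get(event) ?? []) fn(payload);
  }
}

// A single shared instance crosses the Vue/Phaser boundary.
const bridge = new EventBridge();
```

A Phaser scene might call `bridge.emit('level:complete', ...)` when the player reaches the exit, while a Vue component subscribed via `bridge.on(...)` shows the results screen, keeping the two frameworks decoupled.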
I didn't set out to have AI handle all of these. They emerged because once you're in a flow state with an AI partner, the friction for "one more feature" drops dramatically. Adding a desktop build went from "future roadmap item" to "let's just do it this session."

## The Method Behind the Numbers

You might have heard the term "vibe coding". In this case it meant staying in player mode and describing problems experientially rather than technically. That's what this looked like in practice. Here's an actual prompt from one of my sessions:

> "game just freezes! I can't actually SEE the LAVA LINE!!! I WANA SEE THE LAVA line. And finally, we start way too close to the lava."

Typos, caps lock, frustration: all of it. I didn't calculate a pixel buffer. I expressed a feeling of unfairness. The AI translated "way too close" into a concrete fix: increasing the spawn buffer from 150 to 500 pixels above the lava line.

This only works because emotion is a valid technical spec when your goal is player experience. "This feels unfair" contains more design information than "adjust the Y-offset parameter."

## Why the Timeline Matters

I'm not claiming 30 hours is fast or slow. Game complexity varies wildly. What I am claiming is that this timeline was possible because of how I worked, not just the tools I used.

The traditional solo dev pipeline looks like this: plan extensively, build the architecture, implement features, then test. You front-load decisions before you have information. My pipeline inverted that: build something immediately, play it, describe what feels wrong, iterate. The AI handled the translation from "this sucks" to "here's the code change."

Every system stayed disposable. Every change stayed focused and modular. When something wasn't working, I didn't defend it; I described a different vision and let it get rebuilt. That's hard to do when you've personally written every line. It's easy when you're directing rather than typing.

## What the Numbers Don't Show

The timeline looks clean in retrospect. It wasn't.
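The spawn-buffer fix described above ("way too close" becoming a concrete number) boils down to a one-constant change. A rough sketch of what that could look like; only the values 150 and 500 come from the post, the names and helper are hypothetical:

```typescript
// Hypothetical sketch of the spawn-buffer fix. Only the numbers
// (150 -> 500) come from the post; names are illustrative.
const OLD_SPAWN_BUFFER = 150; // felt "way too close" to the lava
const SPAWN_BUFFER = 500;     // the AI's concrete translation of "too close"

// Phaser's y-axis grows downward, so spawning "above the lava line"
// means subtracting the buffer from the lava's y position.
function playerSpawnY(lavaLineY: number, buffer: number = SPAWN_BUFFER): number {
  return lavaLineY - buffer;
}
```

With a lava line at y = 2000, the old spawn would sit at y = 1850; the fix moves it to y = 1500, giving the player 500 pixels of breathing room before the lava starts rising.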
Day 3, that massive 27-commit session, included throwing away the entire movement system I'd built on Day 2. The original concept was polarity-based climbing: red platforms attract, blue platforms repel. I built the whole thing. Then I played it, and it felt indirect, frustrating, disconnected from what I actually wanted.

So I described something different: "I want swinging. Like Spider-Man but with magnets." Within hours, the game had a completely new identity.

That pivot, and one other major pivot that came later, are probably the reason MagLava works at all. But that's a longer story.

Next in this series: the two pivots that almost killed the project, and why emotional prompts caught problems that specs would have missed.

Play MagLava | itch.io page

Tech stack for the curious: Vue 3 + Vite, Phaser 3, TypeScript, Electron for desktop, Firebase + itch.io for hosting, Claude (Opus via Claude Code CLI) for AI assistance.