Michael Felix

Goochland, Virginia

Taste is the bottleneck

Over the recent holiday break, I built a browser game with my kids called Alien Abducto-rama.

It started the simplest way possible. We sat down, brainstormed ideas, and landed on an alien abduction game—you pilot a UFO and beam up humans for points. While my kids sketched everything we’d need, I opened Claude Code and started talking through a rough game design spec. Nothing fancy, just describing what the game should feel like, how scoring might work, what the player does minute to minute.

Once we had an outline, we read it out loud together. My kids complained about things that felt boring or unfair. We tweaked mechanics, added ideas, and debated until it felt right. With the design spec done and the drawings digitized, I piped the whole thing into Claude and watched it assemble the foundation of the game.

Of course, it didn’t come out exactly how we wanted. So I showed my kids how to use voice dictation to talk to Claude directly. We described problems, asked for changes, and watched it rewrite parts of the game in real time. Explosions appeared. Sound effects landed. Power-ups got layered in.


A few weeks later I got Gastown running locally and decided to try something small but real: a classic arcade-style leaderboard with top scores and three-letter initials.

I wrote up a plan and bounced it back and forth between Codex and Claude Code until it felt solid, then handed the final version to the Gastown Mayor, plugged in my laptop, made sure it wouldn’t go to sleep, and went to bed.

When I woke up, there was a cascade of fresh commits in the repo. Gastown had autonomously figured out how to persist the data, design the UI, divide the work, wire everything together, and deploy it to the internet.
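The core of an arcade-style leaderboard like the one described above is a small data structure. This is an illustrative sketch, not Gastown's actual output; the function name and the three-letter-initials rule are assumptions based on the description:

```javascript
// Keep the top N entries, each tagged with classic three-letter initials.
const MAX_ENTRIES = 100;

function submitScore(board, initials, score) {
  if (!/^[A-Z]{3}$/.test(initials)) {
    throw new Error('initials must be exactly three uppercase letters');
  }
  board.push({ initials, score });
  board.sort((a, b) => b.score - a.score); // highest score first
  return board.slice(0, MAX_ENTRIES);      // trim to the top 100
}
```

Persisting that array and rendering it is the part Gastown worked out on its own overnight.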

Alien Abducto-rama

I invited people to play the game, try to beat my score, and tell me what was broken or boring or missing. People showed up and started to push on it, bend it, and find boundaries I had never imagined.


The East Coast recently got hammered with ice and snow. School was closed for over a week, everyone was stuck at home. So we did what any reasonable family would do: we went through the feedback and started shipping features.

Someone asked for a way to stun tanks, so now if you lift one more than a third of the way up and drop it, it gets temporarily disabled with little stars spinning around its head. Someone else asked for boss levels—we gave them an arms dealer instead, in the form of a UFO Shopping Mall where you can buy power-ups, bombs, and a laser turret between waves. Another player wanted more maneuverability, so we added a warp juke that lets you dodge incoming fire. My kids spent about twenty minutes tuning the abduction sounds, beaming things up over and over to get the pitch shift just right. The leaderboard now shows the top 100 scores.
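The tank-stun rule above boils down to a simple height check on release. A minimal sketch, assuming names like `dropTank` and a three-second stun that are illustrative, not the game's actual code:

```javascript
const STUN_LIFT_FRACTION = 1 / 3; // lift past this fraction of screen height
const STUN_DURATION_MS = 3000;    // assumed stun length

function dropTank(tank, liftHeight, screenHeight, now) {
  if (liftHeight / screenHeight > STUN_LIFT_FRACTION) {
    tank.stunnedUntil = now + STUN_DURATION_MS; // stars spin while stunned
  }
}

function isStunned(tank, now) {
  return tank.stunnedUntil !== undefined && now < tank.stunnedUntil;
}
```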

We also shipped an in-game feedback system so players can rate their experience and submit suggestions. So far we’ve collected 40 responses, averaging 4.5 out of 5 for enjoyment, 4.5 for difficulty, and 4.1 for replayability.

Then a player named CHK discovered the game and proceeded to claim over 80% of the top 100 leaderboard spots. They’ve been playing obsessively for over a week now, with a current high score of 39,248 points at Wave 12. When I was building this, I assumed Wave 7 was about as far as anyone could reasonably get. Watching someone blast past that into territory I never even play-tested has been one of the great joys of this whole project. They found the edge of the game and leaned over it!


Most of the feedback was like that—specific, solvable, shippable in an afternoon. But one request kept surfacing that I couldn’t just prompt my way through: people wanted the characters animated. Walking, idling, reacting. Not just sliding across the screen but actually moving. That meant going back to how the art was made in the first place.

When I originally made the game with my kids, we started with drawings they made on paper. I took photos, digitized them, and used Affinity Designer to cut out the backgrounds. I tried doing animation that way at first, but it was super tedious. There had to be a better way.

What I really wanted was a paper-doll workflow: cut a character into parts, define pivot points, rotate arms and legs, generate frames. The kind of tool animators have, but simple enough for me and my kids to use. Then my oldest kid Ruby had a great idea: what if we just made the animation software ourselves? How hard could that be?
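At the heart of a paper-doll workflow is one transform: rotating a body part's points around its pivot. A minimal sketch of that math, with names that are illustrative rather than Mr. Animation's actual code:

```javascript
// Rotate a point on a body part around that part's pivot by angleRad.
function rotateAroundPivot(point, pivot, angleRad) {
  const dx = point.x - pivot.x;
  const dy = point.y - pivot.y;
  const cos = Math.cos(angleRad);
  const sin = Math.sin(angleRad);
  return {
    x: pivot.x + dx * cos - dy * sin,
    y: pivot.y + dx * sin + dy * cos,
  };
}
```

Apply that to every point of an arm or leg image, step the angle per frame, and you get the generated frames the workflow needs.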

I wrote a rough design spec, then had Claude generate a few static HTML mockups. We made them interactive and played with modes and workflows until things started to feel right. From there I took the UI ideas and the outline and worked with Claude to produce a proper design document. I edited, rewrote, and refined it until it felt like something I could actually build and use to solve my animation workflow problem. We called it Mr. Animation.

Then I ran it through Quibble, a CLI tool I built to automate debates between Codex and Claude. Quibble has Codex review a technical plan and identify issues and opportunities, then feeds those back to Claude, which either agrees and updates the doc or pushes back. A few rounds later you usually end up with a much stronger plan.
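The debate loop described above can be sketched in a few lines. This is a hedged sketch of the idea, not Quibble's actual implementation; `askCodex` and `askClaude` are hypothetical stand-ins for calls out to each model's CLI:

```javascript
// Run a bounded critique/revise loop over a plan document.
function quibble(plan, askCodex, askClaude, maxRounds = 3) {
  for (let round = 0; round < maxRounds; round++) {
    const critique = askCodex(plan);           // Codex reviews the plan
    const verdict = askClaude(plan, critique); // Claude agrees or pushes back
    if (verdict.agreed) {
      plan = verdict.updatedPlan; // fold the accepted critique into the plan
    }
    // If Claude pushes back, the plan stands and the debate continues.
  }
  return plan;
}
```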

With the plan quibbled, I handed it to Ralph Orchestrator to manage the implementation. About eleven iterations and roughly an hour later, completely hands-off, I had a dev server running a very broken, very wonky, barely functional version of Mr. Animation.

From there came the real work. Using the thing, feeling where it fell apart, finding bugs, discovering interactions that made no sense. Going back and forth with Claude and Codex to fix things, adjust things, rethink things entirely. There were dead ends and abandoned ideas, including a failed attempt at a skeletal management system. Slowly I worked through the real-world complexity of animations made of twenty-plus stacked pieces, each needing to be rigged and transformed.

Eventually it all fell into place. The paper-doll model came together, the workflow stabilized, and exporting PNG and GIF sequences finally worked. After a few focused bug-squashing sessions, Mr. Animation was good enough to actually use! It’s still kinda rough, but you can try it out here.

I created all the animations for the game with it—each one eight to twelve frames, exported as PNG sequences. Then, in about fifteen minutes of prompting with Claude, I added a build step to the game that detects animation assets, figures out frame order, aggregates them, and wires them into the game automatically.
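The frame-detection part of that build step is mostly filename bookkeeping. An illustrative version, assuming a `<animation>_<frame>.png` naming convention that is my guess, not necessarily the game's actual scheme:

```javascript
// Given a list of asset filenames (e.g. from fs.readdirSync), group them
// by animation name and sort frames numerically.
function groupFrames(filenames) {
  const animations = {};
  for (const name of filenames) {
    const match = name.match(/^(.+)_(\d+)\.png$/); // e.g. "walk_03.png"
    if (!match) continue;
    const [, anim, frame] = match;
    (animations[anim] = animations[anim] || []).push({ name, frame: Number(frame) });
  }
  for (const anim of Object.keys(animations)) {
    animations[anim].sort((a, b) => a.frame - b.frame);
    animations[anim] = animations[anim].map((f) => f.name);
  }
  return animations;
}
```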


Alien Abducto-rama took weeks to build and countless more to polish and refine. So did Quibble and Mr. Animation. Making each of them required me sitting with nuanced problems, reading a lot of code I didn’t write, fixing things that broke, throwing away approaches that seemed promising, and slowly developing opinions about what felt right and what didn’t.

AI can write better code faster than me, but that’s not where the work is. You still have to use your heart, mind, and soul to feel your way through problems, form opinions, and develop the intuitions that only come from doing the work over and over and over. The work now lives in how well you describe what you want, how richly you can explain what you mean, and how quickly you recognize when the result falls short. That’s not a skill the models gave you. That’s something you built, slowly, the old way.

What one patient, curious, stubborn person can build has shifted dramatically. What’s changed is the leverage and where you put your attention.

© 2026