AI is a gift of fire
Sara Baase’s A Gift of Fire was assigned in my CS100 class. Prometheus gave humanity fire. It warmed them, fed them, and burned them alive. Baase’s point was that computers are the same kind of gift. That idea smoldered in my brain for years, and it wasn’t until recently that it sparked up again.
I was never an AI skeptic, but I wasn’t a true believer either. I used the tools with reservations—interested, but cautious in a way that felt professionally responsible. I mean, we’re software engineers: we write code. These tools felt like shortcuts, and shortcuts imply a debt. I told myself that I should still know how to do the work the long way, that understanding came from typing every line, from struggling through syntax and edge cases myself.
Not anymore.
What’s changed is the realization that it doesn’t make sense for me to write code anymore. I’m a mediocre programmer at best. The models are good enough now, and I feel comfortable with how they work. I still need to understand how software goes together, how systems behave, where they break, and what “good” feels like. It helps that I have decades of experience doing it the old way.
Steve Yegge recently released Gas Town, an orchestrator for running large numbers of AI coding agents in parallel. He describes a set of stages of AI-assisted development, ranging from no AI at all, to copilots in the IDE, to multi-agent workflows and custom orchestration. His claim is provocative: if you are not operating at the later stages, you need to GET THE F*** OUT. This is Gas Town, after all. Gas burns.
Progress through these stages is not just about skill or intelligence. It is about exposure and experience. Exposure means knowing which tools exist, who to pay attention to, and what context engineering is (and why it matters). But exposure alone is not enough; you have to put in the reps. You have to run the agents, hit the walls, feel the friction, and develop the intuitions that only come from doing it over and over. The same techniques that worked three months ago are already obsolete. If you are not continuously embedded in that flow of information—and continuously acting on it—none of this stuff will make sense.
In 1968, Douglas Engelbart gave what is now called the Mother of All Demos. He showed the mouse, hypertext, collaborative editing, and networked computers in a single 90-minute presentation. His thesis was that computers should augment human intellect. Not replace it, but extend it. At the time this looked like science fiction. Today it is so normal that you are using nearly all of it just to read this sentence.
In 1985, Bill Atkinson dropped acid and crystallized an idea about knowledge as isolated pools of light, separated by distance and language. He came back with the vision for HyperCard. It let ordinary people link ideas together and build interactive systems without traditional programming. It dramatically shortened the distance between having an idea and making something real.
Over the recent holiday break, I made two small games with my daughters: Kirby Sky Jumpers and Alien Abducto Rama. We used voice dictation to talk to Claude Code. We described what we wanted and watched it build in real time. We play-tested, noticed problems, talked through fixes, and watched the system rebuild itself.
At one point my eight-year-old leaned into the microphone and asked for faster clouds and a more purple tractor beam. The change appeared almost immediately. She wasn’t coding. But she was doing the thing Engelbart and Atkinson were pointing toward all along: forming intent, evaluating outcomes, and iterating in real time.
I can write code, design interfaces, and do product stuff. I know what good software should feel like. I learned all that by failing repeatedly, manually, over and over. That scar tissue helped me build taste and understand quality.
A few nights a week, I let my daughter have her own conversations with an AI through a voice interface. I listen nearby. She learns about courageous women like Ada Lovelace, Marie Curie, and Ruth Bader Ginsburg. She asks follow-up questions and gets excited to learn more. She is building a relationship with knowledge that works nothing like mine did when I was her age.
So do I gatekeep? Do I say that she shouldn’t use these tools until she’s struggled enough… until she’s been burned the way I was? That feels wrong. She is not skipping learning. She is learning differently. She is developing intuition about what is possible, how to ask better questions, and how to follow curiosity.
I do not know whether getting burned was the point, or just the historical cost of entry. The hard way was the only way, until suddenly it was not.
The fire is everywhere and spreading fast. I’m jumping in. From the outside, this probably looks like self-immolation.
Maybe it is. But from the inside, it feels like becoming.
Fire is clean. It burns away old assumptions, old reflexes, old certainties about how things should work. And from the ashes, something new rises: a phoenix, shining bright.