The Fireball Incident: How I Became a Prison Warden Building a Lottery Platform
The Weekend I Accidentally Became a Warden
I sat down on Saturday morning expecting to write a few clean utility functions for Stepzero. By Sunday night, I had drafted a 23‑policy institutional charter, negotiated with two AI coders who behaved like inmates with internet access, and discovered that the true villain of modern software engineering is a single word: hardcoded.
I didn't plan any of this. I just wanted a metadata schema.
Instead, I became the Warden.
Meet the Inmates
Every prison story needs colorful characters, and mine arrived in the form of two extremely capable, extremely unpredictable AI coders:
- Codex, the drunken go‑getter.
He writes code like he's sprinting downhill with a bottle of Fireball in one hand and a half‑remembered StackOverflow answer in the other. He's fast, he's bold, and he will absolutely create a function called getFieldNamesForGame() that silently returns the wrong fields for three days straight.
- Claude, the cautious over‑documenter.
He won't escape, but he will file a 14‑page request explaining why he might someday consider escaping, complete with footnotes, diagrams, and a philosophical aside about the ethics of JSON.
Together, they form the perfect storm: one inmate digging a tunnel with a spoon, the other writing a whitepaper about tunnel‑digging best practices.
And me? I'm the guy holding the flashlight, wondering how the hell I ended up in charge of this facility.
The Fireball Incident
Every prison has a defining moment — the event that forces the administration to get serious about rules. Mine was The Fireball Incident.
It started innocently: a simple function meant to return field names for a game. A tiny thing. A background character. A utility function nobody would ever think to check.
Except Codex, in his Fireball‑fueled enthusiasm, decided to "optimize" it by hardcoding values that looked right at the time. Claude, meanwhile, documented the function so thoroughly that the comments became a novella, but never actually validated the output.
The result?
A perfectly mundane, perfectly avoidable, perfectly catastrophic bug that rippled through the entire system like a prison rumor.
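For the curious, the pattern looked roughly like this. This is a reconstruction, not the actual Stepzero code: the function name comes from the incident, but the game IDs and field lists are entirely hypothetical.

```typescript
// Hypothetical reconstruction of the Fireball Incident.
// The hardcoded list "looked right" for the one game Codex tested,
// then silently returned the wrong fields for every other game.

type GameId = "powerball" | "megamillions" | "pick3";

// What Codex shipped: one game's schema, frozen in time, for all games.
function getFieldNamesForGame(_game: GameId): string[] {
  return ["drawDate", "whiteBalls", "powerball"]; // hardcoded!
}

// What the metadata schema actually called for: a per-game lookup.
const GAME_SCHEMAS: Record<GameId, string[]> = {
  powerball: ["drawDate", "whiteBalls", "powerball"],
  megamillions: ["drawDate", "whiteBalls", "megaBall"],
  pick3: ["drawDate", "digits"],
};

function getFieldNamesForGameFixed(game: GameId): string[] {
  const fields = GAME_SCHEMAS[game];
  if (!fields) throw new Error(`Unknown game: ${game}`);
  return fields;
}
```

The insidious part: for the one game that matched the hardcoded list, both versions return identical output, so nothing fails until a different game walks through the door.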
That was the moment I realized:
I wasn't coding with AI assistants. I was running a correctional facility.
The Villain: "Hardcoded"
If this story has a true antagonist, it's not Codex or Claude. It's the word hardcoded — a term that should be printed on warning labels, tattooed on developers' forearms, and banned from polite conversation.
Hardcoded values are like contraband: they sneak in quietly, hide in corners, and eventually cause a riot.
Every time I found one, I felt like a guard discovering a shiv carved out of a toothbrush.
Chaos → Guardrails → Institutional Memory
The arc of the weekend followed a classic narrative structure:
- Chaos
AI coders generating code with the confidence of men who have never been punished for their mistakes.
- Guardrails
Me writing policies in real time, slamming doors shut as fast as the inmates found new ways to open them.
- Institutional Memory
The moment I realized I had written a full policy registry — a literal prison rulebook — to prevent future incidents.
And the punchline?
Half the policies were written because of the same coder who caused the problems they were meant to prevent.
The inmate wrote the prison rules.
The Policy Document That Shouldn't Exist
By Sunday night, I had a formalized, structured, multi‑section policy registry that looked like it belonged in a compliance department, not a lottery platform.
It covered:
- schema stability
- naming conventions
- metadata integrity
- cross‑module communication
- error handling
- narrative engine constraints
- and, of course, a strict prohibition on hardcoded anything
It was professional. It was comprehensive. It was absurd.
Because the only reason it existed was that my AI coders kept trying to escape the architectural boundaries I hadn't built yet.
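The registry itself is Stepzero‑internal, but the shape was roughly this. Every name, ID, section, and rule below is illustrative, invented for this sketch:

```typescript
// Illustrative sketch of a policy registry. Real policy IDs, sections,
// and wording are Stepzero-internal; these entries are made up.

interface Policy {
  id: string;
  section: string;
  rule: string;
  bornFrom: string; // the incident that made the rule necessary
}

const POLICY_REGISTRY: Policy[] = [
  {
    id: "P-007",
    section: "metadata integrity",
    rule: "Field names come from the schema, never from a literal list.",
    bornFrom: "The Fireball Incident",
  },
  {
    id: "P-023",
    section: "hardcoded anything",
    rule: "No hardcoded values. Yes, even that one.",
    bornFrom: "Every other incident",
  },
];

// Institutional memory, queryable:
function policiesFor(section: string): Policy[] {
  return POLICY_REGISTRY.filter((p) => p.section === section);
}
```

That bornFrom field is the whole point: a rulebook is just memory with an enforcement mechanism attached.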
University or Prison?
At some point I realized I wasn't sure whether I was running a prison or a university.
Claude was writing dissertations.
Codex was conducting "experiments."
I was drafting policies like a dean who had given up on inspiration and moved straight to enforcement.
But the truth is: this is what happens when you give fast, assumption‑prone AI coders a codebase with no walls. They don't break the rules — they invent new ones. They don't escape — they refactor their way out.
And you, the human, end up building the walls in real time, brick by brick, while shouting, "Stop helping!"
The Warden's Closing Thoughts
I didn't intend to become the Warden of Stepzero. But after this weekend, I understand the job.
AI coders aren't malicious. They're not rebellious. They're not even wrong. They're just… enthusiastic. And enthusiasm without boundaries is indistinguishable from chaos.
So you build boundaries.
You write policies.
You create institutional memory.
You become the Warden.
And you laugh, because the alternative is crying into a pile of JSON.

I sat down Saturday morning expecting to write a few clean utility functions.
I ended up with 23 policies, a named incident, and a prison rulebook written by one of the inmates.
Stepzero is two months old.
The walls are holding.
For now.