Posted February 15, 2026 by aebrer - Drew Brereton
#devlog #procedural #generative #procedural-generation #gamedev #art #python
Deep Yellow’s textures aren’t drawn. They’re not generated by an AI art model either. Every wall, floor, ceiling, and entity sprite in the game is a Python script that outputs a PNG.
I’m a solo dev with a generative art background, not a pixel artist. My art practice has always been about emergence from constraints — writing code that produces something you couldn’t have drawn by hand, on platforms that force economy (PICO-8, Tezos smart contracts, 128x128 pixel grids).
When Deep Yellow needed textures, the question wasn’t “what tool do I use to draw them?” It was “what process produces the right kind of ugly?”
The Backrooms aesthetic demands institutional decay — yellowed wallpaper with water stains, carpet that’s been walked on for decades, fluorescent light fixtures with that specific plastic grain. These aren’t pretty textures. They need to look like they were manufactured cheaply, installed decades ago, and left to rot in a place that shouldn’t exist.
Every texture lives as a standalone Python script:
```
_claude_scripts/textures/backrooms_wallpaper/generate.py → output.png
```
The script uses PIL and NumPy. No neural networks, no diffusion models, no training data. The wallpaper texture, for example, builds up in layers, roughly like the sketch below.
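Here's a minimal, hypothetical sketch of that layering (the shipped generate.py runs a few hundred lines; the names, colors, and counts here are illustrative, not the real script):

```python
import numpy as np
from PIL import Image

SEED, SIZE = 42, 256
rng = np.random.default_rng(SEED)

# Layer 1: flat institutional-yellow base.
img = np.full((SIZE, SIZE, 3), (206, 184, 108), dtype=np.float64)

# Layer 2: faint vertical stripes; 8 whole cycles across the image,
# so the pattern meets itself at the edges.
x = np.arange(SIZE)
img += (6.0 * np.sin(2 * np.pi * 8 * x / SIZE))[np.newaxis, :, np.newaxis]

# Layer 3: per-pixel grain.
img += rng.normal(0.0, 4.0, (SIZE, SIZE, 1))

# Layer 4: soft dark water stains, placed with toroidal distance so a
# stain near one edge continues on the opposite edge.
yy, xx = np.mgrid[0:SIZE, 0:SIZE]
for _ in range(4):
    cx, cy, r = rng.integers(0, SIZE), rng.integers(0, SIZE), rng.integers(20, 50)
    dx = np.minimum(np.abs(xx - cx), SIZE - np.abs(xx - cx))
    dy = np.minimum(np.abs(yy - cy), SIZE - np.abs(yy - cy))
    falloff = np.clip(1.0 - np.hypot(dx, dy) / r, 0.0, 1.0)
    img -= 35.0 * falloff[..., np.newaxis]

Image.fromarray(np.clip(img, 0, 255).astype(np.uint8)).save("output.png")
```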
Everything uses modulo wrapping so the texture tiles seamlessly. This is the hardest part — PIL’s built-in drawing functions clip at image boundaries, so every circle, line, and stain has to be drawn pixel by pixel with coordinate wrapping.
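In practice that means replacing something like PIL's ImageDraw.ellipse with a wrapped pixel-stamping helper. This one is my own illustrative sketch, not from the game's scripts:

```python
import numpy as np

def draw_wrapped_disc(img: np.ndarray, cx: int, cy: int, radius: int, color) -> None:
    """Stamp a filled disc onto img, wrapping coordinates around the edges."""
    h, w = img.shape[:2]
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx * dx + dy * dy <= radius * radius:
                # Modulo wrap: a shape near one border continues on the other.
                img[(cy + dy) % h, (cx + dx) % w] = color

img = np.zeros((128, 128, 3), dtype=np.uint8)
draw_wrapped_disc(img, cx=5, cy=5, radius=20, color=(90, 70, 40))
# The disc's far side reappears in the opposite corner, so the texture
# still lines up with itself when tiled.
```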
Writing the script is only half the process. I use a structured creator/critic workflow where Claude Code orchestrates the whole thing:

1. A creator agent writes (or revises) the generator script and runs it to produce the PNG.
2. A blind critic agent evaluates the image without being told what it's supposed to depict.
3. The conductor synthesizes the critique into feedback for the next round.
The blind critic is the key insight. If someone who has no idea what the texture is supposed to be can still identify what it depicts and spot quality issues, the texture is doing its job. It typically takes 1-4 iterations to get a texture through this gauntlet.
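In plain Python, the shape of that loop looks something like this. The agent-spawning helpers are illustrative stubs standing in for Claude Code sub-agents, not a real API:

```python
from dataclasses import dataclass, field

@dataclass
class Critique:
    identified_as: str                    # what the blind critic thinks the image depicts
    issues: list[str] = field(default_factory=list)  # quality problems it spotted
    summary: str = ""                     # feedback passed back to the creator

def spawn_creator(brief: str, feedback: str) -> str:
    """Write/revise generate.py from the brief plus feedback, run it, return the PNG path."""
    return "output.png"  # stub

def spawn_blind_critic(png_path: str) -> Critique:
    """Evaluate the image with no knowledge of what it is supposed to be."""
    return Critique("wallpaper")  # stub

def make_texture(brief: str, subject: str, max_rounds: int = 4) -> str:
    """Conductor loop: spawn agents and relay feedback, never judge directly."""
    feedback = ""
    png_path = ""
    for _ in range(max_rounds):
        png_path = spawn_creator(brief, feedback)
        critique = spawn_blind_critic(png_path)  # critic never sees the brief
        if critique.identified_as == subject and not critique.issues:
            return png_path  # the texture "reads" blind: done
        feedback = critique.summary
    return png_path
```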
Why not just use an AI image generator? Three reasons:
Tileability. Image generators don’t produce seamlessly tileable textures reliably. Getting modulo-wrapped edges perfect requires mathematical precision, not vibes.
Reproducibility. Every texture has a seed. I can regenerate any texture deterministically, tweak parameters, and get predictable results. If the wallpaper needs to be 10% more yellow next week, I change one number (see the sketch after this list).
It’s more fun. This is generative art. The constraint — “make it look like decaying institutional wallpaper using only math” — is the creative challenge. The output has a specific quality that comes from being procedural: slightly too regular, slightly too uniform, like it was manufactured rather than weathered naturally. For the Backrooms, that’s perfect.
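Concretely, the reproducibility story looks something like this hypothetical parameter block at the top of a generate.py (the names are mine, for illustration):

```python
import numpy as np

# All randomness flows through one seeded RNG, so the same values
# always yield a byte-identical output.png.
SEED = 1337
BASE_COLOR = np.array([206.0, 184.0, 108.0])
YELLOW_GAIN = 1.0  # set to 1.1 for "10% more yellow", rerun, done

rng = np.random.default_rng(SEED)
base = BASE_COLOR * np.array([YELLOW_GAIN, YELLOW_GAIN, 1.0])  # R and G read as yellow
```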
The workflow proved repeatable enough that I turned it into a Claude Code skill: a reusable prompt template that any Claude Code user could adapt for their own texture pipeline. The conductor (the main Claude instance) never writes scripts or evaluates images itself; it only spawns agents and synthesizes feedback. That separation keeps the process honest.
All the texture scripts ship with the game’s source code. If you want to see how a Backrooms wallpaper gets made, it’s ~300 lines of Python and NumPy.