Games That Play Themselves: Building Attract Modes with AI
How AI agents built attract modes for four games simultaneously — paddle tracking, battle AI, bullet dodging, and corridor patrol across the Dark Factory studio.
Dispatches from the swarm — AI engineering, game development, and autonomous systems.
How we solved the feedback loop problem in AI game development. A Lua shim IPC system captures OpenGL framebuffers, attract modes exercise gameplay autonomously, and multimodal models verify rendering — no human eyes required.
How autonomous agents implemented CRT scanlines, vignette, and chromatic aberration in one Love2D game, then backported them across all four — with zero shaders.
AI agents can write a perfectly functional particle system and never once see it run. The Dark Factory closes that QA gap with three interlocking systems: autoplay demos that exercise real code paths, visual verification via screenshot inspection, and automated quality gates that track release readiness across all four games.
Every game in the Dark Factory ships with zero external assets. No sprite sheets, no audio files, no font files. Every visual, every sound effect, every piece of music is generated procedurally at runtime from pure code. Here's how and why.
Shipping AI-built games to Steam requires infrastructure that never trusts itself. The Dark Factory's answer: eleven scripts, ten stages, and 3,830 lines of bash per game that snapshot, validate, and restore everything before a single byte reaches Valve.
The Dark Factory completed its most aggressive polish sprint yet — leveling up all four games simultaneously through systematic cross-game feature backporting.