When architecture diagrams become bug specifications
Generate a DOT graph from code. Compare it to the design doc.
Feed the graph to AI agents as a specification.
Let them measure the code against its own architecture.
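As a concrete starting point, here is a minimal sketch of the first step, assuming the codebase is Python: walk the source tree, collect import edges with the standard ast module, and emit them as DOT text. This is illustrative only; it is not the team-kb CLI's generator, and the import_edges / emit_dot names and the team_kb path are hypothetical.

```python
# Minimal sketch: derive a DOT dependency graph from a Python package.
# Illustrative only; not the team-kb CLI's generator.
import ast
from pathlib import Path


def import_edges(package_dir: str):
    """Yield (module, imported_package) edges discovered via the ast module."""
    for path in Path(package_dir).rglob("*.py"):
        module = path.stem
        tree = ast.parse(path.read_text(encoding="utf-8"))
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                for alias in node.names:
                    yield module, alias.name.split(".")[0]
            elif isinstance(node, ast.ImportFrom) and node.module:
                yield module, node.module.split(".")[0]


def emit_dot(edges) -> str:
    """Render the edge set as DOT text that Graphviz or an agent can consume."""
    lines = ["digraph architecture {", "  rankdir=LR;"]
    for src, dst in sorted(set(edges)):
        lines.append(f'  "{src}" -> "{dst}";')
    lines.append("}")
    return "\n".join(lines)


if __name__ == "__main__":
    print(emit_dot(import_edges("team_kb")))  # hypothetical package path
```

The resulting .dot file can be rendered with Graphviz (for example, dot -Tsvg graph.dot) or handed to an agent verbatim as plain text.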
It started with the team-knowledge-base project. The user generated a deep DOT graph of the team-kb codebase, then compared it against the design document. The gap between diagram and design revealed concrete, actionable issues.
The user then built DOT generation into the team-kb CLI and ran it against amplifier-chat, producing two key files:
System architecture overview — 5-tier: entry points → daemon → plugin backend → SSE streaming → frontend SPA → disk storage
Deep event flow with 7 annotated bugs (BUG-01 through BUG-07) and 3 structural gaps embedded directly in the graph
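The account doesn't show the exact annotation format used in event-flow-deep.dot, but one plausible shape is to carry each BUG-NN directly in an edge label so that it survives rendering and can be extracted mechanically into a checklist. The fragment, edge names, and annotation text below are invented for illustration; only the BUG-NN naming comes from the account.

```python
# Sketch of how BUG-NN annotations might live inside a DOT graph, and how an
# agent could pull them back out as a validation checklist. The label format,
# edge names, and annotation text are assumptions, not the real file contents.
import re

ANNOTATED_FRAGMENT = """
digraph event_flow {
  daemon -> sse_stream [label="BUG-03: hypothetical example annotation"];
  sse_stream -> frontend_spa;
  frontend_spa -> disk_storage [label="BUG-07: another hypothetical annotation"];
}
"""

BUG_PATTERN = re.compile(r'label="(BUG-\d+):\s*([^"]+)"')


def bug_checklist(dot_source: str) -> dict[str, str]:
    """Map each annotated BUG-NN id to its claim so agents can verify it in code."""
    return {bug_id: claim for bug_id, claim in BUG_PATTERN.findall(dot_source)}


for bug_id, claim in bug_checklist(ANNOTATED_FRAGMENT).items():
    print(f"{bug_id}: verify against source -> {claim}")
```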
Comparing the generated architecture graph against the original design document revealed concrete discrepancies:
Functions declared in the design but never implemented — empty shells that silently passed through.
Data fields declared in structures but never populated by any code path — ghosts in the schema.
Logic duplicated between modules — copy-pasted implementations that should have been shared.
The architecture diagram didn't just describe the system. It became a contract the code was measured against.
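How the audit detected these gaps isn't specified in the account, but the first category, empty shells, is mechanically checkable. A hedged sketch: use Python's ast module to flag functions whose body is nothing but pass, an ellipsis, or a docstring. The function names and the src root below are hypothetical.

```python
# One mechanical way to flag "empty shells": functions whose body is nothing
# but pass, ..., or a docstring. A sketch, not the method used in the audit.
import ast
from pathlib import Path


def is_empty_shell(fn: ast.FunctionDef) -> bool:
    body = fn.body
    # Ignore a leading docstring.
    if body and isinstance(body[0], ast.Expr) and isinstance(body[0].value, ast.Constant):
        body = body[1:]
    if not body:
        return True
    return all(
        isinstance(stmt, ast.Pass)
        or (isinstance(stmt, ast.Expr)
            and isinstance(stmt.value, ast.Constant)
            and stmt.value.value is Ellipsis)
        for stmt in body
    )


def empty_shells(package_dir: str):
    """Yield 'path:line name' for every function that is only a shell."""
    for path in Path(package_dir).rglob("*.py"):
        tree = ast.parse(path.read_text(encoding="utf-8"))
        for node in ast.walk(tree):
            if isinstance(node, ast.FunctionDef) and is_empty_shell(node):
                yield f"{path}:{node.lineno} {node.name}"


for hit in empty_shells("src"):  # hypothetical source root
    print("empty shell:", hit)
```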
A 667-line DOT file encodes more structural information than thousands of lines of raw source code.
Architecture as compression.
4 parallel Amplifier sessions.
Each consuming a DOT file.
Each diving into the code.
Session 1: Fed amplifier-chat.dot. Dispatched 3 parallel agents (frontend, backend, test/git-history). Found 24 issues across 4 categories.
Session 2: Fed event-flow-deep.dot with its annotated bugs (BUG-01 through BUG-07). Agents validated each bug against the source code.
Session 3: Read Session 1's findings via session-analyst. Dispatched parallel agents (Security, Frontend, Backend) to independently verify every issue. Found an additional vulnerability the original session had missed.
Session 4: Cross-checked with all other sessions. Designed a 4-PR fix plan organized by regression risk. Implemented all fixes using git worktrees.
The user became the orchestrator — a human message bus routing findings between competing AI sessions for independent verification. Each session could challenge the others' conclusions.
31 confirmed unique issues. Session 4 discovered an additional bug during implementation. All stemming from two DOT files used as specifications.
When parallel sessions independently converge on the same findings, confidence compounds multiplicatively: if each session flags a spurious issue with probability p, the chance that k independent sessions all report the same false positive falls to roughly p^k.
The workflow was so powerful it demanded its own infrastructure. The user designed a cross-session consensus protocol — sessions that could autonomously converge on findings, observable and interruptible.
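The account doesn't detail that protocol, so the following is only a minimal sketch of the convergence rule it implies: each session reports finding IDs, and a finding is treated as confirmed once a quorum of independent sessions has reported it. The class and method names are invented; observability and interruption hooks are left out.

```python
# Minimal sketch of a cross-session consensus rule: a finding becomes
# "confirmed" once a quorum of independent sessions reports it. The real
# protocol is not specified in the source account.
from collections import defaultdict
from dataclasses import dataclass, field


@dataclass
class ConsensusTracker:
    quorum: int = 2
    reports: dict[str, set[str]] = field(default_factory=lambda: defaultdict(set))

    def report(self, session_id: str, finding_id: str) -> None:
        """Record that a session independently reported a finding."""
        self.reports[finding_id].add(session_id)

    def confirmed(self) -> list[str]:
        return [f for f, sessions in self.reports.items() if len(sessions) >= self.quorum]

    def contested(self) -> list[str]:
        return [f for f, sessions in self.reports.items() if len(sessions) < self.quorum]


tracker = ConsensusTracker(quorum=2)
tracker.report("session-1", "BUG-03")
tracker.report("session-3", "BUG-03")  # independent verification
tracker.report("session-1", "BUG-07")  # still awaiting a second session
print(tracker.confirmed())  # ['BUG-03']
print(tracker.contested())  # ['BUG-07']
```

A human orchestrator, or a supervising session, could watch the contested list shrink as sessions challenge or confirm each other's reports.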
The architecture diagram is both the map and the test.
The codebase audits itself.
From DOT generation to merged fixes, driven by architecture-as-specification and cross-session validation.
Data as of: March 27, 2026
Pattern status: Active — discovered workflow, not a shipped product feature
Primary source:
Quotes:
Gaps & caveats:
Primary contributor: Single user conducted all sessions and provided the full account
Generate a DOT graph. Feed it to an agent.
Let the code audit itself.