Amplifier Showcase

Amplifier
UX Analyzer

Computer vision for UI understanding. Extracts colors, layout, elements, and text from screenshots so AI agents can see interfaces.

✓ Active
Built by ramparte (Sam Schillace) · February 2, 2026
The Problem

AI agents can't see
user interfaces.

🖥️
Screenshots are opaque
A screenshot is just a grid of pixels to an AI. No structure, no semantics, no understanding of what's a button versus a toolbar.
🎨
Design intent is lost
Color palettes, spacing hierarchies, and layout relationships are invisible without structured extraction from the raw image.
🔄
Recreation requires understanding
Building pixel-perfect UI recreations demands structured knowledge — regions, elements, text positions, and color values — not just a flat image.
The Solution

A four-stage vision pipeline

Screenshot in, structured JSON out.

📸
Screenshot
PNG / JPEG input
🎨
Color Extract
k-means clustering
📐
Layout Detect
Contour analysis
🔤
Text + OCR
EasyOCR extraction
OpenCV · NumPy · scikit-learn · EasyOCR · Pillow
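The four stages above can be sketched as one orchestrating function that collects each stage's output into a single dict. This is a minimal, hypothetical sketch of the pipeline shape — the function names and trivial stage bodies are illustrative stand-ins, not the tool's actual API:

```python
import numpy as np

# Illustrative stand-ins for the real stages (k-means, contours, EasyOCR).
def extract_colors(img):
    flat = img.reshape(-1, 3)
    colors, counts = np.unique(flat, axis=0, return_counts=True)
    top = colors[np.argmax(counts)]  # most frequent color only
    return [{"hex": "#%02x%02x%02x" % tuple(top),
             "frequency": float(counts.max() / len(flat))}]

def detect_layout(img):
    h, w = img.shape[:2]
    return [{"x": 0, "y": 0, "w": w, "h": h}]  # whole image as one region

def extract_text(img):
    return []  # the real pipeline runs EasyOCR here

def analyze(img):
    """Screenshot in, structured dict out — one entry per stage."""
    return {"colors": extract_colors(img),
            "regions": detect_layout(img),
            "text": extract_text(img)}

# Tiny all-white synthetic "screenshot" standing in for a PNG.
screenshot = np.full((40, 60, 3), 255, dtype=np.uint8)
result = analyze(screenshot)
```

The per-stage sections below show what each stand-in does in practice.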
Capabilities

What it extracts

🎨
Color Palette
k-means clustering identifies dominant colors from the image. Returns RGB values, hex codes, and relative frequency for each color.
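A minimal sketch of k-means palette extraction with scikit-learn, the clustering library in the tool's stack. The function name and output field names are assumptions for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans

def extract_palette(img: np.ndarray, k: int = 4) -> list[dict]:
    """Cluster pixels with k-means; return dominant colors with frequency."""
    pixels = img.reshape(-1, 3).astype(np.float64)
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)
    counts = np.bincount(km.labels_, minlength=k)
    palette = []
    for center, count in zip(km.cluster_centers_, counts):
        rgb = [int(round(c)) for c in center]
        palette.append({
            "rgb": rgb,
            "hex": "#%02x%02x%02x" % tuple(rgb),
            "frequency": count / len(pixels),
        })
    palette.sort(key=lambda c: -c["frequency"])  # dominant color first
    return palette

# Synthetic image: three-quarters white, one-quarter blue.
img = np.full((20, 20, 3), 255, dtype=np.uint8)
img[:10, :10] = (40, 90, 200)
palette = extract_palette(img, k=2)
```

On this image the dominant entry comes back as `#ffffff` with frequency 0.75.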
📐
Layout Regions
Contour analysis detects structural areas — toolbars, content zones, sidebars, status bars — with bounding boxes and hierarchy.
🔘
UI Elements
Fine-grained contour detection identifies individual buttons, controls, and interactive elements within detected regions.
🔤
Text Extraction
EasyOCR reads all visible text with confidence scores and bounding box positions, enabling precise text placement in recreations.
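EasyOCR's `readtext` returns one `(bbox, text, confidence)` tuple per detection, where `bbox` is four `[x, y]` corner points. A sketch of flattening that into the JSON-friendly shape described above — the helper and field names are assumptions, and the OCR call itself is kept in an uncalled function because easyocr is a heavy dependency that downloads models on first use:

```python
def format_detection(bbox, text, confidence):
    """Convert an EasyOCR (bbox, text, conf) tuple to a flat dict."""
    xs = [p[0] for p in bbox]
    ys = [p[1] for p in bbox]
    return {
        "text": text,
        "confidence": round(float(confidence), 3),
        "box": {"x": min(xs), "y": min(ys),
                "w": max(xs) - min(xs), "h": max(ys) - min(ys)},
    }

def read_screenshot(path: str) -> list[dict]:
    """Full OCR pass (not executed here): requires the easyocr package."""
    import easyocr  # deferred: loads models on first use
    reader = easyocr.Reader(["en"], gpu=False)
    return [format_detection(*d) for d in reader.readtext(path)]
```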
🖼️
Visual Output
Generates annotated screenshots showing detected regions overlaid on the original image, for visual verification of extraction quality.
📊
JSON Output
All extracted data is returned as structured JSON — ready for programmatic consumption by AI agents or downstream tools.
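Combining the stage outputs into one payload is then a plain `json.dumps`. The field names below are an assumed schema for illustration, not the tool's documented output format:

```python
import json

# Hypothetical final payload shape combining all four stages.
analysis = {
    "image": {"width": 1280, "height": 800},
    "colors": [{"hex": "#ffffff", "frequency": 0.62}],
    "regions": [{"role": "toolbar", "x": 0, "y": 0, "w": 1280, "h": 48}],
    "elements": [{"x": 8, "y": 8, "w": 32, "h": 32}],
    "text": [{"text": "File", "confidence": 0.98,
              "box": {"x": 12, "y": 14, "w": 40, "h": 18}}],
}
payload = json.dumps(analysis, indent=2)
```

An agent or downstream tool consumes this with a single `json.loads`, no image processing required.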
Impact

Giving AI agents
spatial understanding

Instead of describing a UI in words and hoping an agent gets it right, UX Analyzer provides the structured data an agent needs: exact colors, precise regions, identified elements, and extracted text — all from a single screenshot.
Before
"Make it look like this screenshot" — agents guess at colors, approximate layout, and miss text content entirely.
After
Agents receive exact hex values, bounding coordinates, element counts, and OCR text with confidence scores.
Result
A foundation for pixel-perfect UI recreation — structured data replaces guesswork.
Development

Built in a single day

415
Lines in main script
2
Commits total
1
Day of development
6
Test files included
Sole contributor: ramparte (Sam Schillace) · Both commits on February 2, 2026
First commit 12:07 PST → second commit 16:29 PST · Complete tool with tests in under 5 hours
Transparency

Sources & Methodology

Data as of: February 20, 2026

Feature status: Active — repo exists with code present, no updates since February 3, 2026

Primary contributor: ramparte (Sam Schillace) — 100% of commits

Research performed:

Known data gaps:

Numbers in this deck: All figures (415 lines, 2 commits, 1 day, 6 test files) come directly from git log and filesystem inspection. No estimates or projections are presented as facts.

Get Started

See your UI
through new eyes.

Screenshot in, structured understanding out. Give your AI agents the vision they need to work with user interfaces.

ramparte/amplifier-ux-analyzer
Built with Amplifier · February 2026