Docs

A neuro-linter that detects cognitive overload in UI designs using simulated brain data and automatically rewrites chaotic CSS into clean components.

Tech Stack

  • Next.js 16 + Tailwind CSS — Frontend UI, captures Base64 screenshots and sends raw code to the backend
  • Node.js + Express — Middleware layer, routes data between the brain model and Gemini, handles CORS and API logic
  • Python + FastAPI — Hosts the TRIBE v2 brain simulation model, exposes a REST endpoint for image analysis
  • Google Gemini 2.5 Flash — LLM that rewrites high-friction UI code
  • Axios — Used by the Node server to forward image data to the Python model

Meta TRIBE v2

TRIBE v2 is a neuroscience-inspired brain simulation model built in Python. It analyzes UI screenshots and produces a cognitive friction score by simulating how different regions of the human brain respond to visual stimuli.

Output

  • visual_cortex — Detects competing visual elements, excessive color, or layout chaos
  • prefrontal — Measures decision fatigue from unclear hierarchy or too many choices
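Putting the two documented outputs together, a TRIBE v2 response might look like the sketch below. Only friction_score, visual_cortex, and prefrontal are documented above; the exact JSON shape (e.g. the regions wrapper) is an assumption for illustration.

```python
# Hypothetical TRIBE v2 response payload -- field nesting is assumed,
# not confirmed by the docs above.
tribe_response = {
    "friction_score": 62.4,      # 0-100 overall cognitive friction
    "regions": {
        "visual_cortex": 0.81,   # competing elements / layout chaos
        "prefrontal": 0.44,      # decision fatigue
    },
}

# Identify the region driving the friction.
most_active = max(tribe_response["regions"], key=tribe_response["regions"].get)
```

With these sample activations, `most_active` would be `visual_cortex`, indicating a layout-driven overload rather than a choice-driven one.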

Data Flow

  1. Frontend sends raw_code + image_base64 to POST /api/evaluate-ui
  2. Backend forwards the image to the Python brain model
  3. Brain model returns a friction_score and region activations
  4. If score > 40, Gemini rewrites the code
  5. Final payload is returned with score, severity, brain regions, and clean code
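The branching in steps 3–5 can be sketched as follows. This is a minimal sketch: the function names (evaluate_ui, rewrite_with_gemini) are placeholders for illustration, not the project's actual API.

```python
def rewrite_with_gemini(raw_code: str) -> str:
    # Placeholder for the real Gemini 2.5 Flash call made by the Node layer.
    return f"/* refactored */ {raw_code}"

def evaluate_ui(raw_code: str, friction_score: float) -> dict:
    """Sketch of steps 3-5: rewrite only when friction exceeds 40."""
    if friction_score > 40:
        clean_code = rewrite_with_gemini(raw_code)
    else:
        clean_code = raw_code  # low-friction code is returned unchanged
    return {"friction_score": friction_score, "clean_code": clean_code}
```

The key design point from the flow above is that the LLM is only invoked past the 40-point threshold, so clean UIs skip the Gemini round-trip entirely.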

API Reference

POST /api/evaluate-ui

Main endpoint. Accepts raw UI code and a Base64 screenshot; returns the friction score and, when the score warrants it, refactored code.
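A request body for this endpoint might be assembled as follows. The field names raw_code and image_base64 come from the data flow above; the helper function and sample values are illustrative.

```python
import base64
import json

def build_evaluate_payload(raw_code: str, screenshot_bytes: bytes) -> str:
    """Encode a screenshot and pair it with the UI code, as the frontend does."""
    return json.dumps({
        "raw_code": raw_code,
        "image_base64": base64.b64encode(screenshot_bytes).decode("ascii"),
    })

payload = build_evaluate_payload("<div class='btn'>Go</div>", b"\x89PNG...")
```

The screenshot travels as Base64 text so the whole request stays a single JSON document through the Express middleware.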

GET /api/health

Returns server status and uptime.

GET /api/history

Returns the last 50 friction score evaluations stored in memory.
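A "last 50, in memory" buffer is naturally expressed as a bounded deque. This is a sketch of one way to implement it (shown in Python for consistency with the model layer; the Node server may do it differently).

```python
from collections import deque

# A deque with maxlen=50 silently drops the oldest entry once full,
# matching the "last 50 evaluations" behaviour described above.
history: deque = deque(maxlen=50)

for score in range(60):  # simulate 60 evaluations arriving over time
    history.append({"friction_score": score})

# Only the most recent 50 survive; entries 0-9 have been evicted.
```

Because eviction is automatic, the history endpoint can simply serialize the deque with no trimming logic of its own.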

Severity Levels

  • Low: 0–25
  • Medium: 26–50
  • High: 51–75
  • Critical: 76–100
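The bands above reduce to a straightforward threshold lookup. A sketch (the classify_severity name is illustrative, not the project's actual helper):

```python
def classify_severity(score: float) -> str:
    """Map a 0-100 friction score to the severity bands listed above."""
    if score <= 25:
        return "Low"
    if score <= 50:
        return "Medium"
    if score <= 75:
        return "High"
    return "Critical"
```

For example, classify_severity(26) falls in the Medium band and classify_severity(76) in Critical, matching the inclusive upper bounds above.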

Brain Regions

visual_cortex — Visual Overload. High activation means too much competing visual information.

prefrontal — Decision Fatigue. High activation means too many choices or unclear hierarchy.
