Your AI assistant writes more code than you can read. Legible turns that code into static maps, runtime narratives, and a human-readable timeline of what your software actually does — per task.
Prefer to read how it works first?
POST /upload — request arrives, auth middleware runs
validateImage() — checks MIME type and size < 10 MB
resizeImage() — sharp resizes the image to 800 px wide
storeToS3() — writes the buffer under uploads/img-4a2b…
enqueueJob() — schedules the thumbnail worker, responds 202
thumb-worker.ts — picks up the job async, writes a 256 px thumbnail

Drop Legible on a project and get three progressively richer views. Stay on the cover page for the overview. Drill down when you need the detail.
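The timeline above can be sketched as plain functions. This is a hypothetical reconstruction for illustration only — the function names come from the timeline, while the sharp resize and S3 write are stubbed so the flow runs end to end:

```typescript
// Hypothetical sketch of the "Upload an image" task traced above.
// sharp and S3 are stubbed; only the control flow matters here.

const MAX_BYTES = 10 * 1024 * 1024;

interface UploadResult { status: number; key?: string; error?: string }

function validateImage(data: Uint8Array, mime: string): boolean {
  // checks MIME type and size < 10 MB
  return mime.startsWith("image/") && data.length < MAX_BYTES;
}

function resizeImage(data: Uint8Array): Uint8Array {
  // stand-in for sharp(...).resize({ width: 800 })
  return data;
}

function storeToS3(data: Uint8Array): string {
  // stand-in for an S3 putObject; returns the object key
  return `uploads/img-${Math.random().toString(16).slice(2, 6)}`;
}

function enqueueJob(key: string): void {
  // stand-in for scheduling thumb-worker.ts
}

function handleUpload(data: Uint8Array, mime: string): UploadResult {
  if (!validateImage(data, mime)) return { status: 400, error: "invalid image" };
  const key = storeToS3(resizeImage(data));
  enqueueJob(key);              // thumbnail is written asynchronously
  return { status: 202, key };  // 202: accepted, work continues off-request
}
```

The narrative view walks you through exactly this sequence, one function at a time.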
A cover page showing what this software actually does. Auto-inferred tasks like “Upload an image” or “Sign up a user”, each with a one-sentence summary.
Files on the left, an interactive architecture diagram on the right. Zoom from system level all the way into individual functions. No more five-minute grep safaris.
Files left, narrative timeline right. Scrub through the execution of a task and watch each function fire, read its human-written summary, and see exactly where control flows next.
Signup takes ten seconds — no credit card. Your API token is shown right after you sign up; you’ll paste it into the CLI in Step 3.
Node 20+ required. While @legible/cli isn’t on the public npm registry yet, you can install the latest tarball directly from this site.
npm install -g https://dev.livetimelapse.com.au/legible/downloads/legible-cli-latest.tgz
legible --version
Legible parses files locally with tree-sitter and uploads only the structured index — never your raw source. Then run your code once with the tracer attached to build a timeline.
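As a rough illustration of what "structured index, never raw source" means — the field names below are assumptions, not Legible's real wire format:

```typescript
// Assumed shape of the structured index: file paths, symbols, and
// call-graph edges only. Note there is no field carrying source text.
interface StructuredIndex {
  files: string[];
  symbols: { file: string; name: string; kind: string }[];
  calls: { from: string; to: string }[]; // caller -> callee
}

// "uploadHandler" is a hypothetical caller name for this example.
const index: StructuredIndex = {
  files: ["src/upload.ts"],
  symbols: [
    { file: "src/upload.ts", name: "validateImage", kind: "function" },
    { file: "src/upload.ts", name: "resizeImage", kind: "function" },
  ],
  calls: [
    { from: "uploadHandler", to: "validateImage" },
    { from: "uploadHandler", to: "resizeImage" },
  ],
};
```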
legible login --server https://dev.livetimelapse.com.au/legible --token <paste-from-settings>
cd my-ai-app
legible scan . --name my-ai-app
legible trace --project <id> -- npm test
“Comprehension debt is the hidden cost of AI-generated code. A junior engineer can now generate code faster than a senior engineer can critically audit it.”
— Industry consensus on the AI-code reading crisis
Every plan includes the three views, the CLI, and unlimited scans. Pro adds private repos; Team adds collaboration; Enterprise adds self-hosted.
No. The CLI parses your files locally with tree-sitter and uploads only the structured index (filenames, symbols, call graph) and trace events. The raw source stays on your disk unless you explicitly click “share this snippet.”
JavaScript, TypeScript, Python, and PHP for static analysis and runtime tracing at launch. Go and Rust get static support in Phase 2. Any tree-sitter grammar can be added as an adapter.
Cursor and Copilot help you write code. CodeRabbit and Greptile help you catch bugs in code. Legible helps you read and understand code you didn’t write yourself. Same codebase; a different tool for a different job.
The parser kernel, the CLI, and every language adapter are Apache-2.0. The web UI and AI narrator are proprietary but the source is available under a Sentry-style dual licence after our public launch.
Only when you opt in. Default mode is one-shot — record a single request or test run. Sampling mode captures 1% of traffic. Never always-on.
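The two modes above reduce to a simple decision. A minimal sketch, assuming this logic rather than quoting Legible's actual implementation:

```typescript
// Assumed decision logic for the two tracing modes described above.
type TraceMode = "one-shot" | "sampling";

function shouldTrace(mode: TraceMode, alreadyRecorded: boolean): boolean {
  if (mode === "one-shot") return !alreadyRecorded; // record exactly one run
  return Math.random() < 0.01;                      // sampling: ~1% of traffic
}
```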
Self-hosted Enterprise tier ships in Q3. Same binary as our cloud; your data never leaves your VPC.
Free forever for public repos. No credit card.