Open SystemDecoder on a larger screen to build systems, run simulations, and inject chaos.
What's waiting for you on desktop
Live Simulations
Watch latency spike, queues fill, and nodes fail in real time. Every slider change is instant.
Visual Architecture Canvas
Drag nodes, draw edges, and build any distributed system topology from scratch.
Chaos Engineering
Kill servers, introduce packet loss, throttle CPUs — and watch your system react.
Real-time Insights
Throughput, p99 latency, error rates — all charted live as your simulation runs.
40+
Concepts
<1s
Feedback
∞
Replays
"The best way to understand a distributed system is to break it."
YouTube
🎬
2.7B users. 500 hours uploaded per minute.
Upload → Process → Stream to everyone, everywhere
Users upload 500 hours of video every minute. Each video must be watchable on 2G mobile and gigabit fibre, on phones and 8K TVs, by viewers in Mumbai and São Paulo. Handling this with a traditional API-and-database architecture fundamentally breaks down.
Request Flow
Upload problem
A 4K video is 5–30 GB. If every byte goes through your API server, you need thousands of cores just doing I/O — no business logic, just forwarding data.
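The standard way to take the API server out of the byte path (an assumption here, since the text names no specific mechanism) is a pre-signed upload URL: the API server signs a short-lived URL and the client uploads directly to object storage. A minimal sketch with a hypothetical signing key and storage host:

```python
import hashlib
import hmac
import time

SECRET_KEY = b"demo-signing-key"  # hypothetical; real systems use cloud credentials
STORAGE_HOST = "https://storage.example.com"  # hypothetical object-storage endpoint


def presign_upload_url(video_id: str, ttl_seconds: int = 900) -> str:
    """Return a short-lived URL the client can PUT the raw video to directly.

    The API server only signs the URL and never touches the video bytes:
    all 5-30 GB flow from the client straight to object storage.
    """
    expires = int(time.time()) + ttl_seconds
    payload = f"PUT:/uploads/{video_id}:{expires}".encode()
    signature = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return f"{STORAGE_HOST}/uploads/{video_id}?expires={expires}&sig={signature}"


def verify_upload_url(url: str) -> bool:
    """Storage-side check: recompute the HMAC and confirm the URL is unexpired."""
    path_part, _, query = url.partition("?")
    video_id = path_part.rsplit("/", 1)[-1]
    params = dict(kv.split("=") for kv in query.split("&"))
    if int(params["expires"]) < time.time():
        return False
    payload = f"PUT:/uploads/{video_id}:{params['expires']}".encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, params["sig"])
```

The API server's CPU cost per upload drops to one HMAC computation, regardless of file size.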
Processing problem
A raw uploaded video can't be played on most devices. It must be transcoded into multiple resolutions and formats — a process that takes minutes to hours per video.
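Because transcoding takes minutes to hours, it has to run asynchronously: the upload path returns immediately and workers grind through jobs from a queue. A sketch of that fan-out, with a hypothetical resolution ladder (real services tune the ladder per source video):

```python
from dataclasses import dataclass
from queue import Queue

# Hypothetical encoding ladder; one job per target resolution.
LADDER = ["240p", "480p", "720p", "1080p", "4k"]


@dataclass(frozen=True)
class TranscodeJob:
    video_id: str
    resolution: str


def enqueue_transcodes(video_id: str, jobs: Queue) -> int:
    """Fan one finished upload out into independent transcode jobs.

    Each (video, resolution) pair can run on a different worker,
    so a 5-way ladder finishes roughly as fast as one transcode.
    """
    for resolution in LADDER:
        jobs.put(TranscodeJob(video_id, resolution))
    return len(LADDER)


def worker_step(jobs: Queue, done: list) -> None:
    """One worker iteration: transcode a single (video, resolution) pair."""
    job = jobs.get()
    # Real transcoding (e.g. an ffmpeg invocation) would happen here.
    done.append(f"{job.video_id}/{job.resolution}.mp4")
```

The queue also absorbs upload spikes: bursts pile up as pending jobs instead of overloading the transcoding fleet.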
Delivery problem
Serving a 4K video from a single region in us-east-1 adds 150–300ms latency for viewers in Asia. Without CDN edge caching, every request hits the origin — latency and cost skyrocket.
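Edge caching works at the granularity of video segments, not whole files, so even a partially watched video keeps its hot segments near viewers. A minimal illustrative LRU edge cache, keyed by (video, rendition, segment):

```python
from collections import OrderedDict


class EdgeCache:
    """A minimal LRU cache for video segments at a CDN edge (illustrative only).

    A hit is served locally; a miss pays the cross-region round trip
    to the origin before the segment is cached for later viewers.
    """

    def __init__(self, capacity: int, origin_fetch):
        self.capacity = capacity
        self.origin_fetch = origin_fetch  # callable standing in for the origin
        self.store: OrderedDict = OrderedDict()
        self.hits = 0
        self.misses = 0

    def get_segment(self, video_id: str, rendition: str, index: int) -> bytes:
        key = (video_id, rendition, index)
        if key in self.store:
            self.hits += 1
            self.store.move_to_end(key)  # mark as recently used
            return self.store[key]
        self.misses += 1
        data = self.origin_fetch(key)  # slow cross-region fetch
        self.store[key] = data
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        return data
```

For a popular video, only the first viewer in a region pays origin latency; everyone after is served from the edge.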
Why We Need This
YouTube solves upload, processing, and delivery as three independent systems — not one monolith.
Key Insight
Video systems are fundamentally different from request/response systems. Data flows one way and is enormous. You need dedicated upload offloading, async processing, and segmented CDN delivery.
Overview