How does Mako orchestrate a real-time data pipeline?
Mako is an open-source Go framework that describes real-time data pipelines as declarative YAML files; no orchestration code is required. A pipeline has three sections: sources (Kafka, change data capture on Postgres or MySQL, HTTP endpoints), transforms (WASM modules compiled from Rust, Go, or Python), and sinks (Kafka, S3, a relational database, or a webhook).

At runtime, Mako assembles these stages into an in-memory directed acyclic graph (DAG) and applies automatic backpressure between them: if a sink falls behind, the slowdown propagates upstream to the sources, so memory use stays bounded instead of queues growing without limit.

For observability, Mako exposes 14 standard Prometheus metrics (throughput, p50/p95/p99 latency, queue sizes, and per-stage error counts) and ships with a reference Grafana dashboard. The binary is under 30 MB and starts in under 200 ms.
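To make the three-section layout concrete, here is a minimal sketch of what such a pipeline file could look like. The exact keys (`sources`, `transforms`, `sinks`, `type`, `module`, and so on) are illustrative assumptions, not Mako's documented schema; only the overall sources/transforms/sinks structure comes from the description above.

```yaml
# Hypothetical Mako pipeline: Kafka -> WASM transform -> S3.
# All field names are illustrative assumptions, not a documented schema.
sources:
  orders:
    type: kafka
    brokers: ["kafka-1:9092", "kafka-2:9092"]
    topic: orders.raw

transforms:
  enrich:
    input: orders
    # A WASM module compiled from Rust, Go, or Python.
    module: ./build/enrich.wasm

sinks:
  archive:
    input: enrich
    type: s3
    bucket: orders-archive
    prefix: enriched/
```

Because each stage names its input, the framework can derive the DAG directly from the file, and the backpressure described above applies along those edges.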
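Since the metrics are plain Prometheus metrics, collecting them needs only an ordinary scrape job. The target port and path below are assumptions (the description does not say where Mako serves its metrics endpoint); adjust them to your deployment.

```yaml
# Prometheus scrape job for a Mako instance.
# Port 2112 and the /metrics path are assumptions; check your deployment.
scrape_configs:
  - job_name: mako
    metrics_path: /metrics
    static_configs:
      - targets: ["mako-host:2112"]
```

With this in place, the reference Grafana dashboard can be pointed at the Prometheus instance to chart throughput, latency percentiles, queue sizes, and per-stage errors.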