Self-hosted. Production-grade. TypeScript-native.
From local development to global scale. No vendor lock-in.
Run Ollama, vLLM, OpenAI, Anthropic, or Google models with identical code. Switch providers without changing a line.
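A minimal sketch of the pattern behind that claim, with hypothetical names (the interface, stubs, and `summarize` below are illustrative, not this library's actual API): application code depends on one interface, and each provider is an interchangeable implementation behind it.

```typescript
// Hypothetical sketch: one interface, many providers. The stubs below
// stand in for real SDK calls to Ollama, OpenAI, Anthropic, etc.
interface ChatProvider {
  name: string;
  complete(prompt: string): Promise<string>;
}

const ollama: ChatProvider = {
  name: "ollama",
  async complete(prompt) { return `[ollama] ${prompt}`; },
};

const anthropic: ChatProvider = {
  name: "anthropic",
  async complete(prompt) { return `[anthropic] ${prompt}`; },
};

// Application code depends only on the interface, so swapping providers
// is a configuration change, not a rewrite.
async function summarize(llm: ChatProvider, text: string): Promise<string> {
  return llm.complete(`Summarize: ${text}`);
}
```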
Build complex agent pipelines with retry, compensation, and human-in-the-loop steps.
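The retry-plus-compensation combination is the saga pattern. A self-contained sketch under assumed names (not the library's real API): each step retries on failure, and when a step exhausts its retries, the compensations of already-completed steps run in reverse order.

```typescript
// Hypothetical saga-style runner: retry each step, and on permanent
// failure, roll back completed steps in reverse via their compensations.
type Step = {
  name: string;
  run: () => Promise<void>;
  compensate?: () => Promise<void>;
};

async function withRetry(fn: () => Promise<void>, attempts = 3): Promise<void> {
  let lastErr: unknown;
  for (let i = 0; i < attempts; i++) {
    try { return await fn(); } catch (e) { lastErr = e; }
  }
  throw lastErr;
}

async function runPipeline(steps: Step[]): Promise<void> {
  const done: Step[] = [];
  for (const step of steps) {
    try {
      await withRetry(step.run);
      done.push(step);
    } catch (err) {
      // Undo completed work in reverse order, then surface the failure.
      for (const s of done.reverse()) await s.compensate?.();
      throw err;
    }
  }
}
```

A human-in-the-loop step fits the same shape: its `run` simply resolves only after an approval arrives.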
Six coordination strategies out of the box: hierarchical, consensus, round-robin, and more.
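To make one strategy concrete, here is a round-robin sketch with illustrative names (not the library's API): tasks are dealt to agents in turn, wrapping back to the first.

```typescript
// Minimal round-robin coordinator sketch: each call hands the task to
// the next agent in the rotation.
type Agent = { id: string; handle: (task: string) => string };

function roundRobin(agents: Agent[]): (task: string) => string {
  let next = 0;
  return (task) => {
    const agent = agents[next];
    next = (next + 1) % agents.length; // wrap around
    return `${agent.id}:${agent.handle(task)}`;
  };
}
```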
Redis for speed, Postgres for persistence, pgvector for semantic search. Your agents remember everything.
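The fast-cache-over-durable-store split can be sketched in a few lines; `Map`s stand in for Redis and Postgres here, and the class name is hypothetical. Reads hit the cache first, fall back to the store, and repopulate the cache; writes go through to both. (Semantic search is the pgvector tier: similarity queries run in the database itself.)

```typescript
// Sketch of a two-tier memory: a fast cache (think Redis) in front of a
// durable store (think Postgres). Maps stand in for both backends.
class TieredMemory {
  private cache = new Map<string, string>();
  private store = new Map<string, string>();

  set(key: string, value: string): void {
    this.cache.set(key, value); // fast path
    this.store.set(key, value); // durable write-through
  }

  get(key: string): string | undefined {
    if (this.cache.has(key)) return this.cache.get(key);
    const v = this.store.get(key);                // cache miss: hit the store
    if (v !== undefined) this.cache.set(key, v);  // repopulate the cache
    return v;
  }
}
```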
Run untrusted code in Docker containers or WASM. Never on your host.
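For the Docker path, the isolation comes from how the container is launched. A sketch that builds a locked-down `docker run` argv (the flags are real Docker options; the image name and helper are placeholders):

```typescript
// Sketch: assemble a locked-down `docker run` argv for untrusted code.
function sandboxArgs(image: string, script: string): string[] {
  return [
    "run", "--rm",
    "--network=none",   // no network access
    "--read-only",      // read-only root filesystem
    "--memory=256m",    // hard memory cap
    "--pids-limit=64",  // fork-bomb guard
    "--cap-drop=ALL",   // drop all Linux capabilities
    image,
    "node", "-e", script,
  ];
}
```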
First-class support for Model Context Protocol. Connect any MCP server as a tool.
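On the wire, MCP is JSON-RPC 2.0; invoking a server's tool uses the `tools/call` method from the MCP specification. A sketch of that message shape (the helper and the tool name are illustrative):

```typescript
// Sketch of the JSON-RPC 2.0 message MCP uses for a tool invocation.
// Method and params shape follow the MCP spec; the tool name is hypothetical.
function mcpToolCall(id: number, name: string, args: Record<string, unknown>) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}
```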
Express, Fastify, Hono, Koa — mount agents as REST APIs with SSE streaming, WebSocket, and auto-generated Swagger docs.
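Whatever the framework, SSE streaming bottoms out in the same wire format: a `text/event-stream` response where each event is a `data:` line terminated by a blank line. A framework-independent sketch of that chunk format (the helper name is illustrative):

```typescript
// Sketch: the Server-Sent Events wire format. Frameworks just write
// chunks like these to a text/event-stream response.
function sseChunk(event: string | undefined, data: unknown): string {
  const lines: string[] = [];
  if (event) lines.push(`event: ${event}`); // optional named event
  lines.push(`data: ${JSON.stringify(data)}`);
  return lines.join("\n") + "\n\n"; // blank line terminates the event
}
```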
OpenTelemetry traces, Prometheus metrics, cost tracking. Know exactly what your agents are doing.
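Cost tracking reduces to token counts times per-token prices. A sketch with placeholder rates (the numbers below are examples, not any provider's real pricing):

```typescript
// Sketch of per-request cost accounting from token counts.
// Prices are illustrative placeholders, in USD per 1M tokens.
const pricePerMTokens = { input: 3, output: 15 };

function requestCostUSD(inputTokens: number, outputTokens: number): number {
  return (inputTokens * pricePerMTokens.input +
          outputTokens * pricePerMTokens.output) / 1_000_000;
}
```

Summing this per request, tagged with the trace ID, is what ties spend back to individual agent runs.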