agent-intelligence.ai
Build and deploy agents in minutes.
Developer-native. Memory-backed. Agent-ready.
[>] terminal-native F01 CLI is the product, not a wrapper around a UI
[>] zero to running F02 ai init → ai serve, nothing else required
[>] fast by default F03 Go binary, <200ms cold start, <25MB
[>] local first F08 Cypherlite embedded graph — no cloud account needed to start
[>] protocol-native F09 MCP + A2A endpoints live from day one, not bolted on
[>] composable F10 <20 tools per agent — build the UNIX way
[>] own your prompts F05 agent.toml is plain text — no hidden platform injections
[>] own your context F06 Explicit token budgets, no invisible spills
[>] stateless reducers F07 Pure functions — pause, resume, replay are first-class
[>] progressive exposure F04 Zero flags on day one, full power when you need it
[>] observable F11 --debug shows everything; OTel traces built in
[>] accessible F12 --plain for pipelines, CI, and screen readers
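The "stateless reducers" card (F07) is worth unpacking: when every state transition is a pure function of (state, event), replay is just a fold over the event log, and pause/resume fall out for free. A minimal sketch in Go — the type and function names here are illustrative, not the actual agent-intelligence.ai API:

```go
package main

import "fmt"

// Event is one step in an agent run; State is the accumulated result.
// These names are hypothetical, for illustration only.
type Event struct {
	Kind string // e.g. "tool_call", "pause", "resume"
}

type State struct {
	Steps  int
	Paused bool
	Log    []string
}

// Reduce is a pure function: the same state and event always yield the
// same next state, with no I/O and no mutation of the input.
func Reduce(s State, e Event) State {
	next := s
	next.Log = append(append([]string{}, s.Log...), e.Kind) // copy, never mutate
	switch e.Kind {
	case "pause":
		next.Paused = true
	case "resume":
		next.Paused = false
	default:
		next.Steps++
	}
	return next
}

// Replay rebuilds state from scratch by folding the event log,
// which is exactly why pause, resume, and replay are first-class.
func Replay(events []Event) State {
	s := State{}
	for _, e := range events {
		s = Reduce(s, e)
	}
	return s
}

func main() {
	events := []Event{{Kind: "tool_call"}, {Kind: "pause"}, {Kind: "resume"}, {Kind: "tool_call"}}
	s := Replay(events)
	fmt.Println(s.Steps, s.Paused) // 2 false
}
```

Because Reduce touches no I/O or global state, pausing is just recording the log so far, and resuming is replaying it.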
[>] how it compares
Unlike Python-based agent frameworks, agent-intelligence.ai compiles to a single Go binary that installs in seconds — no pip, no virtualenv, no Docker image required. Agents are fully described in agent.toml: plain text, version-controllable, and readable without executing code. MCP and A2A protocols are built in from day one, so Claude Code, Cursor, and any other MCP-compatible client can connect directly to your agent without additional configuration.
| Feature | agent-intelligence.ai | LangGraph | CrewAI | ADK | Agents SDK |
|---|---|---|---|---|---|
| Language | Go | Python | Python | Python | Python |
| Protocols | MCP + A2A | none native | none native | MCP | MCP |
| Memory / Graph | built-in | plugin | plugin | plugin | none |
| Install | single binary | pip | pip | pip | pip |
| Config | agent.toml | code | code | code | code |
| Local-first | yes | no | no | no | no |
| Binary size | <25 MB | 50+ MB | 50+ MB | 50+ MB | 50+ MB |
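To make the "config, not code" row concrete, a complete agent definition might look like the sketch below. The [agent] and [agent.model] tables mirror the quickstart that follows; the [[tools]] entries are hypothetical, shown only to illustrate the plain-text, under-20-tools approach:

```toml
[agent]
name = "research-agent"

[agent.model]
provider = "anthropic"
model = "claude-sonnet-4-6"
api_key = "${ANTHROPIC_API_KEY}"

# Hypothetical tool entries — field names are illustrative only.
[[tools]]
name = "web_search"

[[tools]]
name = "summarise"
```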
curl -fsSL https://agent-intelligence.ai/install.sh | sh
[>] quickstart — zero to running agent in 5 minutes
1. install the binary
curl -fsSL https://agent-intelligence.ai/install.sh | sh

2. scaffold a new agent
ai init my-agent && cd my-agent

3. configure agent.toml — set your model and API key
[agent]
name = "my-agent"

[agent.model]
provider = "anthropic"
model = "claude-sonnet-4-6"
api_key = "${ANTHROPIC_API_KEY}"

4. start the agent server
ai serve
5. run your first task
ai run "summarise recent AI research papers"
→ for persistent memory: ai graph connect $GRAPH_URI
user@ai $ ai init my-research-agent
✓ scaffolded agent: my-research-agent/