A C++ AI agent orchestrator that runs anywhere as a native build — Linux, macOS, Windows, iOS, and Android — with zero external dependencies.
The only one that runs on mobile: a true personal assistant.
Privacy and security by design — it runs on your smartphone.
Multi-agent · Real-time task board · Web control panel · Skills system · Browser automation
Multi-provider · Scheduler · Subagents · Memory · File management · MCP Server
One command to start. Everything from the browser. No coding required.
IonClaw is an AI agent orchestrator built from the ground up in C++. One codebase, compiled natively for each platform — no runtime, no interpreter, no container required.
On a server (Linux, macOS, Windows), it starts with one command and serves a full web panel. On iOS and Android, the app embeds the same C++ engine and runs everything locally on your smartphone. It is the only AI agent orchestrator that runs on mobile — a true personal assistant, with privacy and security by design, because it runs on your device. Same codebase, same capabilities, everywhere.
Because native means fast startup, low memory, no dependencies, and true portability. The entire platform — web panel, project templates, built-in skills — is compiled into the binary. You deploy one file and it just works.
- Multi-agent — run multiple agents with independent models, tools, and workspaces
- Real-time task board — track every agent task live, with full history and status
- Web control panel — configure agents, providers, credentials, and skills from the browser
- Skills system — extend agent capabilities with simple Markdown files
- Browser automation — agents can navigate, click, type, screenshot, and extract data from web pages
- Multi-provider — Anthropic, OpenAI, Gemini, Grok, OpenRouter, DeepSeek, Kimi, and any OpenAI-compatible endpoint
- Scheduler — cron expressions, intervals, and one-shot tasks with full board tracking
- Subagents — agents can spawn child agents for parallel work
- Memory — persistent memory with search-based recall across sessions
- File management — read, write, search, and organize files within sandboxed workspaces
- MCP Server — expose agents via the Model Context Protocol for use with Claude Code, Cursor, GitHub Copilot, and other MCP clients
- MCP Client — connect to external MCP servers to use their tools and resources
- Secure — sandboxed workspaces, JWT auth, tool policy per agent, hook system for custom rules
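
To make the skills idea above concrete, a skill could be a plain Markdown file along these lines. The frontmatter fields and layout here are illustrative assumptions, not IonClaw's documented schema — see the Skills docs for the actual format:

```markdown
---
name: summarize-news
description: Fetch a web page and produce a three-bullet summary
---

# Summarize News

When the user asks for a news summary:

1. Use browser automation to open the requested URL.
2. Extract the main article text from the page.
3. Reply with exactly three bullet points, each under 20 words.
```

Because skills are just Markdown, they can be versioned, reviewed, and shared like any other text file, with no plugin API or recompilation involved.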
See the screenshots below for IonClaw running on several platforms.
Requirements: CMake 3.20+, C++17 compiler, Node.js 18+ (for web client).
```shell
git clone https://github.com/ionclaw-org/ionclaw.git
cd ionclaw
make setup-web
make build-web
make build
make install
```

Then initialize and start a project:
```shell
ionclaw-server init /path/to/your/project
ionclaw-server start --project /path/to/your/project
```

Open http://localhost:8080 in your browser. The web panel is served automatically.
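
As a rough sketch of what `init` might produce, a project configuration could look like the following. The key names below are assumptions for illustration only — consult the full config.yml reference for the real schema:

```yaml
# Illustrative sketch only — field names are assumptions, not the documented schema.
server:
  port: 8080

providers:
  anthropic:
    api_key: ${ANTHROPIC_API_KEY}   # read from the environment, not stored in the file

agents:
  - name: assistant
    provider: anthropic
    workspace: ./workspace          # sandboxed file access is limited to this directory
```

Editing a file like this by hand is optional — the same settings can be managed from the web control panel.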
- About IonClaw — Overview, advantages, and audience
- How It Works — Complete execution flow from message to response
- Architecture — System design and components
- Installation — Homebrew, build from source, Docker, and cloud deploy
- Build — Build system and Makefile targets
- Configuration — Full config.yml reference
- Custom Providers — Ollama, LM Studio, MiniMax, and other OpenAI-compatible providers
- Flutter — Flutter app, release builds, and signing
- Skills — Creating and managing skills
- Tools — Built-in tools reference
- MCP — MCP Server and Client (Model Context Protocol)
- Image Generation — Provider-specific image generation and editing
- Docker — Docker build, run, and compose
- Deploy — One-click deploy to cloud platforms
IonClaw has a community token on Solana via pump.fun:
CA: H88xMt2eK9TXB8cgA9ZCX7j4oMehbnGATFaHxNdHpump
MIT — see LICENSE for details.
- GitHub · Issues · Discussions
Made by Paulo Coutinho