Self-hosted control plane for AI autonomous agents
Isolated Linux workspaces and git-backed Library configuration
Vision · Features · Ecosystem · Screenshots · Getting Started
Ready to deploy? Ask your local AI agent to check out INSTALL.md; it'll guide you through everything.
What if you could:
- **Hand off entire dev cycles.** Point an agent at a GitHub issue, let it write code, test by launching a Minecraft server, and open a PR when tests pass. You review the diff, not the process.
- **Run multi-day operations unattended.** Give an agent SSH access to your home GPU through a VPN. It reads Nvidia docs, sets up training, and fine-tunes models while you sleep.
- **Keep sensitive data local.** Analyze your sequenced DNA against scientific literature. Local inference, isolated containers, nothing leaves your machines.
- Mission Control: Start, stop, and monitor agents remotely with real-time streaming
- Isolated Workspaces: Containerized Linux environments (systemd-nspawn) with per-mission directories
- Git-backed Library: Skills, tools, rules, agents, and MCPs versioned in a single repo
- MCP Registry: Global MCP servers running on the host, available to all workspaces
- Multi-platform: Web dashboard (Next.js) and iOS app (SwiftUI) with Picture-in-Picture
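To make the Git-backed Library concrete, here is a sketch of what "everything versioned in a single repo" could look like. The directory names and the example skill file are assumptions for illustration only, not Open Agent's actual schema:

```shell
# Hypothetical Library repo layout — names are illustrative, not the real schema.
mkdir -p library/skills library/tools library/rules library/agents library/mcps
git -C library init -q

# An example skill, versioned alongside rules, agents, and MCP definitions:
cat > library/skills/minecraft-server.md <<'EOF'
# Skill: launch a Minecraft server for integration testing (illustrative)
EOF

git -C library add -A
git -C library -c user.email=you@example.com -c user.name=you commit -qm "seed library"
git -C library log --oneline
```

Because the whole Library is one repo, every change to a skill, rule, or MCP definition gets a commit, a diff, and a revert path for free.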
Open Agent is a control plane for OpenCode, the open-source AI coding agent. It delegates all model inference and autonomous execution to OpenCode while handling orchestration, workspace isolation, and configuration management.
Works great with oh-my-opencode for enhanced agent capabilities and prebuilt skill packs.
Real-time monitoring with CPU, memory, network graphs and mission timeline
Git-backed Library with skills, commands, rules, and inline editing
MCP server management with runtime status and Library integration
- Get a dedicated server (e.g., Hetzner, Ubuntu 24.04)
- Point a domain to your server IP
- Clone this repo locally
- Ask Claude to set it up:
"Please deploy Open Agent on my server at
<IP>with domainagent.example.com"
That's it. Claude will handle nginx, SSL, systemd services, and everything else.
If you feel smarter than the AI, check out INSTALL.md.
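For orientation, the reverse proxy piece of that setup might look roughly like this. This is a sketch only: the domain, the dashboard port (3001, from the quick start below), and the certificate paths are all assumptions, and real-time streaming is assumed to need WebSocket upgrades:

```nginx
# Hypothetical nginx reverse proxy — domain, port, and cert paths are assumptions.
server {
    listen 443 ssl;
    server_name agent.example.com;

    ssl_certificate     /etc/letsencrypt/live/agent.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/agent.example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:3001;
        proxy_set_header Host $host;
        # Real-time mission streaming is assumed to use WebSockets:
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```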
Backend:

```shell
export OPENCODE_BASE_URL="http://127.0.0.1:4096"
cargo run
```

Dashboard:

```shell
cd dashboard
bun install
bun dev
```

Open http://localhost:3001
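Before running the steps above, a quick sanity check that the toolchain is present can save a failed first run. The tool list is an assumption inferred from the commands above (git for cloning, cargo for the backend, bun for the dashboard):

```shell
# Check for the tools the quick start assumes — plain POSIX shell, no bashisms.
for tool in git cargo bun; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok: $tool"
  else
    echo "missing: $tool"
  fi
done
```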
Work in Progress — This project is under active development. Contributions and feedback welcome.
MIT



