Prism is a free and open-source, self-hosted web agent. It supports multiple LLM providers, GitHub integration, and secure sandboxed tool execution.
- Multi-Provider LLM Support: OpenAI, Anthropic, Google AI, OpenRouter, Groq, DeepSeek, and Ollama
- Secure by Design: AES-256 encrypted API key storage, JWT authentication, sandboxed Docker execution
- GitHub Integration: OAuth-based repository access for code operations
- File Uploads: Support for images, PDFs, and code files
- Configurable Tool Approval: Auto-execute by default, with per-tool approval settings
- Mobile Responsive: Access from any device through your hosted instance
- API Access: Generate API keys for external client access
- Modern UI: ChatGPT/Claude-like chat interface with dark/light themes
To run Prism you need:

- Docker and Docker Compose
- Go 1.20+ (for local development)
- Node.js 20+ (for local development)
- Clone the repository:

  ```bash
  git clone https://github.com/jackulau/Prism.git
  cd prism
  ```

- Copy the environment file:

  ```bash
  cp .env.example .env
  ```

- Generate an encryption key:

  ```bash
  openssl rand -hex 32
  ```

  Add this to your `.env` file as `ENCRYPTION_KEY`.

- Start the application:

  ```bash
  make prod
  ```

- Open http://localhost:3000 in your browser.
For local development with hot reload:
```bash
# Install dependencies
make setup

# Start development servers
make dev
```

Or run components separately:

```bash
# Terminal 1: Backend
make run-backend

# Terminal 2: Frontend
make run-frontend
```

Prism is configured through the following environment variables:

| Variable | Description | Default |
|---|---|---|
| `PORT` | Backend server port | `8080` |
| `DATABASE_URL` | SQLite database path | `./data/prism.db` |
| `ENCRYPTION_KEY` | 32-byte hex key for encrypting API keys | (required) |
| `JWT_SECRET` | Secret for JWT tokens | `change-me-in-production` |
| `GITHUB_CLIENT_ID` | GitHub OAuth App client ID | (optional) |
| `GITHUB_CLIENT_SECRET` | GitHub OAuth App client secret | (optional) |
| `OLLAMA_HOST` | Ollama API endpoint | `http://localhost:11434` |
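A minimal `.env` might look like the following; the variable names come from the table above and the values are placeholders:

```env
PORT=8080
DATABASE_URL=./data/prism.db
ENCRYPTION_KEY=<output of openssl rand -hex 32>
JWT_SECRET=<a long random string>
OLLAMA_HOST=http://localhost:11434
```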
Users configure their own API keys through the Settings UI. Prism supports a wide range of providers:
| Provider | Models | Tools | Vision | Best For |
|---|---|---|---|---|
| OpenAI | GPT-4.1, o3, o4-mini | Yes | Yes | General purpose, reasoning |
| Anthropic | Claude Opus/Sonnet/Haiku 4.5 | Yes | Yes | Coding, analysis |
| Google AI | Gemini 2.5 Flash/Pro | Yes | Yes | Large context (1M tokens) |
| OpenRouter | 200+ models | Varies | Varies | Access to all models |
| Groq | Llama 3.3, Mixtral | Yes | Some | Ultra-fast inference |
| DeepSeek | DeepSeek V3, R1, Coder | Yes | No | Cost-effective reasoning |
| Ollama | Any local model | Some | Some | Privacy, no API costs |
To configure a provider:

- Click the Settings icon in the sidebar
- Select a provider and enter your API key
- Choose a model from the dropdown
For detailed setup instructions and model recommendations, see Provider Documentation.
To enable GitHub integration (optional):

- Go to GitHub Settings > Developer Settings > OAuth Apps
- Create a new OAuth App
- Set the callback URL to `http://your-domain/api/v1/github/callback`
- Add the Client ID and Secret to your `.env` file, as shown below
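For example, in `.env` (placeholder values; the variable names match the configuration table above):

```env
GITHUB_CLIENT_ID=<your OAuth App client ID>
GITHUB_CLIENT_SECRET=<your OAuth App client secret>
```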
Project structure:

```
prism/
├── backend/              # Go backend (Fiber)
│   ├── cmd/server/       # Entry point
│   └── internal/
│       ├── api/          # HTTP handlers & WebSocket
│       ├── llm/          # LLM provider implementations
│       ├── tools/        # Tool execution & sandbox
│       ├── github/       # GitHub integration
│       └── security/     # Crypto, JWT, auth
├── frontend/             # React + TypeScript
│   └── src/
│       ├── components/   # UI components
│       ├── hooks/        # React hooks
│       ├── services/     # API & WebSocket clients
│       └── store/        # Zustand state
└── sandbox/              # Docker sandbox images
```
Security measures:

- API Keys: Encrypted at rest with AES-256-GCM
- Passwords: Hashed with Argon2id
- Sessions: JWT with 15-minute access tokens
- Tool Execution: Isolated Docker containers (see the sketch below) with:
  - Memory limits (512MB default)
  - CPU limits (0.5 cores default)
  - Timeout (60 seconds default)
  - Network isolation
  - Read-only root filesystem
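For illustration only, the limits above map roughly onto standard Docker flags. The sketch below uses a placeholder image name and is not the exact invocation Prism uses:

```bash
# Illustrative sketch: a rough docker run equivalent of the sandbox limits above.
# "prism-sandbox" is a placeholder image name, not necessarily what Prism builds.
docker run --rm \
  --memory=512m \
  --cpus=0.5 \
  --network=none \
  --read-only \
  prism-sandbox sh -c 'echo "tool command runs here"'
# The 60-second timeout is enforced separately by the caller, not via a docker run flag here.
```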
Connect to `/api/v1/ws?token=<access_token>` for real-time chat streaming.
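As a quick smoke test, you can connect with a generic WebSocket client such as `wscat` (this assumes the backend is reachable directly on its default port 8080; the message format itself is not shown here):

```bash
# Requires a WebSocket client, e.g. `npm install -g wscat`.
wscat -c "ws://localhost:8080/api/v1/ws?token=<access_token>"
```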
REST endpoints:

- `POST /api/v1/auth/register` - Register
- `POST /api/v1/auth/login` - Login
- `GET /api/v1/conversations` - List conversations
- `POST /api/v1/chat/completions` - Send chat message (non-streaming)
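For example, with `curl` against a locally running backend (port 8080 by default). The request bodies below are illustrative assumptions rather than the documented schema:

```bash
# Log in and obtain a token.
# The JSON field names are assumptions for illustration, not the documented schema.
curl -X POST http://localhost:8080/api/v1/auth/login \
  -H "Content-Type: application/json" \
  -d '{"email": "you@example.com", "password": "your-password"}'

# Send a non-streaming chat message with the returned access token.
# Bearer auth and the `messages` payload shape are likewise assumptions.
curl -X POST http://localhost:8080/api/v1/chat/completions \
  -H "Authorization: Bearer <access_token>" \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello from the API"}]}'
```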
Contributions are welcome! Please read our Contributing Guide first.
Apache License 2.0 - see LICENSE for details.
