# Installation

Synapse AI requires Python 3.11+ and Node.js 20.9+ (22 recommended). Everything else, including the uv package manager, is installed automatically by the setup script.
## One-command install

### macOS / Linux

```bash
curl -sSL https://raw.githubusercontent.com/naveenraj-17/synapse-ai/main/setup.sh | bash
```

### Windows (PowerShell)

```powershell
irm https://raw.githubusercontent.com/naveenraj-17/synapse-ai/main/setup.ps1 | iex
```
The script checks for Python 3.11+ and Node.js 20.9+, and installs uv automatically if it is missing.
After install, open http://localhost:3000 to access the UI.
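If you want to confirm the prerequisites before piping the script into your shell, a minimal pre-flight check might look like the sketch below. The `version_ge` helper is our own illustration (it relies on `sort -V`), not part of Synapse AI:

```bash
#!/usr/bin/env bash
# Pre-flight check sketch: verify Python and Node meet the documented minimums.
# version_ge succeeds when $1 >= $2, using version-aware sort.
version_ge() {
  [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

py="$(python3 -V 2>&1 | awk '{print $2}')"
node="$(node -v 2>/dev/null | tr -d v)"

version_ge "${py:-0}" 3.11  || echo "Python 3.11+ required (found: ${py:-none})"
version_ge "${node:-0}" 20.9 || echo "Node.js 20.9+ required (found: ${node:-none})"
```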
## npm

```bash
npm install -g synapse-orch-ai
synapse
```

## pip

```bash
pip install synapse-orch-ai
synapse
```
## Docker

No Python or Node.js required on the host. Ideal for teams deploying on shared infrastructure or servers.

```bash
docker run -d \
  -p 3000:3000 \
  -v synapse-data:/data \
  -v /var/run/docker.sock:/var/run/docker.sock \
  synapseorchai/synapse-ai:latest
```
Then open http://localhost:3000. Pass your LLM API keys and any config as environment variables:
```bash
docker run -d \
  -p 3000:3000 \
  -v synapse-data:/data \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -e ANTHROPIC_API_KEY=sk-ant-... \
  -e OPENAI_API_KEY=sk-... \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --add-host=host.docker.internal:host-gateway \
  synapseorchai/synapse-ai:latest
```

The `--add-host` flag maps `host.docker.internal` to Docker's special `host-gateway` value so the container can reach an Ollama server running on the host.
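If you prefer Docker Compose, the same container can be described declaratively. This is a sketch assuming the same image, volume, and port values as the `docker run` example above; adjust the keys and API key values to your setup:

```yaml
# docker-compose.yml sketch (not shipped with the project)
services:
  synapse:
    image: synapseorchai/synapse-ai:latest
    ports:
      - "3000:3000"
    volumes:
      - synapse-data:/data
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      ANTHROPIC_API_KEY: sk-ant-...
      OPENAI_API_KEY: sk-...
    extra_hosts:
      - "host.docker.internal:host-gateway"

volumes:
  synapse-data:
```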
### Custom ports

Override the default ports (frontend 3000, backend 8000) with environment variables:
```bash
docker run -d \
  -p 8080:8080 \
  -p 9000:9000 \
  -e SYNAPSE_FRONTEND_PORT=8080 \
  -e SYNAPSE_BACKEND_PORT=9000 \
  -v synapse-data:/data \
  -v /var/run/docker.sock:/var/run/docker.sock \
  synapseorchai/synapse-ai:latest
```
The container side of each `-p HOST:CONTAINER` mapping must match the corresponding `SYNAPSE_*_PORT` value.
## Manual install
If you prefer to set things up yourself:
1. Clone the repository

   ```bash
   git clone https://github.com/naveenraj-17/synapse-ai.git
   cd synapse-ai
   ```

2. Backend setup

   ```bash
   cd backend
   python3.11 -m venv venv
   source venv/bin/activate   # Windows: venv\Scripts\activate
   pip install -r requirements.txt
   python3.11 main.py         # Starts on port 8000 (or $SYNAPSE_BACKEND_PORT)
   ```

3. Frontend setup

   Open a second terminal:

   ```bash
   cd frontend
   npm install
   npm run dev                # Starts on port 3000 (or $SYNAPSE_FRONTEND_PORT)
   ```
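For local development it can be handy to start both servers with a single script. This is our own convenience sketch, not something shipped with the repository; it assumes the backend and frontend setup steps above have already been completed and is meant to be run from the repository root:

```bash
#!/usr/bin/env bash
# Run backend and frontend together; Ctrl-C stops both.
set -e
(cd backend && source venv/bin/activate && python3.11 main.py) &
backend_pid=$!
(cd frontend && npm run dev) &
frontend_pid=$!
trap 'kill "$backend_pid" "$frontend_pid" 2>/dev/null' EXIT
wait
```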
## System requirements
| Requirement | Minimum | Recommended |
|---|---|---|
| Python | 3.11 | 3.12 |
| Node.js | 20.9 | 22 |
| RAM | 2 GB | 8 GB |
| Disk | 1 GB | 5 GB (+ model size for Ollama) |
| Docker | Optional | Required for Python sandbox tool |
## Optional dependencies

| Feature | Dependency | Install |
|---|---|---|
| Python sandbox tool | Docker | [docker.com](https://docker.com) |
| Local LLM | Ollama | [ollama.com](https://ollama.com) |
| Code embeddings | PostgreSQL + pgvector | See Code Repos |
| Browser automation | Playwright | `npx @playwright/mcp install-browser chrome-for-testing` |
## Upgrading

| Install method | Upgrade command |
|---|---|
| Bash / PowerShell installer (recommended) | `synapse upgrade` |
| pip | `pip install --upgrade synapse-orch-ai` |
| npm | `npm update -g synapse-orch-ai` |
| Docker | `docker pull synapseorchai/synapse-ai:latest` |
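Note that `docker pull` only fetches the new image; a running container keeps using the old one until it is recreated. A typical upgrade sequence looks like this sketch, where the `synapse` container name is our assumption (reuse whatever name and flags you originally started the container with):

```bash
# Fetch the new image, then replace the running container with it.
docker pull synapseorchai/synapse-ai:latest
docker stop synapse && docker rm synapse
docker run -d --name synapse \
  -p 3000:3000 \
  -v synapse-data:/data \
  -v /var/run/docker.sock:/var/run/docker.sock \
  synapseorchai/synapse-ai:latest
```

Because application state lives in the named `synapse-data` volume, recreating the container preserves your data.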
## Uninstalling

```bash
synapse uninstall
```

Stops all services and removes `~/.synapse` (your data directory). The repo folder is left intact.
## Verifying the install

```bash
synapse status
```

Expected output:

```text
Backend  ✓ running on http://localhost:8000
Frontend ✓ running on http://localhost:3000
```
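In scripted environments (CI, provisioning), the same check can be done with plain HTTP probes instead of the CLI. This sketch assumes the default ports and that both servers respond on their root path:

```bash
# curl -sf exits non-zero on connection failure or HTTP error status.
curl -sf http://localhost:8000/ >/dev/null && echo "backend ok"
curl -sf http://localhost:3000/ >/dev/null && echo "frontend ok"
```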
Head to Quick Start to create your first agent and send a message in under 5 minutes.