
Quick Start

This guide gets you from zero to a working AI agent in under 5 minutes. Make sure you have Synapse installed first.

Step 1 — Configure an LLM provider

Open the Synapse UI at http://localhost:3000 and navigate to Settings → LLM.

Using Ollama (no API key required)

  1. Install Ollama if you haven't already
  2. Pull a model: ollama pull mistral
  3. In Synapse Settings, set Mode to local and Default model to mistral
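Before pointing Synapse at Ollama, you can sanity-check that Ollama is running and the model is pulled. A minimal sketch using Ollama's model-listing endpoint (`/api/tags` on the default port 11434); the live call is left commented so the snippet is harmless offline:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/tags"  # Ollama's model-listing endpoint

def installed_models(raw_json: str) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in json.loads(raw_json).get("models", [])]

# Offline demonstration with a canned response shaped like Ollama's:
sample = '{"models": [{"name": "mistral:latest"}]}'
print(installed_models(sample))  # ['mistral:latest']

# Against a live Ollama instance you would do:
# with urllib.request.urlopen(OLLAMA_URL) as resp:
#     print(installed_models(resp.read().decode()))
```

If `mistral:latest` appears in the list, the local mode settings above will resolve the model.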

Using Claude or GPT-4

  1. Set Mode to cloud
  2. Enter your API key (Anthropic, OpenAI, Gemini, Grok, or DeepSeek)
  3. Set Default model — e.g. claude-3-5-sonnet-20241022 or gpt-4o
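If you script your setup, the same settings can be expressed as a small config object. The field names below mirror the Settings → LLM form but are assumptions — the API Reference documents the real shape. Reading the key from an environment variable keeps it out of your config files:

```python
import os

# Field names mirror the Settings -> LLM form; the exact config shape
# is an assumption -- check Synapse's API Reference for the real one.
settings = {
    "mode": "cloud",
    "default_model": "claude-3-5-sonnet-20241022",
    # Read the key from the environment rather than hard-coding it.
    "api_key": os.environ.get("ANTHROPIC_API_KEY", ""),
}
print(settings["mode"], settings["default_model"])
```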

Step 2 — Create an agent

  1. Click New Agent on the canvas page (or the + button in the sidebar)
  2. Fill in:
    • Name — e.g. Research Agent
    • Type — conversational
    • Tools — select the Web Scraper or Browser Automation tool (to start)
    • System prompt — e.g. You are a helpful research assistant. When given a topic, search the web and provide a concise summary with key facts.
  3. Click Save
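The form above can also be expressed as a JSON payload, which is roughly what you would send to Synapse's REST API. The endpoint and field names here are assumptions for illustration — see the API Reference for the real ones:

```python
import json

# Hypothetical payload mirroring the New Agent form fields; the exact
# field names are assumptions -- check Synapse's API Reference.
agent = {
    "name": "Research Agent",
    "type": "conversational",
    "tools": ["web_scraper"],
    "system_prompt": (
        "You are a helpful research assistant. When given a topic, "
        "search the web and provide a concise summary with key facts."
    ),
}
payload = json.dumps(agent, indent=2)
print(payload)

# Against a running Synapse instance you might POST it (endpoint assumed):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:3000/api/agents",
#     data=payload.encode(),
#     headers={"Content-Type": "application/json"},
# )
# urllib.request.urlopen(req)
```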

Step 3 — Chat with your agent

  1. Select your agent from the navbar
  2. Type a message: What is the Model Context Protocol and how does it work?
  3. Watch the agent reason through the problem, call tools if needed, and return a response
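The same conversation can be driven programmatically. A sketch of the request body you might POST to a chat endpoint — the endpoint path, agent identifier, and field name are all assumptions, not Synapse's documented API:

```python
import json

# Hypothetical chat request body; field name is an assumption.
message = {"message": "What is the Model Context Protocol and how does it work?"}
body = json.dumps(message)
print(body)

# Against a running instance you might POST (endpoint and agent id assumed):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:3000/api/agents/research-agent/chat",
#     data=body.encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```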

What's next?

  • Agents — Learn about agent types, tools, memory, and system prompts
  • Orchestrations — Build multi-step deterministic workflows
  • Tools — Explore the 10 native tools available to agents
  • MCP Servers — Connect GitHub, Jira, Zapier, and more
  • API Reference — Full REST API documentation
  • Docker — Run Synapse in a container