Quickstart
From zero to “use brain” in under 5 minutes.
Prerequisites
You need Python 3.11+ and an MCP client (Claude Desktop, Cursor, or Windsurf). Apple Silicon is recommended for fast local embeddings.
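You can confirm your interpreter meets the version requirement before installing:

python3 --version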
Install brain-mcp
Install via pipx (recommended) or pip:
pipx install brain-mcp

Or with pip:

pip install brain-mcp
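To verify the executable is on your PATH before continuing:

which brain-mcp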
Run setup
The setup wizard discovers sources, imports conversations, creates embeddings, and configures your MCP clients — all in one command:
brain-mcp setup

This discovers conversations from Claude Code, Claude Desktop, Cursor, Windsurf, and Gemini CLI.
Run the health check
Verify everything is configured correctly:
brain-mcp doctor

✅ Python 3.12.4
✅ Dependencies installed
✅ Data directory exists
✅ Conversation sources found: 3
✅ Embedding model available (nomic-embed-text-v1.5)
✅ LanceDB initialized
✅ Ready to go!

Configure your MCP client
brain-mcp setup already configures your MCP clients automatically. To configure a specific client later:
brain-mcp setup claude     # Claude Desktop + Code
brain-mcp setup cursor     # Cursor
brain-mcp setup windsurf   # Windsurf

Manual configuration
If you prefer to configure manually, add this to your MCP config:
{
  "mcpServers": {
    "my-brain": {
      "command": "brain-mcp",
      "args": ["serve"]
    }
  }
}
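The config file itself lives in a client-specific location. As a rough guide, these are the commonly documented macOS defaults for each client (not brain-mcp specifics, so double-check against your client's documentation):

Claude Desktop: ~/Library/Application Support/Claude/claude_desktop_config.json
Cursor: ~/.cursor/mcp.json
Windsurf: ~/.codeium/windsurf/mcp_config.json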
Your first query

Open your MCP client and type:
“use brain”
Then try your first real query:
> tunnel_state("my-project")

🧠 Tunnel State: my-project
Stage: executing | Tone: focused
Open questions: 3 | Decisions: 5
❓ Top open questions:
1. Which chunking strategy for embeddings?
2. Should we add streaming support?
3. Rate limiting approach for API?
✅ Recent decisions:
• Use semantic chunking over fixed-size
• Target 768-dim embeddings (nomic)
• Keep everything local-first
📊 Last active: 2 hours ago
Conversations: 12 | Messages: 847