Installation
Get brain-mcp running in 5 minutes. Everything runs locally — no API keys required for core functionality.
Prerequisites
- Python 3.11+ — for the MCP server and data pipelines
- An MCP client — Claude Desktop, Cursor, Windsurf, or any MCP-compatible tool
- ~2 GB disk space — for data, vectors, and the embedding model
- Apple Silicon recommended — for fast local embeddings via MPS acceleration (works on Intel/Linux too)
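The prerequisites above can be sanity-checked with a short script. This is an illustrative sketch, not part of brain-mcp; the version and disk-space thresholds mirror the list above.

```python
import shutil
import sys

def check_prerequisites(min_python=(3, 11), min_free_gb=2):
    """Return a list of human-readable problems; empty means all good."""
    problems = []
    if sys.version_info < min_python:
        problems.append(
            f"Python {min_python[0]}.{min_python[1]}+ required, "
            f"found {sys.version_info.major}.{sys.version_info.minor}"
        )
    # Free space on the current volume, in GB
    free_gb = shutil.disk_usage(".").free / 1e9
    if free_gb < min_free_gb:
        problems.append(f"need ~{min_free_gb} GB free, found {free_gb:.1f} GB")
    return problems

for problem in check_prerequisites():
    print("WARNING:", problem)
```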
Optional: mcporter
mcporter is a CLI tool for calling MCP servers from the command line. Useful for testing but not required — brain-mcp works directly with any MCP client.
Install
Install from PyPI
pip install brain-mcp
On Apple Silicon, this automatically installs PyTorch with MPS support for fast local embeddings.
Or install from source:
git clone https://github.com/mordechaipotash/brain-mcp.git
cd brain-mcp
pip install -e .
Initialize brain-mcp
brain-mcp init
This discovers available data sources, creates the data directory structure, and generates your initial configuration. You'll be prompted to select which conversation sources to import.
Run the health check
brain-mcp doctor
Verifies all dependencies, checks data paths, and confirms the embedding model can load. Fix any issues it reports before continuing.
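As an illustration of the kind of check the doctor runs, the snippet below probes which PyTorch backend is available for local embeddings (MPS on Apple Silicon, per the prerequisites). This is a hypothetical sketch, not the actual doctor implementation.

```python
import importlib.util
import platform

def embedding_backend():
    """Report which local embedding backend is likely available."""
    if importlib.util.find_spec("torch") is None:
        return "torch not installed"
    import torch
    # MPS is the Apple Silicon GPU backend; guard for older torch builds
    if getattr(torch.backends, "mps", None) and torch.backends.mps.is_available():
        return "mps"
    if torch.cuda.is_available():
        return "cuda"
    return "cpu"

print(platform.machine(), "->", embedding_backend())
```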
Configure your MCP client
The fastest way:
brain-mcp setup claude # Auto-configure Claude Desktop
Or configure manually — see the client configuration section below.
Import Your Data
brain-mcp needs conversation data to work with. Import from any of these sources:
ChatGPT
Export your data from Settings → Data Controls → Export Data in ChatGPT. Place the exported conversations.json file in the brain-mcp data directory.
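If you want to sanity-check an export before importing it, a short script can count the conversations it contains. This assumes the usual ChatGPT export layout (a JSON array of conversation objects, each typically carrying a "title"); `count_conversations` is a hypothetical helper, not part of brain-mcp.

```python
import json
from pathlib import Path

def count_conversations(path):
    """Load a ChatGPT export and return (count, up to five titles)."""
    conversations = json.loads(Path(path).read_text(encoding="utf-8"))
    titles = [c.get("title", "(untitled)") for c in conversations[:5]]
    return len(conversations), titles

# Point this at wherever you placed the export:
# n, titles = count_conversations("conversations.json")
```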
Claude Desktop
brain-mcp reads Claude Desktop conversation logs from their default location. No export needed.
Claude Code / Clawdbot
Session transcripts are synced automatically via the sync pipeline.
After importing, generate embeddings:
python -m cli sync # Sync all conversation sources
python -m cli embed # Generate local embeddings for semantic search
Client Configuration
Claude Desktop
Add to ~/Library/Application Support/Claude/claude_desktop_config.json:
{
  "mcpServers": {
    "brain": {
      "command": "python",
      "args": ["mcp_brain_server.py"],
      "cwd": "/path/to/brain-mcp"
    }
  }
}
Cursor
Add to .cursor/mcp.json in your project root or global Cursor config:
{
  "mcpServers": {
    "brain": {
      "command": "python",
      "args": ["mcp_brain_server.py"],
      "cwd": "/path/to/brain-mcp"
    }
  }
}
Windsurf
Add to your Windsurf MCP configuration:
{
  "mcpServers": {
    "brain": {
      "command": "python",
      "args": ["mcp_brain_server.py"],
      "cwd": "/path/to/brain-mcp"
    }
  }
}
Path configuration
Replace /path/to/brain-mcp with the actual path where you cloned the repository. Use the full absolute path for reliability.
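A quick way to catch path mistakes is to load your client config and verify the `cwd` entry. This is an illustrative check (the function and the config shape follow the JSON examples above); adjust `config_path` for Cursor or Windsurf.

```python
import json
from pathlib import Path

def check_server_config(config, name="brain"):
    """Return a list of problems with an mcpServers entry; empty means OK."""
    entry = config["mcpServers"][name]
    cwd = Path(entry["cwd"])
    problems = []
    if not cwd.is_absolute():
        problems.append(f"cwd is not an absolute path: {cwd}")
    if not cwd.is_dir():
        problems.append(f"cwd does not exist: {cwd}")
    return problems

# Claude Desktop config location (from the section above):
config_path = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"
if config_path.exists():
    for problem in check_server_config(json.loads(config_path.read_text())):
        print("WARNING:", problem)
```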
Verify Installation
Open your MCP client and type “use brain”. You should see the AI call brain-mcp tools and return context from your conversation history.
You can also verify from the command line with mcporter:
# Install mcporter (optional CLI tool)
npm install -g mcporter
# Test a tool call
mcporter call brain.brain_stats
# Should return something like:
# 📊 Brain Stats
# Messages: 377,326 | Conversations: 12,847
# Embeddings: 82,000 | Summaries: 9,979
Next up
Now that you're set up, learn about the architecture or jump straight to the tools overview.