An MCP (Model Context Protocol) server that exposes local Ollama instances as tools for Claude Code.
Lets Claude offload code generation, drafts, embeddings, and quick questions to your local GPUs.
Run the setup script:

```bash
bash setup.sh
```
This creates a venv, installs dependencies, generates a machine-specific config.json, and registers the MCP server with Claude Code.
**Note:** `setup.sh` uses `cygpath` and targets Windows (Git Bash / MSYS2). On Linux/macOS, replace the `cygpath -w` calls with the paths directly, or register manually:

```bash
claude mcp add ollama -s user -- /path/to/.venv/bin/python /path/to/src/ollama_mcp/server.py
```
Restart Claude Code.
| Tool | Description |
|---|---|
| `ollama_generate` | Single-turn prompt → response |
| `ollama_chat` | Multi-turn conversation |
| `ollama_embed` | Generate embedding vectors |
| `ollama_list_models` | List models on your Ollama instances |
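Under MCP, Claude Code invokes these tools with standard `tools/call` JSON-RPC requests. A sketch of what such a request might look like for `ollama_generate` — the argument names `prompt` and `model` are illustrative assumptions, not this server's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ollama_generate",
    "arguments": {
      "prompt": "Write a docstring for this function",
      "model": "llama3"
    }
  }
}
```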
Copy `config.example.json` to `config.json` and fill in your machine details, or let `setup.sh` generate it interactively.
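A minimal sketch of what `config.json` might contain, assuming one Ollama host on the default port; the field names here (`hosts`, `name`, `url`, `timeout`) are illustrative — use `config.example.json` as the authoritative template:

```json
{
  "hosts": [
    { "name": "workstation", "url": "http://127.0.0.1:11434" }
  ],
  "timeout": 120
}
```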
```bash
pip install -e ".[dev]"
pytest tests/ -v
```
Extracted from a private developer infrastructure repo and published as a standalone tool. This server runs daily as part of a multi-project AI development workflow spanning game engines, RAG pipelines, and task orchestration — see mcp-rag and orchestration-engine for projects that use it.
| Problem | Cause | Fix |
|---|---|---|
| `config.json` not found | Setup not run | Run `bash setup.sh` |
| 404 on embed calls | Ollama < 0.4.0 | Upgrade Ollama (`ollama update`) |
| `Cannot connect to...` | Ollama not running on target host | Start Ollama: `ollama serve` or check Docker |
| `Request timed out` | Large model / slow hardware | Increase `timeout` in `config.json`, or pass `timeout` parameter |
| `OFFLINE` in `list_models` | Host unreachable | Check network, firewall, Ollama port 11434 |
| `cygpath: command not found` | Running `setup.sh` on Linux/macOS | See setup note above |
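To check host reachability by hand, you can probe Ollama's `/api/tags` endpoint (the same endpoint that lists models). This is a standalone sketch, not the server's actual health-check code — the function name and status strings are illustrative:

```python
import urllib.request
import urllib.error

def probe(base_url: str, timeout: float = 2.0) -> str:
    """Return "ONLINE" if the Ollama API answers at base_url, "OFFLINE" otherwise."""
    try:
        # /api/tags is Ollama's model-listing endpoint; any HTTP response means the host is up
        urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout)
        return "ONLINE"
    except (urllib.error.URLError, OSError):
        return "OFFLINE"

# Port 1 is almost never listening, so this should report OFFLINE
print(probe("http://127.0.0.1:1"))
```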
MIT
Add this to `claude_desktop_config.json` and restart Claude Desktop.

```json
{
  "mcpServers": {
    "ollama-mcp-server": {
      "command": "npx",
      "args": []
    }
  }
}
```