Bridge to local Ollama LLM server. Run Llama, Mistral, Qwen and other local models through MCP.
MCP Server - Bridge to local Ollama LLM server.
Part of the HumoticaOS / SymbAIon ecosystem.
```bash
pip install mcp-server-ollama-bridge
```
Add to your claude_desktop_config.json:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "mcp-server-ollama-bridge",
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
```
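Once registered, the bridge forwards requests to Ollama's REST API. As a rough sketch (the model name and prompt below are illustrative examples, not something this package mandates), a request body for Ollama's `POST /api/generate` endpoint looks like:

```python
import json

# Hypothetical payload for Ollama's /api/generate endpoint.
# "llama3" is an example model name; it must be pulled locally first
# (e.g. `ollama pull llama3`) before Ollama can serve it.
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,  # request a single JSON response instead of a stream
}
body = json.dumps(payload)
print(body)
```

The bridge handles this serialization for you; the sketch only shows the shape of the traffic between the bridge and the `OLLAMA_HOST` endpoint.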
```bash
docker build -t mcp-server-ollama-bridge .
docker run -i -e OLLAMA_HOST=http://host.docker.internal:11434 mcp-server-ollama-bridge
```
| Variable | Default | Description |
|---|---|---|
| `OLLAMA_HOST` | `http://localhost:11434` | Ollama server URL |
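Presumably the bridge resolves `OLLAMA_HOST` against the documented default when the variable is unset; in Python that lookup is simply:

```python
import os

# Fall back to the documented default when OLLAMA_HOST is unset.
host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
print(host)
```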
MIT
One Love, One fAmIly!
This package is officially distributed via:
Note: Third-party directories may list this package but are not official or verified distribution channels for Humotica software.