Transforms your local repository into a shared project brain using recursive reasoning and local LLMs to analyze, reason, and remember project architecture.
rlm-mcp is an advanced Model Context Protocol (MCP) server that transforms your local repository into a "Shared Project Brain." It uses recursive reasoning (via RLM) and local LLMs (Ollama) to analyze, reason, and remember your project's architecture, helping teams maintain deep understanding of complex codebases like Grails Core.
On first launch, rlm-mcp provisions its own Python environment (via uv) and pulls optimized reasoning models. Project state lives in the .mcp/ or .rlm/ folders. If you have Rust installed:
```shell
cargo install --git https://github.com/borinquenkid/rlm-mcp
```
Alternatively, download the prebuilt rlm-mcp binary, move it to /usr/local/bin (or equivalent), and make it executable with chmod +x rlm-mcp.

Note: The first time you run rlm-mcp, it will automatically provision your local Python environment, pull the necessary reasoning models (approx. 5 GB), and configure the background services.
rlm-mcp keeps its state in the following directories:

- .mcp/: Configuration and the project-specific knowledge base.
  - knowledge_base/: Distilled "permanent facts" about your project (version-controlled).
  - trajectories/: Raw logs of every "thinking" session (ignored by Git).

rlm-mcp is designed to orchestrate multiple specialized AI tools. You can "inject" sub-MCP servers into the Master Brain by adding them to .mcp/rlm_config.json:
```json
{
  "sub_servers": {
    "git": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-git", "--repository", "."]
    }
  }
}
```
Once defined, rlm-mcp will automatically discover these tools, making them available to your recursive reasoning engine (e.g., mcp.git.get_diff()).
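Under the hood, MCP tool invocations are JSON-RPC 2.0 messages. As a sketch, a call to a discovered sub-server tool might look like the following (the namespaced tool name git.get_diff and its arguments are assumptions based on the example above, not a documented API):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "git.get_diff",
    "arguments": { "repository": "." }
  }
}
```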
To use rlm-mcp from your MCP client (such as Claude Desktop), add this to your MCP configuration:
```json
{
  "mcpServers": {
    "rlm-mcp": {
      "command": "/path/to/rlm-mcp"
    }
  }
}
```
rlm-mcp will then auto-provision the local Ollama backend and Python environment on its first launch.
Add this to claude_desktop_config.json and restart Claude Desktop.
```json
{
  "mcpServers": {
    "rlm-mcp": {
      "command": "npx",
      "args": []
    }
  }
}
```
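A malformed claude_desktop_config.json is a common reason a server fails to appear after a restart. Python's built-in json.tool can catch syntax errors before you restart the client; this sketch writes a sample config to /tmp purely for illustration (your real file lives in Claude Desktop's config directory):

```shell
# Write a sample config, then validate it.
# python3 -m json.tool exits non-zero on invalid JSON.
cat > /tmp/claude_desktop_config.json <<'EOF'
{
  "mcpServers": {
    "rlm-mcp": {
      "command": "/path/to/rlm-mcp"
    }
  }
}
EOF
python3 -m json.tool /tmp/claude_desktop_config.json > /dev/null && echo "config OK"
```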