# rlm-mcp: The Recursive Project Brain

rlm-mcp is an advanced Model Context Protocol (MCP) server that transforms your local repository into a "Shared Project Brain." It uses recursive reasoning (via RLM) and local LLMs (via Ollama) to analyze, reason about, and remember your project's architecture, helping teams maintain a deep understanding of complex codebases such as Grails Core.
## Features

- **Frictionless Auto-Pilot**: Automatically installs Ollama, manages Python environments (`uv`), and pulls optimized reasoning models.
- **Persistent Knowledge**: Distills recursive reasoning into portable YAML knowledge bases that can be checked into Git.
- **Workspace-Aware**: Automatically discovers project configurations in `.mcp/` or `.rlm/` folders.
- **Recursive Reasoning**: Uses multi-step LLM loops to trace complex class hierarchies and architectural patterns.
## Quick Start

Ensure you have Ollama installed and running. Then clone the repository and run:

```shell
cargo run
```

The server will automatically:

1. Detect your OS and hardware.
2. Provision a hermetic Python environment using `uv`.
3. Pull the recommended `deepseek-r1:7b` model.
4. Launch the MCP server on stdio.
## Project Structure

- `.mcp/`: Configuration and project-specific knowledge base.
  - `knowledge_base/`: Distilled "permanent facts" about your project (version-controlled).
  - `trajectories/`: Raw logs of every "thinking" session (ignored by Git).
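Since the distilled knowledge base is plain YAML, entries diff and review like any other source file. A hypothetical entry might look like the following (the file name and field names here are illustrative assumptions, not the server's actual schema):

```yaml
# .mcp/knowledge_base/architecture.yaml
# Hypothetical example -- field names are illustrative, not rlm-mcp's real schema.
facts:
  - id: controller-dispatch
    summary: "HTTP requests are routed to controllers via UrlMappings"
    evidence:
      - "traced in reasoning session 2024-01-12"
    confidence: high
```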
## Documentation & Roadmap

See `PLAN.md` for the full "Project Brain" architecture roadmap.
## License

This project is licensed under the MIT License - see the LICENSE file for details.