
# About MARM Systems

## Universal MCP Server for AI Memory Intelligence

MARM started from frustration with AI conversations that felt like talking to someone with severe memory loss. After analyzing hundreds of user complaints, we built something that actually works - a **Universal MCP Server** providing intelligent memory across all AI platforms.

**What makes MARM Systems different:**

- **Universal MCP Server** - Memory intelligence for Claude, Gemini, Qwen, ChatGPT, and any MCP-compatible AI
- **Production-ready architecture** - Docker deployment, 19 complete MCP tools, semantic search with AI embeddings
- **Cross-platform memory** - One memory database shared across all AI agents and platforms
- **Multi-agent development** - Built collaboratively using Claude (primary architecture), Gemini (validation), Qwen (research), and ChatGPT (testing)
- **Community-driven** - Growing ecosystem with active contributors and real-world validation
- **Enterprise-grade** - Rate limiting, health monitoring, connection pooling, security hardening

### How MARM MCP Server Works

**Universal Memory Layer:** The MCP server provides persistent, intelligent memory that works with any MCP-compatible AI client. Claude, Gemini, Qwen, and other AI agents can all contribute to and access the same evolving knowledge base.

**Semantic Intelligence:** Using AI embeddings and vector similarity search, MARM finds relevant memories by meaning, not just keywords. Ask about "authentication problems" and it surfaces related discussions about "login issues" and "user verification."

**Professional Architecture:** Built with FastAPI, SQLite optimization, Docker deployment, and enterprise-grade features for reliable production use.

### Evolution from Protocol to Platform

MARM evolved from a simple conversation protocol into a comprehensive AI memory platform through community feedback and multi-agent development collaboration.

**Current Ecosystem:**

- **Universal MCP Server** - Production-ready memory intelligence (primary focus)
- **Live Chatbot Demo** - Interactive testing environment at <https://marm-systems-chatbot.onrender.com>
- **Original Protocol** - Copy/paste instructions that work with any AI
- **Growing Community** - Active contributors, forks, and enterprise implementations

**Development Approach:** This project showcases advanced multi-agent collaboration, with Claude handling primary architecture and development, Gemini providing validation and code review, Qwen contributing research and analysis, and ChatGPT supporting testing workflows. This multi-LLM development process resulted in robust, well-validated solutions that no single AI agent could achieve alone.
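To make the "Semantic Intelligence" idea above concrete, here is a minimal sketch of embedding-based memory retrieval. It is illustrative only: the `sentence-transformers` model, the in-memory list of memories, and the `search` helper are assumptions for the example, not MARM's actual implementation (which persists memories in SQLite and exposes them through MCP tools).

```python
# Minimal sketch of embedding-based memory search (illustrative, not MARM's code).
# Assumptions: sentence-transformers is installed and the memories fit in RAM.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # hypothetical choice of embedding model

memories = [
    "Fixed the login issue by refreshing expired JWT tokens",
    "User verification emails were landing in spam",
    "Switched the chatbot deployment to Docker",
]
memory_vectors = model.encode(memories, normalize_embeddings=True)

def search(query: str, top_k: int = 2) -> list[tuple[float, str]]:
    """Return the memories most similar in meaning to the query."""
    query_vector = model.encode([query], normalize_embeddings=True)[0]
    scores = memory_vectors @ query_vector  # cosine similarity (vectors are normalized)
    best = np.argsort(scores)[::-1][:top_k]
    return [(float(scores[i]), memories[i]) for i in best]

# "authentication problems" never appears verbatim, but related memories still rank highest.
for score, text in search("authentication problems"):
    print(f"{score:.2f}  {text}")
```

The point of the sketch is that queries and stored memories are compared by vector similarity rather than keyword overlap, which is why a question about "authentication problems" can surface notes about "login issues" and "user verification."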
### Get Started

**🐳 Production MCP Server (Recommended):**

```bash
docker pull lyellr88/marm-mcp-server:latest
docker run -d --name marm-mcp-server -p 8001:8001 -v ~/.marm:/home/marm/.marm lyellr88/marm-mcp-server:latest
claude mcp add --transport http marm-memory http://localhost:8001/mcp
```

**🌐 Live Demo:** <https://marm-systems-chatbot.onrender.com>

**📋 Original Protocol:** Copy/paste instructions that work with any AI

A quick way to confirm the server is running before wiring it into an AI client is shown in the sketch at the end of this document.

### Future Platform Development

MARM's Universal MCP Server establishes the foundation for advanced AI memory ecosystems:

- **Enhanced reasoning protocols** for specialized AI workflows
- **Multi-tenant deployment** for team and enterprise environments
- **Extended semantic capabilities** with advanced embedding models
- **Community marketplace** for memory-augmented AI tools and integrations

### Get Involved

Whether you're testing MARM, suggesting improvements, or just curious about AI conversation design, you're welcome here.

**Questions?** Check the FAQ or open a GitHub issue.

**Want to contribute?** See the CONTRIBUTING guide.

**Found a bug?** Let me know and I'll fix it.

**MARM is built by the community, for the community.**

---

## 📁 Project Documentation

### **Usage Guides**

- **[MARM-HANDBOOK.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/MARM-HANDBOOK.md)** - Original MARM protocol handbook for chatbot usage
- **[MCP-HANDBOOK.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/MCP-HANDBOOK.md)** - Complete MCP server usage guide with commands, workflows, and examples
- **[PROTOCOL.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/PROTOCOL.md)** - Quick start commands and protocol reference
- **[FAQ.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/FAQ.md)** - Answers to common questions about using MARM

### **MCP Server Installation**

- **[INSTALL-DOCKER.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-DOCKER.md)** - Docker deployment (recommended)
- **[INSTALL-WINDOWS.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-WINDOWS.md)** - Windows installation guide
- **[INSTALL-LINUX.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-LINUX.md)** - Linux installation guide
- **[INSTALL-PLATFORMS.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-PLATFORMS.md)** - Platform installation guide

### **Chatbot Installation**

- **[CHATBOT-SETUP.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/CHATBOT-SETUP.md)** - Web chatbot setup guide

### **Project Information**

- **[README.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/README.md)** - This file - ecosystem overview and MCP server guide
- **[CONTRIBUTING.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/CONTRIBUTING.md)** - How to contribute to MARM
- **[DESCRIPTION.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/DESCRIPTION.md)** - Protocol purpose and vision overview
- **[CHANGELOG.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/CHANGELOG.md)** - Version history and updates
- **[ROADMAP.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/ROADMAP.md)** - Planned features and development roadmap
- **[LICENSE](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/LICENSE)** - MIT license terms
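As mentioned in the Get Started section, you can sanity-check a local deployment before adding it to an AI client. The sketch below is a minimal probe, assuming the default port mapping from the `docker run` command above; how the `/mcp` endpoint responds to a bare GET is an assumption, so the script only confirms that something is answering on that port.

```python
# Quick reachability check for a locally running MARM MCP server (illustrative sketch).
# Assumption: the container from the Get Started section is listening on port 8001.
import urllib.request
import urllib.error

URL = "http://localhost:8001/mcp"  # endpoint registered with `claude mcp add` above

try:
    with urllib.request.urlopen(URL, timeout=5) as response:
        print(f"Server is up: HTTP {response.status} from {URL}")
except urllib.error.HTTPError as exc:
    # An HTTP error status (e.g. for a bare GET) still proves the server is listening.
    print(f"Server is up: HTTP {exc.code} from {URL}")
except (urllib.error.URLError, OSError) as exc:
    print(f"Server not reachable at {URL}: {exc}")
```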
