
🧠 MCP - Model Context Protocol

Complete project for a conversational application with:

  • ✅ FastAPI + JWT
  • ✅ Local LLM via Ollama (e.g., Mistral)
  • ✅ Context with vector memory (ChromaDB)
  • ✅ Support for multiple users and sessions
  • ✅ Automatic summary of long history
  • ✅ Plugin system to perform real actions
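The automatic-summary feature above can be sketched as follows. This is a minimal illustration, not the project's actual implementation: the `SUMMARY_TRIGGER` value mirrors the `.env` setting, while the `summarize` callable and the number of recent turns kept are assumptions.

```python
SUMMARY_TRIGGER = 20  # from .env: summarize once history reaches this many messages

def maybe_summarize(history: list[str], summarize) -> list[str]:
    """Collapse old messages into a single summary entry when history grows long.

    `summarize` is a placeholder for the LLM-backed summarizer; keeping the
    5 most recent turns verbatim is an illustrative choice, not the project's.
    """
    if len(history) < SUMMARY_TRIGGER:
        return history
    old, recent = history[:-5], history[-5:]
    return ["[summary] " + summarize(old)] + recent
```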

🚀 How to run

1. Clone the project and create the environment

git clone <repo>
cd mcp
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

2. Configure .env

MODEL_NAME=mistral
VECTOR_DB_PATH=./chroma
DB_PATH=./mcp.db
CONTEXT_LIMIT=5
SUMMARY_TRIGGER=20
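A minimal sketch of how these settings might be read at startup, assuming a hand-rolled `.env` parser; the project may well use `python-dotenv` or Pydantic settings instead.

```python
import os

def load_env(path: str = ".env") -> dict:
    """Parse simple KEY=VALUE lines from a .env file, skipping blanks and comments."""
    settings = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            settings[key.strip()] = value.strip()
    return settings

# Typical usage: merge into the process environment without clobbering overrides.
# for key, value in load_env().items():
#     os.environ.setdefault(key, value)
```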

3. Launch Ollama

ollama run mistral

4. Start the server

chmod +x start.sh
./start.sh

🛡️ Authentication

  • POST /auth/register: Create new user
  • POST /auth/login: Returns JWT token

Use the JWT token in requests to /mcp/chat.

🤖 Plugins

To call a plugin:

{
  "session_id": "sessao01",
  "prompt": "plugin: {\"name\": \"list_files\", \"args\": {\"path\": \"/etc\"}}"
}
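The inner plugin command is JSON embedded as text inside the `prompt` field, which is why its quotes appear escaped. A small sketch of building such a body programmatically (helper names are illustrative, not part of the project):

```python
import json

def plugin_prompt(name: str, args: dict) -> str:
    """Build the 'plugin: {...}' prompt string; the inner JSON is serialized
    into the prompt text, so it ends up escaped inside the outer body."""
    return "plugin: " + json.dumps({"name": name, "args": args})

def chat_body(session_id: str, name: str, args: dict) -> str:
    """Full JSON body for POST /mcp/chat invoking a plugin."""
    return json.dumps({"session_id": session_id, "prompt": plugin_prompt(name, args)})
```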

📁 Structure

app/
├── routes/    # API routes
├── services/  # Business logic (MCP, plugins, memory)
├── db/        # Persistence (SQLite and vector store)
├── models/    # Pydantic schemas
├── plugins/   # Plugins executable by the MCP
└── auth/      # Login, JWT, users

📬 Contact

Developed by [Everson 🧠].


local-only server

The server can only run on the client's local machine because it depends on local resources.

A conversational application server that integrates LLM capabilities via Ollama with vector memory context, supporting multiple users, sessions, automatic history summarization, and a plugin system for executing real actions.

  1. 🚀 How to run
    1. Clone the project and create the environment
    2. Configure .env
    3. Launch Ollama
    4. Start the server
  2. 🛡️ Authentication
  3. 🤖 Plugins
  4. 📁 Structure
  5. 📬 Contact

