MCP Chat

MCP Chat is a command-line interface application for interacting with LLMs. It supports document retrieval, command-based prompts, and extensible tool integrations via the Model Context Protocol (MCP).

Prerequisites

  • Python 3.9+
  • An API key for any Chat Completions-compatible LLM provider (e.g., Gemini)

Setup

Step 1: Configure the environment variables

  1. Create or edit the .env file in the project root and verify that the following variables are set correctly:
LLM_API_KEY="" # Enter your GEMINI API secret key LLM_CHAT_COMPLETION_URL="https://generativelanguage.googleapis.com/v1beta/openai/" LLM_MODEL="gemini-2.0-flash"

Step 2: Install dependencies

uv is a fast Python package installer and resolver.

  1. Install uv, if not already installed:
pip install uv
  2. Create and activate a virtual environment:
uv venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  3. Install the dependencies:
uv sync
  4. Start the MCP server (see the sketch after this list for how mcp_app might be defined):
uv run uvicorn mcp_server:mcp_app --reload
  5. Run the project with the ChatAgent in the CLI:
uv run main.py
  6. Optionally, start the MCP Inspector:
npx @modelcontextprotocol/inspector
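
The uvicorn command above assumes mcp_server.py exposes an ASGI app named mcp_app. A minimal sketch of how that could look using the MCP Python SDK's FastMCP (the real file may be structured differently):

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("docs")  # hypothetical server name

# In-memory document store; see "Adding New Documents" below.
docs = {
    "deposition.md": "Example deposition contents.",  # placeholder content
}

# ASGI app served by: uv run uvicorn mcp_server:mcp_app --reload
mcp_app = mcp.sse_app()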

Usage

Basic Interaction

Simply type your message and press Enter to chat with the model.

Document Retrieval

Use the @ symbol followed by a document ID to include document content in your query:

> Tell me about @deposition.md
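
Under the hood, an @ mention typically maps to an MCP resource read. A hedged sketch of how such a resource might be registered in mcp_server.py, assuming the FastMCP server and docs dictionary sketched above:

@mcp.resource("docs://documents/{doc_id}")
def read_document(doc_id: str) -> str:
    """Return the raw contents of a document so it can be injected into the chat."""
    if doc_id not in docs:
        raise ValueError(f"Unknown document: {doc_id}")
    return docs[doc_id]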

Commands

Use the / prefix to execute commands defined in the MCP server:

> /summarize deposition.md

Commands will auto-complete when you press Tab.
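
Commands are backed by MCP prompts defined on the server. A sketch of how a /summarize command could be declared, again assuming FastMCP; the actual prompt wording is up to the project:

@mcp.prompt()
def summarize(doc_id: str) -> str:
    """Prompt the model to summarize a document referenced by ID."""
    return f"Summarize the document {doc_id} in a few concise bullet points."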

Development

Adding New Documents

Edit the mcp_server.py file to add new documents to the docs dictionary.
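
For example, assuming docs is a simple ID-to-content mapping (the actual structure may differ), adding a document is just another entry:

docs = {
    "deposition.md": "Example deposition contents.",
    "report.md": "Example report contents.",  # new document: ID -> content
}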

Implementing MCP Features

To fully implement the MCP features:

  1. Complete the TODOs in mcp_server.py
  2. Implement the missing functionality in mcp_client.py (a connection sketch follows below)
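
On the client side, a minimal sketch of connecting to the running server over SSE with the MCP Python SDK, assuming the default /sse endpoint on port 8000; mcp_client.py will likely wrap something along these lines:

import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    # Connect to the server started with: uv run uvicorn mcp_server:mcp_app --reload
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()      # tools exposed by the server
            prompts = await session.list_prompts()  # prompts backing / commands
            print([t.name for t in tools.tools])
            print([p.name for p in prompts.prompts])

asyncio.run(main())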

Linting and Type Checking

No lint or type checks are currently implemented.
