
MCP Server with LLM Integration

by MelaLitho

MCP Server

A Model Context Protocol (MCP) server implementation with LLM integration and chat memory capabilities.

Features

  • MCP Server: Full Model Context Protocol server implementation
  • LLM Integration: Support for OpenAI and Anthropic models
  • Chat Memory: Persistent conversation storage and retrieval
  • Tool System: Extensible tool framework for various operations

Installation

  1. Clone this repository:

```bash
git clone <repository-url>
cd MCP
```

  2. Install dependencies:

```bash
pip install -r requirements.txt
```

Or using the development environment:

```bash
pip install -e .[dev]
```

Configuration

Set up your API keys as environment variables:

```bash
export OPENAI_API_KEY="your-openai-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"
```

Or create a .env file:

```
OPENAI_API_KEY=your-openai-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key
```
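Whichever method you use, it helps to fail fast if a key is missing. The `require_api_key` helper below is a hypothetical sketch (not part of this repository) showing one way to validate the environment at startup:

```python
import os

def require_api_key(name: str) -> str:
    """Return the named API key, or raise a clear error at startup."""
    value = os.environ.get(name, "").strip()
    if not value:
        raise RuntimeError(f"Missing environment variable: {name}")
    return value
```

Calling this once for each provider you intend to use surfaces configuration problems before the server starts handling requests.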

Usage

Running the MCP Server

Start the server using the command line:

```bash
python -m mcp
```

Or run directly:

```bash
python mcp.py
```

Available Tools

The server provides the following tools:

Echo Tool

Simple echo functionality for testing.

```json
{
  "name": "echo",
  "arguments": {
    "text": "Hello, world!"
  }
}
```

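On the server side, an echo handler reduces to returning the `text` argument unchanged. The function below is an illustrative sketch, not the actual handler in `mcp.py`:

```python
def handle_echo(arguments: dict) -> str:
    """Return the 'text' argument unchanged; reject calls that omit it."""
    if "text" not in arguments:
        raise ValueError("echo requires a 'text' argument")
    return arguments["text"]
```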
Chat Memory Tools

Store Memory

```json
{
  "name": "store_memory",
  "arguments": {
    "conversation_id": "conv_123",
    "content": "User preferences: dark mode enabled",
    "metadata": {"type": "preference"}
  }
}
```

Get Memory

```json
{
  "name": "get_memory",
  "arguments": {
    "conversation_id": "conv_123"
  }
}
```

LLM Chat Tool
```json
{
  "name": "llm_chat",
  "arguments": {
    "message": "What is the capital of France?",
    "model": "gpt-3.5-turbo"
  }
}
```

Supported Models

OpenAI Models:

  • gpt-3.5-turbo
  • gpt-4
  • gpt-4-turbo
  • gpt-4o

Anthropic Models:

  • claude-3-haiku-20240307
  • claude-3-sonnet-20240229
  • claude-3-opus-20240229
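Because `llm_chat` accepts models from both providers, the server has to route each request to the right backend. A minimal sketch of such routing, keyed on the model-name prefixes above (the actual dispatch logic in the LLM integration module may differ):

```python
def resolve_provider(model: str) -> str:
    """Map a model name to its provider based on its prefix."""
    if model.startswith("gpt-"):
        return "openai"
    if model.startswith("claude-"):
        return "anthropic"
    raise ValueError(f"Unsupported model: {model}")
```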

Development

Running Tests

```bash
pytest
```

Code Formatting

```bash
black .
isort .
```

Type Checking

```bash
mypy .
```

Architecture

Components

  • mcp.py: Main MCP server implementation and tool registration
  • llm_integration_system.py: LLM provider integration and chat completions
  • chat_memory_system.py: Persistent conversation storage with SQLite

Database Schema

The chat memory system uses SQLite with two main tables:

  • memories: Individual conversation messages and metadata
  • conversation_summaries: Conversation overviews and statistics

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Submit a pull request

License

MIT License - see LICENSE file for details.

Troubleshooting

Common Issues

**API Key Errors**: Ensure your API keys are properly set as environment variables.

**Database Permissions**: The server creates a chat_memory.db file in the current directory. Ensure the process has write permission there.

**Port Conflicts**: The MCP server communicates over stdio by default, so no port configuration is needed.

Logging

Enable debug logging:

```bash
PYTHONPATH=. python -c "import logging; logging.basicConfig(level=logging.DEBUG); import mcp; mcp.main()"
```
