
MCP Server with LLM Integration

by MelaLitho

MCP Server

A Model Context Protocol (MCP) server implementation with LLM integration and chat memory capabilities.

Features

  • MCP Server: Full Model Context Protocol server implementation

  • LLM Integration: Support for OpenAI and Anthropic models

  • Chat Memory: Persistent conversation storage and retrieval

  • Tool System: Extensible tool framework for various operations

Installation

  1. Clone this repository:

git clone <repository-url>
cd MCP

  2. Install dependencies:

pip install -r requirements.txt

Or using the development environment:

pip install -e .[dev]

Configuration

Set up your API keys as environment variables:

export OPENAI_API_KEY="your-openai-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"

Or create a .env file:

OPENAI_API_KEY=your-openai-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key
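Projects commonly load a .env file with the python-dotenv package; as a rough illustration of what that step does, here is a minimal standard-library sketch (the parsing rules shown are an assumption, not the server's actual loader):

```python
import os

def load_env_file(path: str = ".env") -> dict:
    """Parse KEY=value lines from a .env file, skipping blanks and # comments,
    and export the results into os.environ. Returns the parsed values."""
    values = {}
    try:
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                values[key.strip()] = value.strip().strip('"')
    except FileNotFoundError:
        pass  # no .env file is fine; environment variables may be set directly
    os.environ.update(values)
    return values
```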

Usage

Running the MCP Server

Start the server using the command line:

python -m mcp

Or run directly:

python mcp.py

Available Tools

The server provides the following tools:

Echo Tool

Simple echo functionality for testing.

{
  "name": "echo",
  "arguments": {
    "text": "Hello, world!"
  }
}
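A handler for a tool like this is typically a small function that receives the arguments object. The sketch below is hypothetical — the function name and signature are assumptions, not the repository's actual code:

```python
def echo(arguments: dict) -> str:
    """Return the 'text' argument unchanged; useful for verifying
    that tool calls round-trip through the server."""
    return arguments.get("text", "")
```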

Chat Memory Tools

Store Memory

{
  "name": "store_memory",
  "arguments": {
    "conversation_id": "conv_123",
    "content": "User preferences: dark mode enabled",
    "metadata": {"type": "preference"}
  }
}

Get Memory

{
  "name": "get_memory",
  "arguments": {
    "conversation_id": "conv_123"
  }
}
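An illustrative sketch of how this store/get pair might be backed by SQLite. The table layout and function signatures are inferred from the tool arguments above, not taken from the server's actual implementation:

```python
import json
import sqlite3

def connect(path: str = ":memory:") -> sqlite3.Connection:
    """Open the database and ensure a minimal memories table exists."""
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS memories ("
        " id INTEGER PRIMARY KEY,"
        " conversation_id TEXT,"
        " content TEXT,"
        " metadata TEXT)"
    )
    return db

def store_memory(db, conversation_id, content, metadata=None):
    """Persist one memory entry; metadata is stored as a JSON string."""
    db.execute(
        "INSERT INTO memories (conversation_id, content, metadata) VALUES (?, ?, ?)",
        (conversation_id, content, json.dumps(metadata or {})),
    )
    db.commit()

def get_memory(db, conversation_id):
    """Return all stored entries for a conversation, decoding metadata."""
    rows = db.execute(
        "SELECT content, metadata FROM memories WHERE conversation_id = ?",
        (conversation_id,),
    ).fetchall()
    return [{"content": c, "metadata": json.loads(m)} for c, m in rows]
```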

LLM Chat Tool

{
  "name": "llm_chat",
  "arguments": {
    "message": "What is the capital of France?",
    "model": "gpt-3.5-turbo"
  }
}

Supported Models

OpenAI Models:

  • gpt-3.5-turbo

  • gpt-4

  • gpt-4-turbo

  • gpt-4o

Anthropic Models:

  • claude-3-haiku-20240307

  • claude-3-sonnet-20240229

  • claude-3-opus-20240229
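Since llm_chat accepts models from both providers, the server has to route each model name to the right backend. The prefix-matching rule below is an assumption based on the model names listed above, not confirmed against the repository:

```python
def provider_for(model: str) -> str:
    """Map a model name to its provider based on its prefix."""
    if model.startswith("gpt-"):
        return "openai"
    if model.startswith("claude-"):
        return "anthropic"
    raise ValueError(f"Unsupported model: {model}")
```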

Development

Running Tests

pytest

Code Formatting

black .
isort .

Type Checking

mypy .

Architecture

Components

  • mcp.py: Main MCP server implementation and tool registration

  • llm_integration_system.py: LLM provider integration and chat completions

  • chat_memory_system.py: Persistent conversation storage with SQLite

Database Schema

The chat memory system uses SQLite with two main tables:

  • memories: Individual conversation messages and metadata

  • conversation_summaries: Conversation overviews and statistics
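A plausible DDL sketch for these two tables follows; the exact column names and types are assumptions inferred from the descriptions above, not the server's real schema:

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS memories (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    conversation_id TEXT NOT NULL,
    content TEXT NOT NULL,
    metadata TEXT,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE IF NOT EXISTS conversation_summaries (
    conversation_id TEXT PRIMARY KEY,
    summary TEXT,
    message_count INTEGER DEFAULT 0
);
"""

def init_db(path: str = "chat_memory.db") -> sqlite3.Connection:
    """Create the database file (if needed) and apply the schema."""
    db = sqlite3.connect(path)
    db.executescript(SCHEMA)
    return db
```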

Contributing

  1. Fork the repository

  2. Create a feature branch

  3. Make your changes

  4. Add tests for new functionality

  5. Submit a pull request

License

MIT License - see LICENSE file for details.

Troubleshooting

Common Issues

API Key Errors: Ensure your API keys are properly set in environment variables.

Database Permissions: The server creates a chat_memory.db file in the current directory. Ensure write permissions.

Port Conflicts: The MCP server uses stdio communication by default; no port configuration is needed.

Logging

Enable debug logging:

PYTHONPATH=. python -c "import logging; logging.basicConfig(level=logging.DEBUG); import mcp; mcp.main()"