# MCP Server
A Model Context Protocol (MCP) server implementation with LLM integration and chat memory capabilities.
## Features

- **MCP Server**: Full Model Context Protocol server implementation
- **LLM Integration**: Support for OpenAI and Anthropic models
- **Chat Memory**: Persistent conversation storage and retrieval
- **Tool System**: Extensible tool framework for various operations
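One way to picture the extensible tool system is as a small name-to-function registry. The sketch below is purely illustrative: the decorator name, registry layout, and dispatch helper are assumptions, not the actual API in `mcp.py`.

```python
# Hypothetical sketch of an extensible tool registry; the real
# framework in mcp.py may be structured differently.
from typing import Any, Callable, Dict

TOOLS: Dict[str, Callable[..., Any]] = {}

def tool(name: str) -> Callable:
    """Register a function as a callable tool under the given name."""
    def decorator(func: Callable) -> Callable:
        TOOLS[name] = func
        return func
    return decorator

@tool("echo")
def echo(text: str) -> str:
    # Minimal echo tool, mirroring the example under "Available Tools".
    return text

def call_tool(name: str, arguments: Dict[str, Any]) -> Any:
    """Dispatch a tool call by name with keyword arguments."""
    if name not in TOOLS:
        raise KeyError(f"Unknown tool: {name}")
    return TOOLS[name](**arguments)
```

A design like this keeps tool registration declarative, so adding a new tool is just defining a decorated function.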
## Installation

Clone this repository:

```bash
git clone <repository-url>
cd MCP
```

Install dependencies:

```bash
pip install -r requirements.txt
```

Or using the development environment:

```bash
pip install -e .[dev]
```

## Configuration
Set up your API keys as environment variables:

```bash
export OPENAI_API_KEY="your-openai-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"
```

Or create a `.env` file:

```
OPENAI_API_KEY=your-openai-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key
```

## Usage
### Running the MCP Server

Start the server from the command line:

```bash
python -m mcp
```

Or run it directly:

```bash
python mcp.py
```

### Available Tools
The server provides the following tools:
#### Echo Tool

Simple echo functionality for testing.

```json
{
  "name": "echo",
  "arguments": {
    "text": "Hello, world!"
  }
}
```

#### Chat Memory Tools
##### Store Memory

```json
{
  "name": "store_memory",
  "arguments": {
    "conversation_id": "conv_123",
    "content": "User preferences: dark mode enabled",
    "metadata": {"type": "preference"}
  }
}
```

##### Get Memory
```json
{
  "name": "get_memory",
  "arguments": {
    "conversation_id": "conv_123"
  }
}
```

#### LLM Chat Tool
```json
{
  "name": "llm_chat",
  "arguments": {
    "message": "What is the capital of France?",
    "model": "gpt-3.5-turbo"
  }
}
```

### Supported Models
**OpenAI Models:**

- gpt-3.5-turbo
- gpt-4
- gpt-4-turbo
- gpt-4o

**Anthropic Models:**

- claude-3-haiku-20240307
- claude-3-sonnet-20240229
- claude-3-opus-20240229
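Given the two model families above, provider selection for `llm_chat` can be sketched as a simple routing function. This is an assumed approach, not the actual logic in `llmintegrationsystem.py`:

```python
# Hypothetical provider routing; the real implementation in
# llmintegrationsystem.py may select providers differently.
OPENAI_MODELS = {"gpt-3.5-turbo", "gpt-4", "gpt-4-turbo", "gpt-4o"}
ANTHROPIC_PREFIX = "claude-"

def select_provider(model: str) -> str:
    """Return which provider backend should handle the given model name."""
    if model in OPENAI_MODELS:
        return "openai"
    if model.startswith(ANTHROPIC_PREFIX):
        return "anthropic"
    raise ValueError(f"Unsupported model: {model}")
```

Routing on the model name keeps the tool interface uniform: callers pass a single `model` string and the server picks the matching API client.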
## Development

### Running Tests

```bash
pytest
```

### Code Formatting

```bash
black .
isort .
```

### Type Checking

```bash
mypy .
```

## Architecture
### Components

- **`mcp.py`**: Main MCP server implementation and tool registration
- **`llmintegrationsystem.py`**: LLM provider integration and chat completions
- **`chatmemorysystem.py`**: Persistent conversation storage with SQLite
### Database Schema

The chat memory system uses SQLite with two main tables:

- **memories**: Individual conversation messages and metadata
- **conversation_summaries**: Conversation overviews and statistics
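A minimal `sqlite3` sketch of what this schema could look like is shown below. Only the two table names come from this README; the column names and types are assumptions for illustration:

```python
import sqlite3

# Hypothetical schema sketch: table names match the README, but the
# exact columns in chatmemorysystem.py are assumptions.
SCHEMA = """
CREATE TABLE IF NOT EXISTS memories (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    conversation_id TEXT NOT NULL,
    content TEXT NOT NULL,
    metadata TEXT,  -- JSON-encoded metadata
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE IF NOT EXISTS conversation_summaries (
    conversation_id TEXT PRIMARY KEY,
    summary TEXT,
    message_count INTEGER DEFAULT 0
);
"""

def init_db(path: str = "chat_memory.db") -> sqlite3.Connection:
    """Create the chat-memory tables if they do not already exist."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```

Storing metadata as a JSON-encoded text column is a common SQLite pattern when the metadata shape varies per record.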
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Submit a pull request
## License
MIT License - see LICENSE file for details.
## Troubleshooting

### Common Issues

**API Key Errors**: Ensure your API keys are properly set in environment variables.

**Database Permissions**: The server creates a `chat_memory.db` file in the current directory. Ensure write permissions.

**Port Conflicts**: The MCP server uses stdio communication by default. No port configuration is needed.
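Since the transport is stdio, requests to the server are JSON-RPC 2.0 messages following the MCP `tools/call` convention. The helper below is a sketch of that wire format, not code from this repository:

```python
import json

def build_tool_call(request_id: int, name: str, arguments: dict) -> str:
    """Serialize a JSON-RPC 2.0 tools/call request for an MCP server."""
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }
    return json.dumps(msg)
```

Piping a message like this into the server's stdin is a quick way to smoke-test a tool without a full MCP client.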
### Logging

Enable debug logging:

```bash
PYTHONPATH=. python -c "import logging; logging.basicConfig(level=logging.DEBUG); import mcp; mcp.main()"
```