# MCP Server
A Model Context Protocol (MCP) server implementation with LLM integration and chat memory capabilities.
## Features
- **MCP Server**: Full Model Context Protocol server implementation
- **LLM Integration**: Support for OpenAI and Anthropic models
- **Chat Memory**: Persistent conversation storage and retrieval
- **Tool System**: Extensible tool framework for various operations
## Installation
1. Clone this repository:
```bash
git clone <repository-url>
cd MCP
```
2. Install dependencies:
```bash
pip install -r requirements.txt
```
Or install in editable mode with development dependencies:
```bash
pip install -e ".[dev]"
```
## Configuration
Set up your API keys as environment variables:
```bash
export OPENAI_API_KEY="your-openai-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"
```
Or create a `.env` file:
```env
OPENAI_API_KEY=your-openai-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key
```
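If you prefer not to add a dependency such as `python-dotenv`, a `.env` file in this format can be loaded with a few lines of standard-library Python. This is a minimal sketch, assuming simple `KEY=value` lines with `#` comments; dedicated libraries handle quoting and interpolation that this does not:

```python
import os

def load_dotenv(path=".env"):
    """Minimal .env loader: KEY=value lines, '#' comments skipped.
    Existing environment variables are not overwritten."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```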
## Usage
### Running the MCP Server
Start the server using the command line:
```bash
python -m mcp
```
Or run directly:
```bash
python mcp.py
```
### Available Tools
The server provides the following tools:
#### Echo Tool
Simple echo functionality for testing.
```json
{
  "name": "echo",
  "arguments": {
    "text": "Hello, world!"
  }
}
```
#### Chat Memory Tools
**Store Memory**
```json
{
  "name": "store_memory",
  "arguments": {
    "conversation_id": "conv_123",
    "content": "User preferences: dark mode enabled",
    "metadata": {"type": "preference"}
  }
}
```
**Get Memory**
```json
{
  "name": "get_memory",
  "arguments": {
    "conversation_id": "conv_123"
  }
}
```
#### LLM Chat Tool
```json
{
  "name": "llm_chat",
  "arguments": {
    "message": "What is the capital of France?",
    "model": "gpt-3.5-turbo"
  }
}
```
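On the wire, tool invocations like the ones above travel as JSON-RPC 2.0 requests using MCP's `tools/call` method. A sketch of building such a request (the envelope shown follows the MCP specification; the helper itself is illustrative, not part of this server's code):

```python
import json

def make_tool_call(request_id, name, arguments):
    """Wrap a tool invocation in a JSON-RPC 2.0 request for MCP's
    tools/call method. Illustrative helper, not part of this project."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Build the llm_chat request shown above
request = make_tool_call(1, "llm_chat", {
    "message": "What is the capital of France?",
    "model": "gpt-3.5-turbo",
})
```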
### Supported Models
**OpenAI Models:**
- gpt-3.5-turbo
- gpt-4
- gpt-4-turbo
- gpt-4o
**Anthropic Models:**
- claude-3-haiku-20240307
- claude-3-sonnet-20240229
- claude-3-opus-20240229
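The server needs to route each model name to the right provider. One simple approach is a prefix check, sketched below; the prefix rule and function name are illustrative assumptions, not the server's actual dispatch logic:

```python
def provider_for(model: str) -> str:
    """Map a model name to its provider.
    Prefix-based routing is an illustrative assumption."""
    if model.startswith("gpt-"):
        return "openai"
    if model.startswith("claude-"):
        return "anthropic"
    raise ValueError(f"Unsupported model: {model}")
```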
## Development
### Running Tests
```bash
pytest
```
### Code Formatting
```bash
black .
isort .
```
### Type Checking
```bash
mypy .
```
## Architecture
### Components
- **mcp.py**: Main MCP server implementation and tool registration
- **llmintegrationsystem.py**: LLM provider integration and chat completions
- **chatmemorysystem.py**: Persistent conversation storage with SQLite
### Database Schema
The chat memory system uses SQLite with two main tables:
- `memories`: Individual conversation messages and metadata
- `conversation_summaries`: Conversation overviews and statistics
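The two tables can be sketched with `sqlite3` from the standard library. The column names and types below are assumptions based on the description above, not the project's actual DDL:

```python
import sqlite3

# Illustrative schema for the two tables described above;
# column names are assumptions, not the project's actual DDL.
SCHEMA = """
CREATE TABLE IF NOT EXISTS memories (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    conversation_id TEXT NOT NULL,
    content TEXT NOT NULL,
    metadata TEXT,  -- JSON-encoded metadata
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE IF NOT EXISTS conversation_summaries (
    conversation_id TEXT PRIMARY KEY,
    summary TEXT,
    message_count INTEGER DEFAULT 0
);
"""

conn = sqlite3.connect(":memory:")  # the server uses chat_memory.db on disk
conn.executescript(SCHEMA)
conn.execute(
    "INSERT INTO memories (conversation_id, content, metadata) VALUES (?, ?, ?)",
    ("conv_123", "User preferences: dark mode enabled", '{"type": "preference"}'),
)
```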
## Contributing
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Submit a pull request
## License
MIT License - see LICENSE file for details.
## Troubleshooting
### Common Issues
**API Key Errors**
Ensure your API keys are properly set in environment variables.
**Database Permissions**
The server creates a `chat_memory.db` file in the current directory. Ensure write permissions.
**Port Conflicts**
The MCP server communicates over stdio by default, so no port configuration is needed.
### Logging
Enable debug logging:
```bash
PYTHONPATH=. python -c "import logging; logging.basicConfig(level=logging.DEBUG); import mcp; mcp.main()"
```