# MCP Conversation Server
A Model Context Protocol (MCP) server implementation for managing conversations with OpenRouter's language models. This server provides a standardized interface for applications to interact with various language models through a unified conversation management system.
## Features
- **MCP Protocol Support**
  - Full MCP protocol compliance
  - Resource management and discovery
  - Tool-based interaction model
  - Streaming response support
  - Error handling and recovery
- **OpenRouter Integration**
  - Support for all OpenRouter models
  - Real-time streaming responses
  - Automatic token counting
  - Model context window management
  - Available models include:
    - Claude 3 Opus
    - Claude 3 Sonnet
    - Llama 2 70B
    - And many more from OpenRouter's catalog
- **Conversation Management**
  - Create and manage multiple conversations
  - Support for system messages
  - Message history tracking
  - Token usage monitoring
  - Conversation filtering and search
- **Streaming Support**
  - Real-time message streaming
  - Chunked response handling
  - Token counting
- **File System Persistence**
  - Conversation state persistence
  - Configurable storage location
  - Automatic state management
## Installation
```bash
npm install mcp-conversation-server
```
## Configuration
All configuration for the MCP Conversation Server is provided via YAML. Update the `config/models.yaml` file with your settings, for example:
```yaml
# MCP Server Configuration
openRouter:
  apiKey: "YOUR_OPENROUTER_API_KEY" # Replace with your actual OpenRouter API key.

persistence:
  path: "./conversations" # Directory for storing conversation data.

models:
  # Define your models here
  'provider/model-name':
    id: 'provider/model-name'
    contextWindow: 123456
    streaming: true
    temperature: 0.7
    description: 'Model description'

# Default model to use if none specified
defaultModel: 'provider/model-name'
```
### Server Configuration
The MCP Conversation Server loads all of its configuration from the YAML file. In your application, load it as follows:
```typescript
const config = await loadModelsConfig(); // Loads openRouter, persistence, models, and defaultModel settings from 'config/models.yaml'
```
*Note: Environment variables are no longer required as all configuration is provided via the YAML file.*
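Once the YAML is parsed, the fields above map naturally onto a typed configuration object. The following is a minimal sketch of what validation might look like; the `ServerConfig` interface and `validateConfig` helper are illustrative, not part of the package's API:

```typescript
// Hypothetical shape of the parsed models.yaml — field names mirror the example above.
interface ServerConfig {
  openRouter: { apiKey: string };
  persistence: { path: string };
  models: Record<string, {
    id: string;
    contextWindow: number;
    streaming: boolean;
    temperature: number;
    description: string;
  }>;
  defaultModel: string;
}

// Minimal validation: fail fast on missing keys before the server starts.
function validateConfig(raw: any): ServerConfig {
  if (!raw?.openRouter?.apiKey) throw new Error('openRouter.apiKey is required');
  if (!raw?.persistence?.path) throw new Error('persistence.path is required');
  if (!raw?.models || Object.keys(raw.models).length === 0) {
    throw new Error('at least one model must be defined');
  }
  if (!raw.defaultModel || !raw.models[raw.defaultModel]) {
    throw new Error('defaultModel must reference a defined model');
  }
  return raw as ServerConfig;
}
```

Validating at startup turns a missing API key or a typo in `defaultModel` into an immediate, descriptive error instead of a failure on the first request.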
## Usage
### Basic Server Setup
```typescript
import { ConversationServer } from 'mcp-conversation-server';

const config = await loadModelsConfig(); // Loaded from 'config/models.yaml' as described above
const server = new ConversationServer(config);
server.run().catch(console.error);
```
### Available Tools
The server exposes several MCP tools:
1. **create-conversation**
   ```typescript
   {
     provider: 'openrouter'; // Provider is always 'openrouter'
     model: string;          // OpenRouter model ID (e.g., 'anthropic/claude-3-opus-20240229')
     title?: string;         // Optional conversation title
   }
   ```
2. **send-message**
   ```typescript
   {
     conversationId: string; // Conversation ID
     content: string;        // Message content
     stream?: boolean;       // Enable streaming responses
   }
   ```
3. **list-conversations**
   ```typescript
   {
     filter?: {
       model?: string;     // Filter by model
       startDate?: string; // Filter by start date
       endDate?: string;   // Filter by end date
     };
   }
   ```
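Because MCP tool handlers receive untyped JSON arguments, it is good practice to narrow them at runtime before use. Below is a hypothetical type guard for the `send-message` arguments above; the `isSendMessageArgs` helper is illustrative and not part of the server's API:

```typescript
// Argument shape for the send-message tool, matching the schema above.
interface SendMessageArgs {
  conversationId: string;
  content: string;
  stream?: boolean;
}

// Type guard: narrows an unknown payload to SendMessageArgs,
// rejecting missing or wrongly-typed fields.
function isSendMessageArgs(args: unknown): args is SendMessageArgs {
  const a = args as Record<string, unknown>;
  return (
    typeof a === 'object' && a !== null &&
    typeof a.conversationId === 'string' &&
    typeof a.content === 'string' &&
    (a.stream === undefined || typeof a.stream === 'boolean')
  );
}
```

A handler can then reject malformed calls with a clear protocol error instead of failing deeper in the conversation logic.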
### Resources
The server provides access to several resources:
1. **conversation://{id}**
   - Access specific conversation details
   - View message history
   - Check conversation metadata
2. **conversation://list**
   - List all active conversations
   - Filter conversations by criteria
   - Sort by recent activity
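A resource handler needs to distinguish the fixed `conversation://list` URI from `conversation://{id}`. One illustrative way to parse incoming URIs (the helper below is a sketch, not part of the server's API):

```typescript
// Extract the conversation id from a conversation:// URI.
// Returns null for the list URI or any malformed input.
function parseConversationUri(uri: string): string | null {
  const match = uri.match(/^conversation:\/\/(.+)$/);
  if (!match || match[1] === 'list') return null;
  return match[1];
}
```

A handler would route `null` results to the list/error path and everything else to the per-conversation lookup.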
## Development
### Building
```bash
npm run build
```
### Running Tests
```bash
npm test
```
### Debugging
The server provides several debugging features:
1. **Error Logging**
   - All errors are logged with stack traces
   - Token usage tracking
   - Rate limit monitoring
2. **MCP Inspector**
   ```bash
   npm run inspector
   ```
   Use the MCP Inspector to:
   - Test tool execution
   - View resource contents
   - Monitor message flow
   - Validate protocol compliance
3. **Provider Validation**
   ```typescript
   await server.providerManager.validateProviders();
   ```
   Validates:
   - API key validity
   - Model availability
   - Rate limit status
### Troubleshooting
Common issues and solutions:
1. **OpenRouter Connection Issues**
   - Verify your API key is valid
   - Check rate limits on [OpenRouter's dashboard](https://openrouter.ai/dashboard)
   - Ensure the model ID is correct
   - Monitor credit usage
2. **Message Streaming Errors**
   - Verify that the model supports streaming
   - Check connection stability
   - Monitor token limits
   - Handle timeout settings
3. **File System Errors**
   - Check directory permissions
   - Verify the path configuration
   - Monitor disk space
   - Handle concurrent access
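For streaming timeouts in particular, a common pattern is to race each chunk against a timer so that a stalled connection surfaces as an error instead of hanging the consumer. The sketch below is a generic wrapper under that assumption, not part of this package:

```typescript
// Race each streamed chunk against a timer; a stalled stream throws
// instead of blocking the consumer indefinitely.
async function* withChunkTimeout<T>(
  stream: AsyncIterable<T>,
  timeoutMs: number,
): AsyncGenerator<T> {
  const iterator = stream[Symbol.asyncIterator]();
  while (true) {
    let timer: ReturnType<typeof setTimeout> | undefined;
    const timeout = new Promise<never>((_, reject) => {
      timer = setTimeout(
        () => reject(new Error(`no chunk received within ${timeoutMs}ms`)),
        timeoutMs,
      );
    });
    // Whichever settles first wins; the timer is cleared once a chunk arrives.
    const result = await Promise.race([iterator.next(), timeout]);
    clearTimeout(timer);
    if (result.done) return;
    yield result.value;
  }
}
```

Wrapping a streamed response this way turns a silent stall into a catchable error, which the caller can handle with a retry or a user-facing message.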
## Contributing
1. Fork the repository
2. Create a feature branch
3. Commit your changes
4. Push to the branch
5. Create a Pull Request
## License
ISC License