MCP Conversation Server

by bsmi021
config.example.ts (2.98 kB)
import { ServerConfig } from './src/types/server.js';
import * as path from 'path';

/**
 * Example configuration for the MCP Conversation Server
 *
 * This configuration includes examples for all supported providers:
 * - OpenAI
 * - DeepSeek
 * - OpenRouter
 *
 * Storage paths can be configured in several ways:
 * 1. Use environment variable: CONVERSATIONS_PATH
 * 2. Set absolute path in config
 * 3. Set relative path (relative to project root)
 * 4. Let it default to OS-specific app data directory
 */
const config: ServerConfig = {
    providers: {
        deepseek: {
            endpoint: 'https://api.deepseek.com/v1',
            apiKey: process.env.DEEPSEEK_API_KEY || '',
            models: {
                'deepseek-chat': {
                    id: 'deepseek-chat',
                    contextWindow: 32768,
                    streaming: true
                },
                'deepseek-reasoner': {
                    id: 'deepseek-reasoner',
                    contextWindow: 64000,
                    streaming: true
                }
            },
            timeouts: {
                completion: 300000, // 5 minutes for non-streaming
                stream: 120000      // 2 minutes per stream chunk
            }
        },
        openai: {
            endpoint: 'https://api.openai.com/v1',
            apiKey: process.env.OPENAI_API_KEY || '',
            models: {
                'gpt-4': {
                    id: 'gpt-4',
                    contextWindow: 8192,
                    streaming: true
                },
                'gpt-3.5-turbo': {
                    id: 'gpt-3.5-turbo',
                    contextWindow: 4096,
                    streaming: true
                }
            },
            timeouts: {
                completion: 300000, // 5 minutes for non-streaming
                stream: 60000       // 1 minute per stream chunk
            }
        }
    },
    defaultProvider: 'deepseek',
    defaultModel: 'deepseek-chat',
    persistence: {
        type: 'filesystem' as const,
        // Use environment variable or default to d:\Projects\Conversations
        path: process.env.CONVERSATIONS_PATH || path.normalize('d:\\Projects\\Conversations')
    },
    resources: {
        maxSizeBytes: 10 * 1024 * 1024, // 10MB
        allowedTypes: ['.txt', '.md', '.json', '.csv'],
        chunkSize: 1024 // 1KB chunks
    }
};

export default config;

/**
 * Example usage:
 *
 * ```typescript
 * import { ConversationServer } from './src/index.js';
 * import config from './config.js';
 *
 * // Override storage path if needed
 * config.persistence.path = '/custom/path/to/conversations';
 *
 * const server = new ConversationServer(config);
 * server.initialize().then(() => {
 *     console.log('Server initialized, connecting...');
 *     server.connect().catch(err => console.error('Failed to connect:', err));
 * }).catch(err => console.error('Failed to initialize:', err));
 * ```
 */
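The `ServerConfig` type is defined in `./src/types/server.js` and is not reproduced on this page. Inferred from the fields used in the example above, its shape is roughly the following; this is a sketch based on the example config, not the project's actual type definitions:

```typescript
// Inferred sketch of the configuration shape used above.
// The real definitions live in ./src/types/server.js and may differ.
interface ModelConfig {
    id: string;             // model identifier sent to the provider
    contextWindow: number;  // maximum context size in tokens
    streaming: boolean;     // whether streamed responses are supported
}

interface ProviderConfig {
    endpoint: string;                     // provider base URL
    apiKey: string;                       // read from the environment in the example
    models: Record<string, ModelConfig>;  // keyed by model name
    timeouts?: {
        completion: number; // ms allowed for non-streaming completions
        stream: number;     // ms allowed per stream chunk
    };
}

interface ServerConfig {
    providers: Record<string, ProviderConfig>;
    defaultProvider: string;
    defaultModel: string;
    persistence: {
        type: 'filesystem';
        path: string;
    };
    resources?: {
        maxSizeBytes: number;
        allowedTypes: string[];
        chunkSize: number;
    };
}
```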

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bsmi021/mcp-conversation-server'
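The same request from TypeScript, as a minimal sketch (it relies on the global `fetch` available in Node.js 18+; the response schema isn't documented on this page, so the body is simply parsed and printed as JSON):

```typescript
// Fetch this server's directory entry from the Glama MCP API.
const url = 'https://glama.ai/api/mcp/v1/servers/bsmi021/mcp-conversation-server';

async function main(): Promise<void> {
    const res = await fetch(url);
    if (!res.ok) {
        throw new Error(`Request failed: ${res.status} ${res.statusText}`);
    }
    const data = await res.json();
    console.log(JSON.stringify(data, null, 2));
}

main().catch((err) => console.error(err));
```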

If you have feedback or need assistance with the MCP directory API, please join our Discord server.