chat_completion
Generate an AI chat response from the DeepSeek API for a single user message or a full conversation, with configurable model, temperature, token limit, and sampling parameters. Requests use the deepseek-reasoner model by default and fall back to deepseek-chat if the reasoner call fails.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| message | No | Single user message; the handler wraps it in a one-element `user` message list. At least one of `message` or `messages` must be provided. | |
| messages | No | Conversation as an array of `{ role, content }` objects, where `role` is `system`, `user`, or `assistant`. At least one of `message` or `messages` must be provided. | |
| model | No | Model used for the completion. | `deepseek-reasoner` |
| temperature | No | Sampling temperature, between 0 and 2. | `0.7` |
| max_tokens | No | Maximum number of tokens to generate (positive integer). | `8000` |
| top_p | No | Nucleus sampling parameter, between 0 and 1. | `1.0` |
| frequency_penalty | No | Frequency penalty, between -2 and 2. | `0.1` |
| presence_penalty | No | Presence penalty, between -2 and 2. | `0` |
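Two argument shapes the schema accepts, written as TypeScript object literals for illustration (the message text and the non-default parameter values below are made up). Omitted generation parameters fall back to the defaults in the table, and the handler requires at least one of `message` or `messages`.

```typescript
// Single-turn call: the handler wraps `message` in a one-element user message list.
const singleTurnArgs = {
  message: "Summarize the attached design notes in three bullet points."
};

// Multi-turn call: pass the conversation explicitly via `messages`.
const multiTurnArgs = {
  messages: [
    { role: "system", content: "You are a concise technical assistant." },
    { role: "user", content: "Explain what a Zod schema default does." }
  ],
  model: "deepseek-reasoner",
  temperature: 0.3,
  max_tokens: 2000
};
```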
Implementation Reference
- src/index.ts:199-260 (handler): Handler function that builds the message list from the arguments, calls the DeepSeek chat completions API, falls back to the deepseek-chat model if the deepseek-reasoner request fails, and returns the response as text content. The `ChatMessage`, `DeepSeekResponse`, `API_CONFIG`, and `axiosInstance` identifiers are defined elsewhere in the file; a sketch of their likely shapes follows this list.

  ```typescript
  async (args) => {
    let messages: ChatMessage[];
    if (args.message) {
      messages = [{ role: 'user', content: args.message }];
    } else if (args.messages) {
      messages = args.messages;
    } else {
      throw new Error("Either 'message' or 'messages' must be provided");
    }

    try {
      const response = await this.axiosInstance.post<DeepSeekResponse>(
        API_CONFIG.ENDPOINTS.CHAT,
        {
          messages,
          model: args.model,
          temperature: args.temperature,
          max_tokens: args.max_tokens,
          top_p: args.top_p,
          frequency_penalty: args.frequency_penalty,
          presence_penalty: args.presence_penalty
        }
      );

      return {
        content: [{ type: "text", text: response.data.choices[0].message.content }]
      };
    } catch (error) {
      console.error("Error with deepseek-reasoner, falling back to deepseek-chat");
      try {
        const fallbackResponse = await this.axiosInstance.post<DeepSeekResponse>(
          API_CONFIG.ENDPOINTS.CHAT,
          {
            messages,
            model: 'deepseek-chat',
            temperature: args.temperature,
            max_tokens: args.max_tokens,
            top_p: args.top_p,
            frequency_penalty: args.frequency_penalty,
            presence_penalty: args.presence_penalty
          }
        );

        return {
          content: [{
            type: "text",
            text: "Note: Fallback to deepseek-chat due to reasoner error.\n\n" +
              fallbackResponse.data.choices[0].message.content
          }]
        };
      } catch (fallbackError) {
        if (axios.isAxiosError(fallbackError)) {
          throw new Error(`DeepSeek API error: ${fallbackError.response?.data?.error?.message ?? fallbackError.message}`);
        }
        throw fallbackError;
      }
    }
  }
  ```
- src/index.ts:187-198 (schema): Zod input schema defining the parameters of the chat_completion tool: the message fields, model selection, and generation parameters with their defaults and allowed ranges (default handling is illustrated after this list).

  ```typescript
  message: z.string().optional(),
  messages: z.array(z.object({
    role: z.enum(['system', 'user', 'assistant']),
    content: z.string()
  })).optional(),
  model: z.string().default('deepseek-reasoner'),
  temperature: z.number().min(0).max(2).default(0.7),
  max_tokens: z.number().positive().int().default(8000),
  top_p: z.number().min(0).max(1).default(1.0),
  frequency_penalty: z.number().min(-2).max(2).default(0.1),
  presence_penalty: z.number().min(-2).max(2).default(0)
  ```
- src/index.ts:184-261 (registration): Registers the chat_completion tool on the McpServer instance in the setupTools method. The call wraps the input schema (src/index.ts:187-198) and the handler (src/index.ts:199-260) quoted above:

  ```typescript
  this.server.tool(
    "chat_completion",
    {
      // input schema shown above (src/index.ts:187-198)
    },
    async (args) => {
      // handler shown above (src/index.ts:199-260)
    }
  );
  ```
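The handler quoted above depends on a few identifiers defined elsewhere in src/index.ts and not included in the excerpt: `ChatMessage`, `DeepSeekResponse`, `API_CONFIG`, and `this.axiosInstance`. The sketch below shows plausible shapes inferred only from how the handler uses them; the names are real, but every field and value not visible in the excerpt (including the endpoint path) is an assumption rather than a copy of the actual definitions.

```typescript
// Inferred, not copied from src/index.ts: only the fields the handler touches
// are represented, and the endpoint path is an assumption.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface DeepSeekResponse {
  choices: Array<{
    message: { content: string };
  }>;
}

const API_CONFIG = {
  ENDPOINTS: {
    CHAT: '/chat/completions' // assumed path; the real constant lives elsewhere in the file
  }
} as const;
```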
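Every generation parameter in the schema carries a Zod default, so callers may omit them and the handler still receives fully populated arguments; only the "either `message` or `messages`" rule is enforced in the handler rather than the schema. The standalone snippet below illustrates that default and range behavior with two fields copied from the schema; it is an illustration, not code from the server.

```typescript
import { z } from 'zod';

// Two fields from the schema above, wrapped in a standalone object purely to
// show how Zod fills in defaults and rejects out-of-range values.
const sample = z.object({
  model: z.string().default('deepseek-reasoner'),
  temperature: z.number().min(0).max(2).default(0.7)
});

console.log(sample.parse({}));                   // { model: 'deepseek-reasoner', temperature: 0.7 }
console.log(sample.parse({ temperature: 1.2 })); // { model: 'deepseek-reasoner', temperature: 1.2 }
sample.parse({ temperature: 3 });                // throws a ZodError: temperature exceeds max(2)
```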
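Once registered, chat_completion can be invoked from any MCP client. The sketch below uses the TypeScript MCP SDK over stdio under a few assumptions: the launch command, client name and version, and message text are placeholders, and the exact transport setup depends on how this server is actually started.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Placeholder launch command; adjust to however the DeepSeek server is started.
const transport = new StdioClientTransport({
  command: "node",
  args: ["build/index.js"]
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Invoke the chat_completion tool registered above.
const result = await client.callTool({
  name: "chat_completion",
  arguments: {
    message: "Hello from an MCP client.",
    temperature: 0.5
  }
});

console.log(result.content);
```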