multi_turn_chat
Enable multi-turn conversations with DeepSeek language models through structured message exchanges, supporting custom parameters for tailored responses.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| messages | Yes | A plain string (treated as a single user message) or an array of messages, each with a role (`system`, `user`, or `assistant`) and text content. | |
| model | No | DeepSeek model identifier. | deepseek-chat |
| temperature | No | Sampling temperature (0 to 2). | 0.7 |
| max_tokens | No | Maximum number of tokens to generate (positive integer). | 8000 |
| top_p | No | Nucleus sampling probability mass (0 to 1). | 1.0 |
| frequency_penalty | No | Frequency penalty (-2 to 2). | 0.1 |
| presence_penalty | No | Presence penalty (-2 to 2). | 0 |
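For illustration, the arguments object below satisfies the schema above; the message text and the non-default `max_tokens` value are invented. Note that `messages` may also be passed as a bare string, which the schema normalizes into a single user message.

```typescript
// Hypothetical multi_turn_chat arguments; defaults taken from the Zod schema
// listed under Implementation Reference below.
const args = {
  messages: [
    {
      role: "user",
      content: { type: "text", text: "Give me a two-sentence summary of the MCP protocol." }
    }
  ],
  model: "deepseek-chat", // default
  temperature: 0.7,       // default, allowed range 0-2
  max_tokens: 1024        // default is 8000
};
```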
Implementation Reference
- src/index.ts:295-346 (handler): The async handler function for the multi_turn_chat tool. It transforms the incoming message, appends it to the conversation history, calls the DeepSeek chat completions API with the full history, appends the assistant's response to the history, and returns the response content (a two-turn client-side sketch follows after this list).

  ```typescript
  async (args) => {
    try {
      // Transform new messages
      const newMessage = args.messages[0];
      const transformedNewMessage = {
        role: newMessage.role,
        content: newMessage.content.text
      };

      // Add new message to history
      this.conversationHistory.push(transformedNewMessage);

      // Transform all messages for API
      const transformedMessages = this.conversationHistory.map(msg => ({
        role: msg.role,
        content: msg.content
      }));

      const response = await this.axiosInstance.post<DeepSeekResponse>(
        API_CONFIG.ENDPOINTS.CHAT,
        {
          messages: transformedMessages,
          model: args.model,
          temperature: args.temperature,
          max_tokens: args.max_tokens,
          top_p: args.top_p,
          frequency_penalty: args.frequency_penalty,
          presence_penalty: args.presence_penalty
        }
      );

      // Add assistant's response to history
      const assistantMessage = {
        role: 'assistant' as const,
        content: response.data.choices[0].message.content
      };
      this.conversationHistory.push(assistantMessage);

      return {
        content: [{
          type: "text",
          text: assistantMessage.content
        }]
      };
    } catch (error) {
      if (axios.isAxiosError(error)) {
        throw new Error(`DeepSeek API error: ${error.response?.data?.error?.message ?? error.message}`);
      }
      throw error;
    }
  }
  );
  ```
- src/index.ts:266-294 (schema): Zod schema for the multi_turn_chat tool inputs, including messages (a string or message array, normalized by a transform), model, temperature, max_tokens, top_p, frequency_penalty, and presence_penalty (a standalone normalization example follows after this list).

  ```typescript
  {
    messages: z.union([
      z.string(),
      z.array(z.object({
        role: z.enum(['system', 'user', 'assistant']),
        content: z.object({
          type: z.literal('text'),
          text: z.string()
        })
      }))
    ]).transform(messages => {
      if (typeof messages === 'string') {
        return [{
          role: 'user' as const,
          content: { type: 'text' as const, text: messages }
        }];
      }
      return messages;
    }),
    model: z.string().default('deepseek-chat'),
    temperature: z.number().min(0).max(2).default(0.7),
    max_tokens: z.number().positive().int().default(8000),
    top_p: z.number().min(0).max(1).default(1.0),
    frequency_penalty: z.number().min(-2).max(2).default(0.1),
    presence_penalty: z.number().min(-2).max(2).default(0)
  },
  ```
- src/index.ts:264-265 (registration): MCP server tool registration call for the multi_turn_chat tool.

  ```typescript
  this.server.tool(
    "multi_turn_chat",
  ```
- src/index.ts:92-92 (helper): Private conversationHistory array that stores the chat history across multi_turn_chat tool calls.

  ```typescript
  private conversationHistory: ChatMessage[] = [];
  ```
- src/types.ts:21-24 (schema): ChatMessage interface used for entries in the conversation history (a sample history shape follows after this list).

  ```typescript
  export interface ChatMessage {
    role: 'system' | 'user' | 'assistant';
    content: string;
  }
  ```
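For context, here is a minimal client-side sketch of a two-turn exchange, assuming the @modelcontextprotocol/sdk TypeScript client over stdio; the client name and the build/index.js entry-point path are assumptions, not taken from this repository. The server-side conversationHistory shown above is what carries context from the first call into the second.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the server over stdio; the entry-point path is an assumption.
  const transport = new StdioClientTransport({ command: "node", args: ["build/index.js"] });
  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Turn 1: the handler pushes this message and the assistant's reply into conversationHistory.
  await client.callTool({
    name: "multi_turn_chat",
    arguments: { messages: "Name one classic science fiction novel." }
  });

  // Turn 2: sent along with the accumulated history, so "it" can refer to the earlier answer.
  const followUp = await client.callTool({
    name: "multi_turn_chat",
    arguments: { messages: "Who wrote it?" }
  });
  console.log(followUp.content);

  await client.close();
}

main().catch(console.error);
```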
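The transform in the input schema is what lets callers pass either a bare string or a message array. A standalone sketch of that normalization is below; the messagesSchema name is ours, not from the source, which defines the schema inline in src/index.ts.

```typescript
import { z } from "zod";

// Standalone copy of the messages schema shown above, for demonstration only.
const messagesSchema = z.union([
  z.string(),
  z.array(z.object({
    role: z.enum(['system', 'user', 'assistant']),
    content: z.object({ type: z.literal('text'), text: z.string() })
  }))
]).transform(messages =>
  typeof messages === 'string'
    ? [{ role: 'user' as const, content: { type: 'text' as const, text: messages } }]
    : messages
);

// A bare string is normalized into a single user message:
console.log(messagesSchema.parse("What is the capital of France?"));
// -> [ { role: 'user', content: { type: 'text', text: 'What is the capital of France?' } } ]
```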
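Because the handler unwraps content.text before storing a message, history entries hold plain strings rather than the { type, text } wrapper used on input. As an illustration (invented dialogue, import path assumed), the history just before the second API call of the client sketch would look roughly like this:

```typescript
import type { ChatMessage } from "./types.js"; // path assumed

const history: ChatMessage[] = [
  { role: 'user', content: 'Name one classic science fiction novel.' },
  { role: 'assistant', content: 'Frankenstein by Mary Shelley is often cited as the first.' },
  { role: 'user', content: 'Who wrote it?' }
];
```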