
multi_turn_chat

Enables multi-turn conversations with DeepSeek's language models. Conversation history persists across calls, and generation parameters such as temperature and max_tokens can be tuned per request.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| frequency_penalty | No | Frequency penalty, -2 to 2 | 0.1 |
| max_tokens | No | Maximum tokens to generate (positive integer) | 8000 |
| messages | Yes | A plain string, or an array of messages with role and text content | (none) |
| model | No | DeepSeek model name | deepseek-chat |
| presence_penalty | No | Presence penalty, -2 to 2 | 0 |
| temperature | No | Sampling temperature, 0 to 2 | 0.7 |
| top_p | No | Nucleus sampling cutoff, 0 to 1 | 1.0 |
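For reference, here is a hypothetical arguments object matching the schema above (the values are illustrative, not prescribed by the server):

```typescript
// Illustrative arguments for multi_turn_chat; field names follow the input schema,
// the specific text and parameter values are made up for this example.
const exampleArgs = {
  messages: [
    {
      role: "user",
      content: { type: "text", text: "Summarize the MCP protocol in one sentence." },
    },
  ],
  model: "deepseek-chat",
  temperature: 0.7,
  max_tokens: 8000,
};

console.log(exampleArgs.messages[0].content.text);
```

A bare string is also accepted for `messages`; the schema transforms it into a single user message.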

Implementation Reference

  • Handler function for the multi_turn_chat tool: it maintains a persistent conversation history, appends the incoming message, sends the full transcript to the DeepSeek API, records the assistant's reply, and returns it as text content.

```typescript
async (args) => {
  try {
    // Transform the new message (only the first element of args.messages is consumed)
    const newMessage = args.messages[0];
    const transformedNewMessage = {
      role: newMessage.role,
      content: newMessage.content.text
    };

    // Add the new message to the persistent history
    this.conversationHistory.push(transformedNewMessage);

    // Transform all messages for the API
    const transformedMessages = this.conversationHistory.map(msg => ({
      role: msg.role,
      content: msg.content
    }));

    const response = await this.axiosInstance.post<DeepSeekResponse>(
      API_CONFIG.ENDPOINTS.CHAT,
      {
        messages: transformedMessages,
        model: args.model,
        temperature: args.temperature,
        max_tokens: args.max_tokens,
        top_p: args.top_p,
        frequency_penalty: args.frequency_penalty,
        presence_penalty: args.presence_penalty
      }
    );

    // Add the assistant's response to the history
    const assistantMessage = {
      role: 'assistant' as const,
      content: response.data.choices[0].message.content
    };
    this.conversationHistory.push(assistantMessage);

    return {
      content: [{ type: "text", text: assistantMessage.content }]
    };
  } catch (error) {
    if (axios.isAxiosError(error)) {
      throw new Error(`DeepSeek API error: ${error.response?.data?.error?.message ?? error.message}`);
    }
    throw error;
  }
}
```
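The handler's bookkeeping can be sketched in isolation (no API call; `recordTurn` and `HistoryMessage` are hypothetical names for this illustration): each turn appends the incoming user message and the assistant reply, so the next call sends the full transcript.

```typescript
// Standalone sketch of the history bookkeeping, assuming messages are stored
// as role + plain-string content, as in the handler above.
type HistoryMessage = { role: "system" | "user" | "assistant"; content: string };

const conversationHistory: HistoryMessage[] = [];

function recordTurn(userText: string, assistantText: string): void {
  conversationHistory.push({ role: "user", content: userText });
  conversationHistory.push({ role: "assistant", content: assistantText });
}

recordTurn("Hello", "Hi! How can I help?");
recordTurn("What did I just say?", 'You said "Hello".');

console.log(conversationHistory.length); // 4 messages would be sent on the next turn
```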
  • Zod input schema for the multi_turn_chat tool: `messages` accepts either a plain string or an array of structured messages (a string is transformed into a single user message), alongside the generation parameters and their defaults.

```typescript
{
  messages: z.union([
    z.string(),
    z.array(z.object({
      role: z.enum(['system', 'user', 'assistant']),
      content: z.object({
        type: z.literal('text'),
        text: z.string()
      })
    }))
  ]).transform(messages => {
    if (typeof messages === 'string') {
      return [{
        role: 'user' as const,
        content: { type: 'text' as const, text: messages }
      }];
    }
    return messages;
  }),
  model: z.string().default('deepseek-chat'),
  temperature: z.number().min(0).max(2).default(0.7),
  max_tokens: z.number().positive().int().default(8000),
  top_p: z.number().min(0).max(1).default(1.0),
  frequency_penalty: z.number().min(-2).max(2).default(0.1),
  presence_penalty: z.number().min(-2).max(2).default(0)
}
```
  • src/index.ts:264-346 (registration)
    Registration of the multi_turn_chat tool on the MCP server using server.tool(name, inputSchema, handlerFn). The input schema and handler passed here are the ones listed above.

```typescript
this.server.tool(
  "multi_turn_chat",
  { /* Zod input schema shown above */ },
  async (args) => { /* handler shown above */ }
);
```
  • Class property that stores the persistent conversation history used by the multi_turn_chat handler.
```typescript
private conversationHistory: ChatMessage[] = [];
```
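The ChatMessage type itself is not shown on this page; a plausible shape, inferred from how the handler stores entries (role plus plain-string content, matching the DeepSeek chat payload), would be:

```typescript
// Hypothetical ChatMessage definition, inferred from the handler's usage;
// the actual declaration lives elsewhere in src/index.ts.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

const sample: ChatMessage = { role: "assistant", content: "Hello!" };
console.log(sample.content);
```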


MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/liuchongchong1995/deepseek-mcp-server'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.