
chat_completion

Generate AI chat responses by processing user messages with configurable parameters for model selection, temperature, and token limits.

Input Schema

Name                Required  Default
message             No        (none)
messages            No        (none)
model               No        deepseek-reasoner
temperature         No        0.7
max_tokens          No        8000
top_p               No        1.0
frequency_penalty   No        0.1
presence_penalty    No        0

Defaults are taken from the Zod input schema. Although message and messages are individually optional, exactly one of them must be provided.
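The tool accepts either a single message string or a full messages array; the handler converts a bare message into a one-element user-role array. A minimal TypeScript sketch of that normalization (the ChatMessage type and toMessages helper are illustrative, not part of the server's public API):

    // Illustrative types mirroring the tool's input shape.
    type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };
    interface ChatArgs { message?: string; messages?: ChatMessage[] }

    // Normalize the two accepted input shapes the same way the handler does:
    // a bare `message` becomes a single-element user message array.
    function toMessages(args: ChatArgs): ChatMessage[] {
      if (args.message) return [{ role: 'user', content: args.message }];
      if (args.messages) return args.messages;
      throw new Error("Either 'message' or 'messages' must be provided");
    }

    // Both of these are valid inputs:
    toMessages({ message: 'Hello' });
    toMessages({ messages: [
      { role: 'system', content: 'Be terse.' },
      { role: 'user', content: 'Hi' }
    ] });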

Implementation Reference

  • Handler function that normalizes the input, calls the DeepSeek chat completions API, falls back to the deepseek-chat model on error, and returns the response as text content.
    async (args) => {
      let messages: ChatMessage[];
      if (args.message) {
        messages = [{ role: 'user', content: args.message }];
      } else if (args.messages) {
        messages = args.messages;
      } else {
        throw new Error("Either 'message' or 'messages' must be provided");
      }
      try {
        const response = await this.axiosInstance.post<DeepSeekResponse>(
          API_CONFIG.ENDPOINTS.CHAT,
          {
            messages,
            model: args.model,
            temperature: args.temperature,
            max_tokens: args.max_tokens,
            top_p: args.top_p,
            frequency_penalty: args.frequency_penalty,
            presence_penalty: args.presence_penalty
          }
        );
        return {
          content: [{ type: "text", text: response.data.choices[0].message.content }]
        };
      } catch (error) {
        console.error("Error with deepseek-reasoner, falling back to deepseek-chat");
        try {
          const fallbackResponse = await this.axiosInstance.post<DeepSeekResponse>(
            API_CONFIG.ENDPOINTS.CHAT,
            {
              messages,
              model: 'deepseek-chat',
              temperature: args.temperature,
              max_tokens: args.max_tokens,
              top_p: args.top_p,
              frequency_penalty: args.frequency_penalty,
              presence_penalty: args.presence_penalty
            }
          );
          return {
            content: [{
              type: "text",
              text: "Note: Fallback to deepseek-chat due to reasoner error.\n\n" +
                fallbackResponse.data.choices[0].message.content
            }]
          };
        } catch (fallbackError) {
          if (axios.isAxiosError(fallbackError)) {
            throw new Error(`DeepSeek API error: ${fallbackError.response?.data?.error?.message ?? fallbackError.message}`);
          }
          throw fallbackError;
        }
      }
    }
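The handler's error handling is a try-primary-then-fallback pattern: attempt the request with the selected model, and on any failure retry once with deepseek-chat, prefixing the result with a note. A synchronous sketch of that control flow (the withFallback helper is hypothetical, purely for illustration; the real handler awaits two API calls instead):

    // Hypothetical helper illustrating the handler's control flow:
    // run the primary operation, and on any error run the fallback,
    // prefixing its result with a note (as the handler does).
    function withFallback(primary: () => string, fallback: () => string): string {
      try {
        return primary();
      } catch {
        return 'Note: Fallback to deepseek-chat due to reasoner error.\n\n' + fallback();
      }
    }

    // If the primary throws, the fallback result carries the note prefix:
    const out = withFallback(() => { throw new Error('reasoner down'); }, () => 'hello');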
  • Zod input schema defining parameters for the chat_completion tool, including message, model, and generation parameters.
    {
      message: z.string().optional(),
      messages: z.array(z.object({
        role: z.enum(['system', 'user', 'assistant']),
        content: z.string()
      })).optional(),
      model: z.string().default('deepseek-reasoner'),
      temperature: z.number().min(0).max(2).default(0.7),
      max_tokens: z.number().positive().int().default(8000),
      top_p: z.number().min(0).max(1).default(1.0),
      frequency_penalty: z.number().min(-2).max(2).default(0.1),
      presence_penalty: z.number().min(-2).max(2).default(0)
    }
  • src/index.ts:184-261 (registration)
    Registration of the chat_completion tool on the McpServer instance in the setupTools method.
    this.server.tool(
      "chat_completion",
      {
        message: z.string().optional(),
        messages: z.array(z.object({
          role: z.enum(['system', 'user', 'assistant']),
          content: z.string()
        })).optional(),
        model: z.string().default('deepseek-reasoner'),
        temperature: z.number().min(0).max(2).default(0.7),
        max_tokens: z.number().positive().int().default(8000),
        top_p: z.number().min(0).max(1).default(1.0),
        frequency_penalty: z.number().min(-2).max(2).default(0.1),
        presence_penalty: z.number().min(-2).max(2).default(0)
      },
      async (args) => {
        let messages: ChatMessage[];
        if (args.message) {
          messages = [{ role: 'user', content: args.message }];
        } else if (args.messages) {
          messages = args.messages;
        } else {
          throw new Error("Either 'message' or 'messages' must be provided");
        }
        try {
          const response = await this.axiosInstance.post<DeepSeekResponse>(
            API_CONFIG.ENDPOINTS.CHAT,
            {
              messages,
              model: args.model,
              temperature: args.temperature,
              max_tokens: args.max_tokens,
              top_p: args.top_p,
              frequency_penalty: args.frequency_penalty,
              presence_penalty: args.presence_penalty
            }
          );
          return {
            content: [{ type: "text", text: response.data.choices[0].message.content }]
          };
        } catch (error) {
          console.error("Error with deepseek-reasoner, falling back to deepseek-chat");
          try {
            const fallbackResponse = await this.axiosInstance.post<DeepSeekResponse>(
              API_CONFIG.ENDPOINTS.CHAT,
              {
                messages,
                model: 'deepseek-chat',
                temperature: args.temperature,
                max_tokens: args.max_tokens,
                top_p: args.top_p,
                frequency_penalty: args.frequency_penalty,
                presence_penalty: args.presence_penalty
              }
            );
            return {
              content: [{
                type: "text",
                text: "Note: Fallback to deepseek-chat due to reasoner error.\n\n" +
                  fallbackResponse.data.choices[0].message.content
              }]
            };
          } catch (fallbackError) {
            if (axios.isAxiosError(fallbackError)) {
              throw new Error(`DeepSeek API error: ${fallbackError.response?.data?.error?.message ?? fallbackError.message}`);
            }
            throw fallbackError;
          }
        }
      }
    );
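Once registered, a client invokes the tool by name through an MCP tools/call request. A hedged example of the parameters such a call might carry (the field values here are made up for illustration):

    // Illustrative tools/call parameters for this tool; values are examples only.
    const callParams = {
      name: 'chat_completion',
      arguments: {
        message: 'Summarize the MCP protocol in one sentence.',
        model: 'deepseek-reasoner',  // schema default; may be omitted
        temperature: 0.7,
        max_tokens: 8000
      }
    };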

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/DMontgomery40/deepseek-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.
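The same endpoint can be called from code. A small TypeScript sketch that builds the server URL used in the curl example above (the mcpServerUrl helper is illustrative; only the URL itself comes from this page, and the JSON response shape is not documented here):

    // Build the MCP directory API URL for a given server slug (illustrative helper).
    function mcpServerUrl(slug: string): string {
      return `https://glama.ai/api/mcp/v1/servers/${slug}`;
    }

    const url = mcpServerUrl('DMontgomery40/deepseek-mcp-server');
    // fetch(url).then(r => r.json()) would retrieve the listing as JSON.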