chat_completion
Generates text responses using local AI models, with optional image support for multimodal tasks. This tool provides chat completion functionality within the Ollama MCP Server environment.
Instructions
OpenAI-compatible chat completion API. Supports optional images per message for vision/multimodal models.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| model | Yes | Name of the local Ollama model to use for the completion (string). | |
| messages | Yes | Conversation messages. Each message has a `role` (`system`, `user`, or `assistant`), a `content` string, and an optional `images` array of image paths for vision/multimodal models. | |
| temperature | No | Sampling temperature, a number between 0 and 2. | |
| think | No | Boolean flag; when provided, it is passed through to `ollama.chat` (used by thinking-capable models). | |
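
For orientation, here is a hypothetical arguments object that satisfies this schema; the model name and image path are illustrative placeholders, not values required by the tool.

```typescript
// Hypothetical chat_completion arguments; the model name and image path
// below are placeholders chosen for illustration.
const exampleArguments = {
  model: "llama3.2-vision",
  messages: [
    { role: "system", content: "You are a concise assistant." },
    {
      role: "user",
      content: "Describe what is in this picture.",
      images: ["/path/to/photo.png"], // optional; used by vision/multimodal models
    },
  ],
  temperature: 0.7, // optional; must be between 0 and 2
  think: false,     // optional; forwarded to ollama.chat when provided
};
```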
Implementation Reference
- src/index.ts:207-238 (handler): Handler function that executes the chat_completion tool logic: calls ollama.chat with the provided parameters and returns an OpenAI-compatible chat completion JSON response.

  ```typescript
  async ({ model, messages, temperature, think }) => {
    try {
      const response = await ollama.chat({
        model,
        messages,
        options: { temperature },
        ...(think !== undefined ? { think } : {}),
      });
      return {
        content: [
          {
            type: "text",
            text: JSON.stringify(
              {
                id: "chatcmpl-" + Date.now(),
                object: "chat.completion",
                created: Math.floor(Date.now() / 1000),
                model,
                choices: [
                  {
                    index: 0,
                    message: response.message,
                    finish_reason: "stop",
                  },
                ],
              },
              null,
              2
            ),
          },
        ],
      };
    } catch (error) {
      return { content: [{ type: "text", text: `Error: ${formatError(error)}` }], isError: true };
    }
  }
  ```
- src/index.ts:191-239 (registration): Registration of the 'chat_completion' tool using McpServer.registerTool, including schema and handler.

  ```typescript
  server.registerTool(
    "chat_completion",
    {
      title: "Chat completion",
      description:
        "OpenAI-compatible chat completion API. Supports optional images per message for vision/multimodal models.",
      inputSchema: {
        model: z.string(),
        messages: z.array(z.object({
          role: z.enum(["system", "user", "assistant"]),
          content: z.string(),
          images: z.array(z.string()).optional(), // Array of image paths
        })),
        temperature: z.number().min(0).max(2).optional(),
        think: z.boolean().optional(),
      },
    },
    async ({ model, messages, temperature, think }) => {
      try {
        const response = await ollama.chat({
          model,
          messages,
          options: { temperature },
          ...(think !== undefined ? { think } : {}),
        });
        return {
          content: [
            {
              type: "text",
              text: JSON.stringify(
                {
                  id: "chatcmpl-" + Date.now(),
                  object: "chat.completion",
                  created: Math.floor(Date.now() / 1000),
                  model,
                  choices: [
                    {
                      index: 0,
                      message: response.message,
                      finish_reason: "stop",
                    },
                  ],
                },
                null,
                2
              ),
            },
          ],
        };
      } catch (error) {
        return { content: [{ type: "text", text: `Error: ${formatError(error)}` }], isError: true };
      }
    }
  );
  ```
- src/index.ts:196-205 (schema): Zod input schema definition for the chat_completion tool, validating model, messages (with optional images), temperature, and think parameters.

  ```typescript
  inputSchema: {
    model: z.string(),
    messages: z.array(z.object({
      role: z.enum(["system", "user", "assistant"]),
      content: z.string(),
      images: z.array(z.string()).optional(), // Array of image paths
    })),
    temperature: z.number().min(0).max(2).optional(),
    think: z.boolean().optional(),
  },
  ```
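
As a usage sketch, the tool can be invoked from any MCP client. The snippet below assumes the @modelcontextprotocol/sdk TypeScript client with the server launched over stdio; the launch command, client metadata, and model name are assumptions for illustration, not values defined by this tool.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Placeholder command/args: adjust to however the Ollama MCP Server is started locally.
const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"],
});

const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// Invoke the chat_completion tool. result.content[0].text holds the
// OpenAI-compatible chat.completion JSON built by the handler above.
const result = await client.callTool({
  name: "chat_completion",
  arguments: {
    model: "llama3.2",
    messages: [{ role: "user", content: "Say hello in one sentence." }],
    temperature: 0.2,
  },
});

console.log(result.content);
```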