Ollama MCP Server

by hyzhak

chat_completion

Generates text responses using local AI models, with optional per-message image support for multimodal tasks. This tool provides chat completion functionality within the Ollama MCP Server.

Instructions

OpenAI-compatible chat completion API. Supports optional images per message for vision/multimodal models.

Input Schema

  • model (required): name of the Ollama model to use (string).
  • messages (required): array of chat messages; each message has a role ("system", "user", or "assistant"), a content string, and an optional images array of image paths for vision/multimodal models.
  • temperature (optional): sampling temperature between 0 and 2.
  • think (optional): boolean flag requesting the model's thinking output.

No parameter declares a default value.
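
For illustration, arguments that satisfy this schema might look like the following sketch (the model name and image path are placeholders, not taken from the source):

    // Hypothetical tool arguments; "llava" and the image path are illustrative.
    const args = {
      model: "llava",
      messages: [
        { role: "system", content: "You are a concise assistant." },
        {
          role: "user",
          content: "What is shown in this image?",
          images: ["/path/to/photo.png"], // only meaningful for vision models
        },
      ],
      temperature: 0.2,
    };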

Implementation Reference

  • src/index.ts:191-239 (registration)
    Registration of the 'chat_completion' tool via McpServer.registerTool. The Zod input schema validates model, messages (with optional per-message images), temperature, and think; the handler calls ollama.chat with the provided parameters and wraps the reply in an OpenAI-compatible chat completion JSON response. A client-side usage sketch follows the excerpt.
    server.registerTool(
      "chat_completion",
      {
        title: "Chat completion",
        description: "OpenAI-compatible chat completion API. Supports optional images per message for vision/multimodal models.",
        inputSchema: {
          model: z.string(),
          messages: z.array(z.object({
            role: z.enum(["system", "user", "assistant"]),
            content: z.string(),
            images: z.array(z.string()).optional(), // Array of image paths
          })),
          temperature: z.number().min(0).max(2).optional(),
          think: z.boolean().optional(),
        },
      },
      async ({ model, messages, temperature, think }) => {
        try {
          const response = await ollama.chat({
            model,
            messages,
            options: { temperature },
            // Forward `think` only when the caller provided it.
            ...(think !== undefined ? { think } : {}),
          });
          // Wrap Ollama's reply in an OpenAI-style chat.completion object.
          return {
            content: [
              {
                type: "text",
                text: JSON.stringify({
                  id: "chatcmpl-" + Date.now(),
                  object: "chat.completion",
                  created: Math.floor(Date.now() / 1000),
                  model,
                  choices: [
                    {
                      index: 0,
                      message: response.message,
                      finish_reason: "stop",
                    },
                  ],
                }, null, 2),
              },
            ],
          };
        } catch (error) {
          return { content: [{ type: "text", text: `Error: ${formatError(error)}` }], isError: true };
        }
      }
    );
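
As a rough sketch of how a client might call this tool, assuming the TypeScript MCP SDK and a locally started server process (the client name, launch command, and model are illustrative assumptions, not taken from the source):

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    const client = new Client({ name: "example-client", version: "1.0.0" });
    // The launch command is an assumption; start the server however it is installed.
    await client.connect(new StdioClientTransport({ command: "npx", args: ["ollama-mcp-server"] }));

    const result = await client.callTool({
      name: "chat_completion",
      arguments: {
        model: "llama3.2", // illustrative model name
        messages: [{ role: "user", content: "Say hello in one sentence." }],
        temperature: 0.7,
      },
    });

    // The tool returns the OpenAI-style completion as JSON text.
    const [first] = result.content as Array<{ type: string; text: string }>;
    const completion = JSON.parse(first.text);
    console.log(completion.choices[0].message.content);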