Ollama MCP Server

by NightTrek

chat_completion

Process conversation messages using local Ollama models through an OpenAI-compatible API. Specify the model, messages, and optionally a temperature to run chat completions from MCP-powered applications.

Instructions

OpenAI-compatible chat completion API

Input Schema

Name         Required  Description                             Default
messages     Yes       Array of messages in the conversation   -
model        Yes       Name of the Ollama model to use         -
temperature  No        Sampling temperature (0-2)              -
timeout      No        Timeout in milliseconds                 60000

Input Schema (JSON Schema)

{
  "additionalProperties": false,
  "properties": {
    "messages": {
      "description": "Array of messages in the conversation",
      "items": {
        "properties": {
          "content": { "type": "string" },
          "role": { "enum": ["system", "user", "assistant"], "type": "string" }
        },
        "required": ["role", "content"],
        "type": "object"
      },
      "type": "array"
    },
    "model": {
      "description": "Name of the Ollama model to use",
      "type": "string"
    },
    "temperature": {
      "description": "Sampling temperature (0-2)",
      "maximum": 2,
      "minimum": 0,
      "type": "number"
    },
    "timeout": {
      "description": "Timeout in milliseconds (default: 60000)",
      "minimum": 1000,
      "type": "number"
    }
  },
  "required": ["model", "messages"],
  "type": "object"
}
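As a sketch of how a client might build arguments for this tool, the snippet below constructs a payload matching the schema above and checks its main constraints by hand. The model name `llama3` and the helper `validate` are illustrative assumptions, not part of the server's API; a real MCP client would pass the `arguments` dict in a `tools/call` request.

```python
import json

# Arguments for a chat_completion call, shaped to match the input schema.
# "llama3" is a placeholder model name; use any model pulled locally.
arguments = {
    "model": "llama3",
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Why is the sky blue?"},
    ],
    "temperature": 0.7,
    "timeout": 60000,
}

def validate(args: dict) -> None:
    """Hand-rolled check of the schema's main constraints (illustrative only)."""
    allowed = {"model", "messages", "temperature", "timeout"}
    # additionalProperties: false
    assert set(args) <= allowed, "unexpected property"
    assert isinstance(args["model"], str)
    for msg in args["messages"]:
        assert msg["role"] in ("system", "user", "assistant")
        assert isinstance(msg["content"], str)
    if "temperature" in args:
        assert 0 <= args["temperature"] <= 2
    if "timeout" in args:
        assert args["timeout"] >= 1000  # schema minimum

validate(arguments)
payload = json.dumps(arguments)  # wire-ready JSON for the tool call
```

A full JSON Schema validator (e.g. the `jsonschema` package) would cover the same constraints generically; the manual checks here just make the schema's rules concrete.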
