chat_completion
Process conversation messages with locally running Ollama models via an OpenAI-compatible API. Specify the model, messages, and optional sampling temperature to run chat completions from within MCP-powered applications.
Instructions
OpenAI-compatible chat completion API
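Under the hood this corresponds to a standard OpenAI-style chat completion request against Ollama's local endpoint. A minimal sketch, assuming Ollama is serving on its default address (`http://localhost:11434`) and that a model such as `llama3.2` has already been pulled:

```typescript
// Minimal sketch of the OpenAI-compatible request this tool wraps.
// Assumes Ollama's default local address and an already-pulled model;
// adjust the URL and model name for your setup.
const response = await fetch("http://localhost:11434/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3.2",                // any locally available Ollama model
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "Why is the sky blue?" },
    ],
    temperature: 0.7,                 // optional sampling temperature (0-2)
  }),
});

const completion = await response.json();
console.log(completion.choices[0].message.content);
```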
Input Schema
Name | Required | Description | Default |
---|---|---|---|
messages | Yes | Array of messages in the conversation | |
model | Yes | Name of the Ollama model to use | |
temperature | No | Sampling temperature (0-2) | |
timeout | No | Timeout in milliseconds | 60000 |
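For reference, here is one way a client might invoke this tool with arguments matching the schema above. This is a sketch using the TypeScript MCP SDK; the launch command (`ollama-mcp-server`) and client name are placeholders for however the server is actually started in your environment:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Placeholder launch command: replace with how this server is started locally.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "ollama-mcp-server"],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Call chat_completion with arguments matching the input schema above.
const result = await client.callTool({
  name: "chat_completion",
  arguments: {
    model: "llama3.2",
    messages: [{ role: "user", content: "Summarize MCP in one sentence." }],
    temperature: 0.2, // optional, 0-2
    timeout: 60000,   // optional, milliseconds
  },
});

console.log(result);
```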