chat_completion
Process conversation messages using local Ollama models through an OpenAI-compatible API. Specify the model, the messages, and an optional sampling temperature to control chat completions within MCP-powered applications.
Instructions
OpenAI-compatible chat completion API
Input Schema
Name | Required | Description | Default |
---|---|---|---|
messages | Yes | Array of messages in the conversation | |
model | Yes | Name of the Ollama model to use | |
temperature | No | Sampling temperature (0-2) | |
timeout | No | Timeout in milliseconds | 60000 |
Input Schema (JSON Schema)
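A sketch of the schema implied by the parameter table above; the exact field types and constraints beyond what the table states are assumptions, not the server's published schema:

```json
{
  "type": "object",
  "properties": {
    "messages": {
      "type": "array",
      "description": "Array of messages in the conversation"
    },
    "model": {
      "type": "string",
      "description": "Name of the Ollama model to use"
    },
    "temperature": {
      "type": "number",
      "minimum": 0,
      "maximum": 2,
      "description": "Sampling temperature (0-2)"
    },
    "timeout": {
      "type": "number",
      "default": 60000,
      "description": "Timeout in milliseconds"
    }
  },
  "required": ["messages", "model"]
}
```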
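A minimal sketch of assembling the tool's arguments from the parameter table above. The helper name and the example model are illustrative, and the message format assumes the standard OpenAI chat convention of role/content pairs:

```python
import json


def build_arguments(model, messages, temperature=None, timeout=None):
    """Assemble an arguments object for the chat_completion tool.

    model and messages are required; temperature and timeout are
    optional, matching the input schema table.
    """
    args = {"model": model, "messages": messages}
    if temperature is not None:
        # The schema documents a 0-2 range for sampling temperature.
        if not 0 <= temperature <= 2:
            raise ValueError("temperature must be between 0 and 2")
        args["temperature"] = temperature
    if timeout is not None:
        args["timeout"] = timeout  # milliseconds; the server default is 60000
    return args


# Example payload for a single-turn conversation; the model name
# here is hypothetical and must match a model pulled into Ollama.
args = build_arguments(
    model="llama3.2",
    messages=[{"role": "user", "content": "Hello!"}],
    temperature=0.7,
)
print(json.dumps(args))
```

Omitting `temperature` and `timeout` leaves them out of the payload entirely, so the server falls back to its own defaults rather than receiving explicit nulls.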
Other Tools from Ollama MCP Server
Related Tools
- @pyroprompts/any-chat-completions-mcp
- @mzxrai/mcp-openai
- @bigdata-coss/agent_mcp
- @PhialsBasement/KoboldCPP-MCP-Server
- @thadius83/mcp-server-openai