
Ollama MCP Server

by hyzhak

run

Run an AI model locally with a prompt via the Ollama MCP Server, optionally passing image file paths for vision/multimodal models and a temperature value to control response randomness.

Instructions

Run a model with a prompt. Optionally accepts one or more image file paths for vision/multimodal models and a temperature parameter.

Input Schema

Name         Required  Description                                Default
images       No        Array of image file paths (strings)       —
name         Yes       Model name (string)                       —
prompt       Yes       Prompt text (string)                      —
temperature  No        Number between 0 and 2                    —
think        No        Boolean                                   —

Input Schema (JSON Schema)

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "additionalProperties": false,
  "properties": {
    "images": {
      "items": { "type": "string" },
      "type": "array"
    },
    "name": { "type": "string" },
    "prompt": { "type": "string" },
    "temperature": {
      "maximum": 2,
      "minimum": 0,
      "type": "number"
    },
    "think": { "type": "boolean" }
  },
  "required": ["name", "prompt"],
  "type": "object"
}
```
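As a rough sketch of what this schema permits, the constraints can be checked in plain Python before issuing a tool call. The model name, prompt text, and image path below are illustrative assumptions, not values from this page:

```python
# Minimal sketch: build an arguments object for the `run` tool and check it
# by hand against the schema's constraints (stdlib only, no jsonschema dep).

def validate_run_args(args: dict) -> None:
    """Raise ValueError if `args` violates the run tool's input schema."""
    allowed = {"images", "name", "prompt", "temperature", "think"}
    unknown = set(args) - allowed
    if unknown:  # additionalProperties: false
        raise ValueError(f"unknown properties: {unknown}")
    for key in ("name", "prompt"):  # required string fields
        if not isinstance(args.get(key), str):
            raise ValueError(f"'{key}' is required and must be a string")
    if "temperature" in args:
        t = args["temperature"]
        if not isinstance(t, (int, float)) or not 0 <= t <= 2:
            raise ValueError("'temperature' must be a number in [0, 2]")
    if "images" in args and not (
        isinstance(args["images"], list)
        and all(isinstance(p, str) for p in args["images"])
    ):
        raise ValueError("'images' must be an array of strings")
    if "think" in args and not isinstance(args["think"], bool):
        raise ValueError("'think' must be a boolean")

example = {
    "name": "llama3.2",            # assumed model name
    "prompt": "Describe this image.",
    "images": ["/tmp/photo.png"],  # hypothetical path
    "temperature": 0.7,
}
validate_run_args(example)  # passes silently
```

An MCP client would send such an object as the `arguments` of a `tools/call` request for the `run` tool.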
MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.