
Ollama MCP Server

by hyzhak
projectbrief.md
# Project Brief

## Project Name

Ollama MCP Server

## Purpose

Provide a bridge between Ollama's local LLM API and the Model Context Protocol (MCP), enabling MCP-compatible tools to access and control Ollama models programmatically.

## Core Requirements

- Expose Ollama model management and execution as MCP tools.
- Support OpenAI-compatible chat completion via MCP.
- Enable local LLM usage with privacy and full control.
- Provide robust error handling and configurable parameters.

## Scope

- Integrate all essential Ollama API and CLI features into MCP.
- Support both CLI and HTTP transports (stdio and SSE).
- Allow model creation, listing, pulling, pushing, copying, removing, and running (with prompt, images, and temperature parameters).
- Enable advanced configuration (timeouts, temperature, endpoint).
- The "run" tool supports an optional temperature parameter for deterministic or creative outputs (see the client sketch below).
- Maintain compatibility with MCP server conventions.

## Out of Scope

- Web UI or frontend components.
- Non-Ollama LLM integrations.
- Cloud deployment automation.
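Any MCP client can exercise the "run" tool the brief describes. Below is a minimal sketch using the official Python `mcp` SDK over stdio; note that the launch command (`uvx ollama-mcp-server`), the tool name `run`, and the argument names (`name`, `prompt`, `temperature`) are assumptions based on the brief, not the server's confirmed schema.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Assumed launch command; adjust to however the server is installed.
    params = StdioServerParameters(command="uvx", args=["ollama-mcp-server"])

    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server actually exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Hypothetical invocation of the "run" tool from the brief;
            # temperature=0.0 requests deterministic output.
            result = await session.call_tool(
                "run",
                arguments={
                    "name": "llama3.2",  # placeholder model name
                    "prompt": "Say hello.",
                    "temperature": 0.0,
                },
            )
            print(result.content)


asyncio.run(main())
```

Listing the tools first, as shown, is the safest pattern here: it reveals the real tool names and input schemas before any call is made.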
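The OpenAI-compatible chat completion mentioned in the core requirements refers to Ollama's `/v1` endpoint, which the MCP server bridges to. For reference, a direct call with the `openai` Python client looks like the sketch below; the model name `llama3.2` is a placeholder, and the `api_key` is a dummy value that Ollama does not validate.

```python
from openai import OpenAI

# Ollama serves an OpenAI-compatible API on its default local port.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

resp = client.chat.completions.create(
    model="llama3.2",  # placeholder; use any model you have pulled locally
    messages=[{"role": "user", "content": "Hello"}],
    temperature=0.0,
)
print(resp.choices[0].message.content)
```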

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server'
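The same endpoint can be queried programmatically. A small sketch using the `requests` library, which simply prints the raw JSON since the response schema is not reproduced here:

```python
import requests

# Same endpoint as the curl example above.
url = "https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server"

response = requests.get(url, timeout=10)
response.raise_for_status()
print(response.json())
```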

If you have feedback or need assistance with the MCP directory API, please join our Discord server.