Ollama MCP Server

serve

Starts the local Ollama server so that models can be managed and run, exposing Ollama's LLM capabilities to MCP-powered applications.
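
As a rough sketch of how this fits into an MCP-powered application (the server key "ollama" and the install path below are placeholders, not taken from this page), a client such as Claude Desktop would register the server under its mcpServers configuration, after which its tools, including serve, become callable:

{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/Ollama-mcp/build/index.js"]
    }
  }
}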

Instructions

Start Ollama server

Input Schema

Name | Required | Description | Default

No arguments

Input Schema (JSON Schema)

{ "additionalProperties": false, "properties": {}, "type": "object" }

