
Ollama MCP Server

by NightTrek

serve

Start the Ollama MCP Server to manage and run local AI models, enabling integration of Ollama's LLM capabilities into MCP-powered applications.

Instructions

Start Ollama server

Input Schema

Name | Required | Description | Default

This tool takes no arguments.

Input Schema (JSON Schema)

{
  "additionalProperties": false,
  "properties": {},
  "type": "object"
}
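Because the schema is an empty object that forbids additional properties, a call to `serve` sends an empty `arguments` object. A minimal sketch of the corresponding MCP `tools/call` request, built in Python (the request `id` is illustrative; the framing follows the JSON-RPC shape MCP uses):

```python
import json

# Illustrative MCP tools/call request for the `serve` tool.
# `serve` accepts no arguments, so "arguments" is an empty object.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "serve", "arguments": {}},
}

payload = json.dumps(request)
print(payload)
```

Any extra key under `arguments` would be rejected by a client that enforces the schema, since `additionalProperties` is `false`.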


Other Tools from Ollama MCP Server

Related Tools

  • @NightTrek/Ollama-mcp

MCP directory API

We provide all the information about listed MCP servers through our MCP directory API. For example, to fetch this server's entry:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/NightTrek/Ollama-mcp'
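The same request can be sketched with Python's standard library; here it is constructed but not sent, so the URL and method stay explicit (sending it requires network access):

```python
from urllib.request import Request

# Build the directory API request shown above with curl.
url = "https://glama.ai/api/mcp/v1/servers/NightTrek/Ollama-mcp"
req = Request(url, method="GET")

# To actually send it:
# from urllib.request import urlopen
# with urlopen(req) as resp:
#     data = resp.read()
```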

If you have feedback or need assistance with the MCP directory API, please join our Discord server.