## serve
Start the Ollama MCP Server to manage and run local AI models, enabling integration of Ollama's LLM capabilities into MCP-powered applications.
### Instructions
Start Ollama server
### Input Schema
| Name | Required | Description | Default |
|------|----------|-------------|---------|
| No arguments | | | |
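Since `serve` takes no arguments, calling it from an MCP client is just the tool name plus an empty arguments object. A minimal sketch using the TypeScript MCP SDK, assuming a local stdio launch of the server (the launch command and build path are hypothetical; point them at your own install):

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Assumed launch command and path; adjust to where Ollama-mcp is built locally.
const transport = new StdioClientTransport({
  command: "node",
  args: ["path/to/ollama-mcp/build/index.js"],
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// "serve" takes no arguments, so the arguments object is empty.
const result = await client.callTool({ name: "serve", arguments: {} });
console.log(result.content);
```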
We provide all the information about MCP servers via our MCP API.
```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/NightTrek/Ollama-mcp'
```
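The same lookup from code, for clients that prefer it over curl. A small TypeScript sketch (Node 18+ for the built-in `fetch`; the response shape is not documented here, so it simply logs the parsed JSON):

```typescript
// Query the Glama MCP API for this server's directory entry.
const res = await fetch(
  "https://glama.ai/api/mcp/v1/servers/NightTrek/Ollama-mcp",
);
if (!res.ok) throw new Error(`Glama API request failed: ${res.status}`);
const entry = await res.json();
console.log(entry);
```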
If you have feedback or need assistance with the MCP directory API, please join our Discord server.