## serve
Start the local Ollama server so that local AI models can be managed and run, making Ollama's LLM capabilities available to MCP-powered applications.
### Instructions
Start Ollama server
### Input Schema
| Name | Required | Description | Default |
|------|----------|-------------|---------|
| *No arguments* | | | |
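
Since `serve` takes no arguments, invoking it over MCP reduces to a bare `tools/call` request. The sketch below is illustrative only: it assumes the tool is registered under the name `serve`, and the `build_serve_call` helper is a hypothetical convenience that just constructs the JSON-RPC payload a client would send over the server's transport.

```python
import json


def build_serve_call(request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request for the `serve` tool.

    The tool takes no arguments, so `arguments` is an empty object.
    """
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "serve",
            "arguments": {},
        },
    }
    return json.dumps(payload)


if __name__ == "__main__":
    # In a real MCP session this message would be sent over the server's
    # stdio or HTTP transport after the initialization handshake.
    print(build_serve_call())
```

In practice an MCP client library would handle initialization and transport framing before a message like this is sent; the payload shape shown here follows the standard MCP `tools/call` format.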