Ollama MCP Server
Server Configuration
Describes the environment variables required to run the server.
| Name | Required | Description | Default |
|---|---|---|---|
| No arguments | | | |
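Because the table above is empty, no configuration is needed beyond launching the server process. Below is a minimal sketch of connecting to it over stdio with the official `mcp` Python SDK; the `ollama-mcp-server` launch command is an assumption and depends on how you installed the server:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# The command is assumed; substitute your actual entry point.
# No args and no environment variables are needed, per the table above.
server_params = StdioServerParameters(command="ollama-mcp-server")

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```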
Capabilities
Features and capabilities supported by this server.
| Capability | Details |
|---|---|
| tools | `{ "listChanged": false }` |
| experimental | `{}` |
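`"listChanged": false` means the server never emits `notifications/tools/list_changed`, so a client can fetch the tool list once after `initialize` and cache it for the rest of the session. A sketch of checking that flag, again assuming the `mcp` Python SDK and the same hypothetical launch command:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(command="ollama-mcp-server")  # assumed entry point

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            init = await session.initialize()
            tools_cap = init.capabilities.tools
            # listChanged is false, so one fetch is enough.
            if tools_cap and not tools_cap.listChanged:
                print("Tool list is static; safe to cache for the session.")

asyncio.run(main())
```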
Tools
Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| list_local_models | List all locally installed Ollama models with details |
| local_llm_chat | Chat with a local Ollama model |
| ollama_health_check | Check Ollama server health and provide diagnostics |
| system_resource_check | Check system resources and compatibility |
| suggest_models | Suggest the best locally installed model for a specific task based on user needs |
| remove_model | Remove a model from local storage |
| start_ollama_server | Attempt to start the Ollama server if it's not running |
| select_chat_model | Present available models and help the user select one for chat |
| test_model_responsiveness | Test the responsiveness of a specific model by sending a simple prompt |
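As a concrete illustration, here is a sketch of invoking `local_llm_chat` from a client. The `model` and `message` argument names are assumptions; the authoritative parameter names are in the tool's `inputSchema` returned by `list_tools`:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(command="ollama-mcp-server")  # assumed entry point

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Argument names are illustrative; check the tool's inputSchema.
            result = await session.call_tool(
                "local_llm_chat",
                arguments={"model": "llama3.2", "message": "Hello!"},
            )
            for item in result.content:
                if item.type == "text":
                    print(item.text)

asyncio.run(main())
```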
Prompts
Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| No prompts | |
Resources
Contextual data attached and managed by the client.
| Name | Description |
|---|---|
| No resources | |
MCP directory API
We provide all the information about MCP servers via our MCP API.
```
curl -X GET 'https://glama.ai/api/mcp/v1/servers/paolodalprato/ollama-mcp-server'
```
If you have feedback or need assistance with the MCP directory API, please join our Discord server.
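The same endpoint can also be queried from Python with only the standard library; a minimal sketch that prints whatever JSON the directory returns:

```python
import json
import urllib.request

URL = "https://glama.ai/api/mcp/v1/servers/paolodalprato/ollama-mcp-server"

# Fetch and pretty-print this server's directory entry.
with urllib.request.urlopen(URL) as response:
    server_info = json.load(response)

print(json.dumps(server_info, indent=2))
```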