MCP Ollama Consult Server
Server Configuration
Describes the environment variables used to configure the server.
| Name | Required | Description | Default |
|---|---|---|---|
| OLLAMA_BASE_URL | No | The Ollama endpoint URL | http://localhost:11434 |
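To point the server at a remote or non-standard Ollama instance, set OLLAMA_BASE_URL before launching it. The sketch below, assuming a standard Ollama installation, resolves the URL the same way the server does and queries Ollama's /api/tags endpoint to confirm the instance is reachable:

```python
import json
import os
import urllib.request

# Resolve the endpoint the same way the server does: use
# OLLAMA_BASE_URL if set, otherwise fall back to the default.
base_url = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")

# /api/tags is Ollama's standard endpoint for listing installed models.
with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
    models = json.load(resp)["models"]

print(f"Ollama reachable at {base_url}; {len(models)} model(s) installed:")
for model in models:
    print(" -", model["name"])
```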
Capabilities
Features and capabilities supported by this server.
| Capability | Details |
|---|---|
| tools | {} (supported; no optional features declared) |
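Declaring the tools capability means any MCP client can enumerate and invoke this server's tools through the standard tools/list and tools/call requests. Here is a minimal discovery sketch using the official MCP Python SDK; the launch command (node build/index.js) is an assumption, so substitute however you actually run mcp-consult:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # How the server is launched is an assumption here; substitute
    # the actual command for your checkout of mcp-consult.
    params = StdioServerParameters(
        command="node",
        args=["build/index.js"],
        env={"OLLAMA_BASE_URL": "http://localhost:11434"},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # tools/list: enumerate everything the server exposes.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)


asyncio.run(main())
```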
Tools
Functions the LLM can invoke to take actions.
| Name | Description |
|---|---|
| consult_ollama | Consult with Ollama AI models for architectural decisions, code reviews, and design discussions. Supports sequential chaining of consultations for complex multi-step reasoning. |
| list_ollama_models | List all available Ollama models on the local system (installed or cloud-based) |
| compare_ollama_responses | Compare responses from multiple Ollama models on the same prompt to get diverse perspectives |
| remember_context | Store context for use in future consultations within the same session |
| sequential_consultation_chain | Run a sequence of consultations where each consultant builds on previous responses, enabling complex multi-step reasoning. |
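For illustration, here is a hedged sketch of invoking consult_ollama with the MCP Python SDK. The argument names (prompt, model) and the launch command are assumptions; the authoritative input schema for each tool comes from the server's tools/list response:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch command is an assumption; use your actual invocation.
    params = StdioServerParameters(command="node", args=["build/index.js"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Argument names below are assumptions; check the
            # inputSchema returned by tools/list for the real shape.
            result = await session.call_tool(
                "consult_ollama",
                arguments={
                    "prompt": "Review this design: one queue between API and workers.",
                    "model": "llama3.2",
                },
            )
            # Tool results arrive as content blocks; print the text ones.
            for block in result.content:
                if block.type == "text":
                    print(block.text)


asyncio.run(main())
```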
Prompts
Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| No prompts | |
Resources
Contextual data attached and managed by the client.
| Name | Description |
|---|---|
| No resources | |
MCP directory API
We provide all the information about MCP servers via our MCP directory API. For example:
`curl -X GET 'https://glama.ai/api/mcp/v1/servers/Atomic-Germ/mcp-consult'`
If you have feedback or need assistance with the MCP directory API, please join our Discord server.