
MCP Ollama Consult Server

by Atomic-Germ
demo-output.txt (1.16 kB):
Connecting to stdio server (spawning 'node dist/index.js')...
Ollama Consult MCP server running on stdio
Connected. Listing tools...
Server tools:
- consult_ollama: Consult an Ollama model with a prompt and get its response for reasoning from another viewpoint.
- list_ollama_models: List all available Ollama models on the local instance.
Calling 'list_ollama_models'...
list_ollama_models result:
Available models: glm-4.6:cloud, gemma3:4b, gpt-oss:20b-cloud, deepseek-v3.1:671b-cloud, gpt-oss:120b-cloud, qwen3-coder:480b-cloud
Using model "glm-4.6:cloud" for consult_ollama (best-effort).
Calling 'consult_ollama' (with 15s timeout)...
consult_ollama result:
Instead of eliminating all jobs, AI is more likely to transform them by automating routine tasks. This will create a greater demand for uniquely human skills like creativity, critical thinking, and emotional intelligence. History shows that technological revolutions ultimately create new roles and industries, and the AI era will be no exception, generating new jobs focused on developing and managing these powerful systems. The nature of work will evolve, not disappear.
Closing client...

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Atomic-Germ/mcp-consult'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.