
mcp-server-ollama-deep-researcher

MIT License
Platforms: Apple, Linux

configure

Set the research parameters (maximum research loops, Ollama LLM model, and search API) used to customize deep research tasks run through this MCP server.

Instructions

Configure the research parameters (max loops, LLM model, search API)

Input Schema

Name       Required  Description                              Default
llmModel   No        Ollama model to use (e.g. llama3.2)
maxLoops   No        Maximum number of research loops (1-5)
searchApi  No        Search API to use for web research

Input Schema (JSON Schema)

{ "properties": { "llmModel": { "description": "Ollama model to use (e.g. llama3.2)", "type": "string" }, "maxLoops": { "description": "Maximum number of research loops (1-5)", "type": "number" }, "searchApi": { "description": "Search API to use for web research", "enum": [ "perplexity", "tavily" ], "type": "string" } }, "required": [], "type": "object" }

MCP directory API

We provide all information about MCP servers via our MCP directory API. For example, to retrieve this server's metadata:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Cam10001110101/mcp-server-ollama-deep-researcher'
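
The same request can be made from TypeScript; this is a minimal sketch, and since the response shape is not documented on this page, the body is simply logged as-is.

// Fetch this server's metadata from the Glama MCP directory API.
const url =
  "https://glama.ai/api/mcp/v1/servers/Cam10001110101/mcp-server-ollama-deep-researcher";

const response = await fetch(url);
if (!response.ok) {
  throw new Error(`Request failed with status ${response.status}`);
}
const server = await response.json(); // shape not documented here
console.log(server);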

If you have feedback or need assistance with the MCP directory API, please join our Discord server.