
mcp-server-ollama-deep-researcher

configure

Set up research parameters such as max loops, LLM model, and search API to customize deep research tasks using the MCP server for AI-driven insights.

Instructions

Configure the research parameters (max loops, LLM model, search API)

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| llmModel | No | Ollama model to use (e.g. llama3.2) | |
| maxLoops | No | Maximum number of research loops (1-10) | |
| searchApi | No | Search API to use for web research | |
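All parameters are optional; calling `configure` with no arguments returns the current configuration. A call that updates the configuration might pass arguments like the following (the values shown are illustrative, not defaults):

```json
{
  "maxLoops": 3,
  "llmModel": "llama3.2",
  "searchApi": "tavily"
}
```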

Implementation Reference

  • Handler for the 'configure' tool. Validates and updates global research configuration (maxLoops, llmModel, searchApi) or returns current config.

```typescript
case "configure": {
  const newConfig = request.params.arguments;
  let configMessage = '';
  if (newConfig && Object.keys(newConfig).length > 0) {
    try {
      // Validate new configuration
      if (newConfig.maxLoops !== undefined) {
        if (typeof newConfig.maxLoops !== 'number' || newConfig.maxLoops < 1 || newConfig.maxLoops > 10) {
          throw new Error("maxLoops must be a number between 1 and 10");
        }
      }
      if (newConfig.searchApi !== undefined) {
        if (newConfig.searchApi !== 'perplexity' && newConfig.searchApi !== 'tavily' && newConfig.searchApi !== 'exa') {
          throw new Error("searchApi must be 'perplexity', 'tavily', or 'exa'");
        }
        // Validate API key for new search API
        validateApiKeys(newConfig.searchApi);
      }
      // Type guard to ensure properties match ResearchConfig
      const validatedConfig: Partial<ResearchConfig> = {};
      if (typeof newConfig.maxLoops === 'number') {
        validatedConfig.maxLoops = newConfig.maxLoops;
      }
      if (typeof newConfig.llmModel === 'string') {
        validatedConfig.llmModel = newConfig.llmModel;
      }
      if (newConfig.searchApi === 'perplexity' || newConfig.searchApi === 'tavily' || newConfig.searchApi === 'exa') {
        validatedConfig.searchApi = newConfig.searchApi;
      }
      config = { ...config, ...validatedConfig };
      configMessage = 'Research configuration updated:';
    } catch (error) {
      return {
        content: [
          {
            type: "text",
            text: `Configuration error: ${error instanceof Error ? error.message : String(error)}`,
          },
        ],
        isError: true,
      };
    }
  } else {
    configMessage = 'Current research configuration:';
  }
  return {
    content: [
      {
        type: "text",
        text: `${configMessage}
Max Loops: ${config.maxLoops}
LLM Model: ${config.llmModel}
Search API: ${config.searchApi}`,
      },
    ],
  };
}
```
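In isolation, the validation performed by the handler can be sketched as a standalone function. This is an illustrative extraction, not part of the server's source: the `ResearchConfig` shape and the 1-10 loop bound mirror the handler above, while the `validateApiKeys` environment check is omitted since it depends on API-key environment variables.

```typescript
// Illustrative sketch of the 'configure' handler's validation logic.
type SearchApi = "perplexity" | "tavily" | "exa";

interface ResearchConfig {
  maxLoops: number;
  llmModel: string;
  searchApi: SearchApi;
}

function validateConfig(newConfig: Record<string, unknown>): Partial<ResearchConfig> {
  const validated: Partial<ResearchConfig> = {};
  // maxLoops must be a number in [1, 10]
  if (newConfig.maxLoops !== undefined) {
    if (typeof newConfig.maxLoops !== "number" || newConfig.maxLoops < 1 || newConfig.maxLoops > 10) {
      throw new Error("maxLoops must be a number between 1 and 10");
    }
    validated.maxLoops = newConfig.maxLoops;
  }
  // llmModel is any Ollama model name; non-strings are silently ignored
  if (typeof newConfig.llmModel === "string") {
    validated.llmModel = newConfig.llmModel;
  }
  // searchApi must be one of the three supported providers
  if (newConfig.searchApi !== undefined) {
    if (newConfig.searchApi !== "perplexity" && newConfig.searchApi !== "tavily" && newConfig.searchApi !== "exa") {
      throw new Error("searchApi must be 'perplexity', 'tavily', or 'exa'");
    }
    validated.searchApi = newConfig.searchApi as SearchApi;
  }
  return validated;
}
```

Fields that are absent stay absent in the returned partial, which is why the handler can safely spread the result over the existing config.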
  • Input schema for the 'configure' tool defining optional parameters: maxLoops (number), llmModel (string), searchApi (enum).

```typescript
inputSchema: {
  type: "object",
  properties: {
    maxLoops: {
      type: "number",
      description: "Maximum number of research loops (1-10)"
    },
    llmModel: {
      type: "string",
      description: "Ollama model to use (e.g. llama3.2)"
    },
    searchApi: {
      type: "string",
      enum: ["perplexity", "tavily", "exa"],
      description: "Search API to use for web research"
    }
  },
  required: [],
},
```
  • src/index.ts:128-150 (registration)

    Registration of the 'configure' tool in the listTools handler, including name, description, and input schema.

```typescript
{
  name: "configure",
  description: "Configure the research parameters (max loops, LLM model, search API)",
  inputSchema: {
    type: "object",
    properties: {
      maxLoops: { type: "number", description: "Maximum number of research loops (1-10)" },
      llmModel: { type: "string", description: "Ollama model to use (e.g. llama3.2)" },
      searchApi: {
        type: "string",
        enum: ["perplexity", "tavily", "exa"],
        description: "Search API to use for web research"
      }
    },
    required: [],
  },
},
```


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Cam10001110101/mcp-server-ollama-deep-researcher'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.