
mcp-server-ollama-deep-researcher

get_status

Check the current status of any ongoing research task run locally by the MCP server. No meaningful parameters are required; the schema's _dummy placeholder is the only input.

Instructions

Get the current status of any ongoing research

Input Schema

Name      Required   Description            Default
_dummy    Yes        No parameters needed   (none)
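Because the schema marks _dummy as required with a const value, a client must still pass the placeholder even though the tool uses no inputs. A minimal sketch of the JSON-RPC tools/call payload a client would send (the id value is arbitrary; field names follow the MCP tools/call convention):

```typescript
// Sketch of the JSON-RPC body an MCP client sends to invoke get_status.
// The _dummy argument exists only to satisfy the schema's "required" list
// and must equal the declared const value "dummy".
const getStatusRequest = {
  jsonrpc: "2.0",
  id: 1, // arbitrary request id
  method: "tools/call",
  params: {
    name: "get_status",
    arguments: { _dummy: "dummy" },
  },
};
```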

Implementation Reference

  • The handler function for the 'get_status' tool. If no research is in progress (currentResearch is unset), it returns a message saying so; otherwise it returns the detailed status, including topic, current step, loop count, summary, and sources.
    case "get_status": {
      if (!currentResearch) {
        return {
          content: [
            {
              type: "text",
              text: "No research is currently in progress.",
            },
          ],
        };
      }
      return {
        content: [
          {
            type: "text",
            text: `Research Status:
    Topic: ${currentResearch.topic}
    Current Step: ${currentResearch.currentStep}
    Loop Count: ${currentResearch.loopCount}
    Summary: ${currentResearch.summary}
    Sources: ${currentResearch.sources.join("\n")}`,
          },
        ],
      };
    }
  • The schema definition for the 'get_status' tool, including name, description, and inputSchema which requires a dummy parameter.
    {
      name: "get_status",
      description: "Get the current status of any ongoing research",
      inputSchema: {
        type: "object",
        properties: {
          _dummy: {
            type: "string",
            description: "No parameters needed",
            const: "dummy",
          },
        },
        required: ["_dummy"],
        additionalProperties: false,
      } as const,
    },
  • src/index.ts:96-154 (registration)
    The tool registration where 'get_status' is included in the list of available tools returned by the ListToolsRequestSchema handler.
    server.setRequestHandler(ListToolsRequestSchema, async () => {
      const tools = [
        {
          name: "research",
          description: "Research a topic using web search and LLM synthesis",
          inputSchema: {
            type: "object",
            properties: {
              topic: { type: "string", description: "The topic to research" },
            },
            required: ["topic"],
          },
        },
        {
          name: "get_status",
          description: "Get the current status of any ongoing research",
          inputSchema: {
            type: "object",
            properties: {
              _dummy: {
                type: "string",
                description: "No parameters needed",
                const: "dummy",
              },
            },
            required: ["_dummy"],
            additionalProperties: false,
          } as const,
        },
        {
          name: "configure",
          description: "Configure the research parameters (max loops, LLM model, search API)",
          inputSchema: {
            type: "object",
            properties: {
              maxLoops: { type: "number", description: "Maximum number of research loops (1-10)" },
              llmModel: { type: "string", description: "Ollama model to use (e.g. llama3.2)" },
              searchApi: {
                type: "string",
                enum: ["perplexity", "tavily", "exa"],
                description: "Search API to use for web research",
              },
            },
            required: [],
          },
        },
      ];
      return { tools };
    });
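The handler's formatting logic can be exercised in isolation. A minimal sketch, assuming a ResearchState shape inferred from the fields the handler reads (it is not the server's exported type):

```typescript
// Inferred shape of the server's research state. Assumption: these are the
// fields the get_status handler reads; the server's actual type may differ.
interface ResearchState {
  topic: string;
  currentStep: string;
  loopCount: number;
  summary: string;
  sources: string[];
}

// Mirrors the handler's two branches: a fixed message when nothing is
// running, otherwise a multi-line status report.
function formatStatus(currentResearch: ResearchState | null): string {
  if (!currentResearch) {
    return "No research is currently in progress.";
  }
  return `Research Status:
Topic: ${currentResearch.topic}
Current Step: ${currentResearch.currentStep}
Loop Count: ${currentResearch.loopCount}
Summary: ${currentResearch.summary}
Sources: ${currentResearch.sources.join("\n")}`;
}
```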


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Cam10001110101/mcp-server-ollama-deep-researcher'
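The same request can be made from TypeScript with the built-in fetch (available in Node 18+). A minimal sketch; the response shape is whatever the API returns and is not assumed here:

```typescript
// Build the directory API URL for this server and fetch its metadata.
const base = "https://glama.ai/api/mcp/v1/servers";
const owner = "Cam10001110101";
const serverName = "mcp-server-ollama-deep-researcher";
const url = `${base}/${owner}/${serverName}`;

async function fetchServerInfo(): Promise<unknown> {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json(); // shape defined by the MCP directory API
}
```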

If you have feedback or need assistance with the MCP directory API, please join our Discord server.