
investigate

Analyze topics with configurable depth using AI models and web search to gather comprehensive information for research and decision-making.

Instructions

Investigate topics thoroughly with configurable depth

Input Schema

| Name         | Required | Description                                                                    | Default |
| ------------ | -------- | ------------------------------------------------------------------------------ | ------- |
| provider     | No       | AI provider to use (defaults to Azure if configured, otherwise best available) |         |
| topic        | Yes      | The topic or question to investigate                                           |         |
| depth        | No       | Investigation depth                                                            | deep    |
| model        | No       | Specific model to use                                                          |         |
| enableSearch | No       | Enable web search for investigation (Gemini only)                              | true    |

Implementation Reference

  • The core handler function that executes the 'investigate' tool. It selects an AI provider, constructs depth-specific prompts and system instructions, generates a response using the provider's generateText method, and returns formatted content with metadata.
    async handleInvestigation(params: z.infer<typeof InvestigationSchema>) {
      // Use provided provider or get the preferred one (Azure if configured)
      const providerName =
        params.provider ||
        (await this.providerManager.getPreferredProvider(['openai', 'gemini', 'azure', 'grok']));
      const provider = await this.providerManager.getProvider(providerName);

      // Build investigation prompts based on depth
      const depthPrompts = {
        shallow: "Provide a brief overview and key points about",
        medium: "Investigate and analyze the following topic, covering main aspects and implications",
        deep: "Conduct a thorough investigation of the following topic, exploring all relevant angles, implications, evidence, and potential conclusions. Be comprehensive and systematic",
      };

      const systemPrompt = `You are an expert investigator and analyst. Your task is to thoroughly investigate topics, gather relevant information, analyze patterns, and provide comprehensive insights. ${
        params.provider === "gemini" && params.enableSearch
          ? "Use web search to find current and relevant information."
          : ""
      }`;

      const prompt = `${depthPrompts[params.depth]}: ${params.topic}`;

      const response = await provider.generateText({
        prompt,
        model: params.model,
        systemPrompt,
        reasoningEffort:
          providerName === "openai" || providerName === "azure" || providerName === "grok"
            ? "high"
            : undefined,
        useSearchGrounding: providerName === "gemini" ? params.enableSearch !== false : false,
        temperature: 0.5, // Lower temperature for investigation
      });

      return {
        content: [
          {
            type: "text",
            text: response.text,
          },
        ],
        metadata: {
          provider: providerName,
          model: response.model,
          investigationDepth: params.depth,
          usage: response.usage,
          ...response.metadata,
        },
      };
    }
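For illustration, here is how the depth prefixes above combine with a topic into the final prompt. `buildPrompt` is a stand-in for the inline template in the handler, not a function that exists in ultra-mcp:

```typescript
// The same depth-specific prefixes used by the handler above.
const depthPrompts = {
  shallow: "Provide a brief overview and key points about",
  medium: "Investigate and analyze the following topic, covering main aspects and implications",
  deep: "Conduct a thorough investigation of the following topic, exploring all relevant angles, implications, evidence, and potential conclusions. Be comprehensive and systematic",
} as const;

type Depth = keyof typeof depthPrompts;

// Illustrative helper mirroring `${depthPrompts[params.depth]}: ${params.topic}`.
function buildPrompt(depth: Depth, topic: string): string {
  return `${depthPrompts[depth]}: ${topic}`;
}

const shallowPrompt = buildPrompt("shallow", "WebAssembly GC");
// → "Provide a brief overview and key points about: WebAssembly GC"
```

Only the prefix varies by depth; the topic is appended unchanged, so the model sees the caller's wording verbatim.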
  • src/server.ts:255-262 (registration)
    MCP server registration of the 'investigate' tool, including title, description, input schema, and the handler invocation via getHandlers().
    server.registerTool(
      "investigate",
      {
        title: "Investigate",
        description: "Investigate topics thoroughly with configurable depth",
        inputSchema: InvestigationSchema.shape,
      },
      async (args) => {
        const aiHandlers = await getHandlers();
        return await aiHandlers.handleInvestigation(args);
      }
    );
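The registration pattern above (a tool name mapped to a handler, dispatched by name) can be sketched in miniature. `registry` and `registerTool` below are simplified stand-ins, not the real MCP SDK API, and the handler is synchronous only to keep the sketch self-contained (the real one is async and delegates to `handleInvestigation`):

```typescript
// A toy tool registry: names map to handlers, and calls dispatch by name.
type ToolHandler = (args: Record<string, unknown>) => string;

const registry = new Map<string, ToolHandler>();

function registerTool(name: string, handler: ToolHandler): void {
  registry.set(name, handler);
}

// Register a trivial handler under the same tool name.
registerTool("investigate", (args) => `investigating: ${String(args.topic)}`);

// Dispatch by tool name, as the MCP server does when a client calls the tool.
const dispatched = registry.get("investigate")!({ topic: "MCP" });
// → "investigating: MCP"
```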
  • Zod schema defining the input parameters for the investigate tool: provider, topic (required), depth, model, enableSearch.
    const InvestigationSchema = z.object({
      provider: z
        .enum(["openai", "gemini", "azure", "grok"])
        .optional()
        .describe("AI provider to use (defaults to Azure if configured, otherwise best available)"),
      topic: z.string().describe("The topic or question to investigate"),
      depth: z.enum(["shallow", "medium", "deep"]).default("deep").describe("Investigation depth"),
      model: z.string().optional().describe("Specific model to use"),
      enableSearch: z
        .boolean()
        .optional()
        .default(true)
        .describe("Enable web search for investigation (Gemini only)"),
    });
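As a rough illustration of what those defaults mean for callers, this plain-TypeScript sketch applies the same fallbacks the Zod schema does (`depth` → `"deep"`, `enableSearch` → `true`). `applyDefaults` and `InvestigationParams` are hypothetical names for this sketch, not part of ultra-mcp:

```typescript
// Mirrors the shape of the Zod schema's input, with optional fields.
interface InvestigationParams {
  provider?: "openai" | "gemini" | "azure" | "grok";
  topic: string;
  depth?: "shallow" | "medium" | "deep";
  model?: string;
  enableSearch?: boolean;
}

// Applies the same defaults that .default("deep") and .default(true) would.
function applyDefaults(params: InvestigationParams) {
  return {
    ...params,
    depth: params.depth ?? "deep",
    enableSearch: params.enableSearch ?? true,
  };
}

const normalized = applyDefaults({ topic: "quantum error correction" });
// → { topic: "quantum error correction", depth: "deep", enableSearch: true }
```

In other words, a caller who supplies only `topic` gets the most thorough investigation mode with search enabled.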
  • Duplicate Zod schema defined in server.ts and used as the tool-registration inputSchema; it is identical, line for line, to the InvestigationSchema shown above.
  • Helper function that lazily initializes and returns the AIToolHandlers instance, setting up config, providers, and environment variables.
    async function getHandlers() {
      if (!handlers) {
        const { ConfigManager } = require("./config/manager");
        const { ProviderManager } = require("./providers/manager");
        const { AIToolHandlers } = require("./handlers/ai-tools");
        const configManager = new ConfigManager();

        // Load config and set environment variables
        const config = await configManager.getConfig();
        if (config.openai?.apiKey) {
          process.env.OPENAI_API_KEY = config.openai.apiKey;
        }
        if (config.openai?.baseURL) {
          process.env.OPENAI_BASE_URL = config.openai.baseURL;
        }
        if (config.google?.apiKey) {
          process.env.GOOGLE_API_KEY = config.google.apiKey;
        }
        if (config.google?.baseURL) {
          process.env.GOOGLE_BASE_URL = config.google.baseURL;
        }
        if (config.azure?.apiKey) {
          process.env.AZURE_API_KEY = config.azure.apiKey;
        }
        if (config.azure?.baseURL) {
          process.env.AZURE_BASE_URL = config.azure.baseURL;
        }
        if (config.xai?.apiKey) {
          process.env.XAI_API_KEY = config.xai.apiKey;
        }
        if (config.xai?.baseURL) {
          process.env.XAI_BASE_URL = config.xai.baseURL;
        }
        providerManager = new ProviderManager(configManager);
        handlers = new AIToolHandlers(providerManager);
      }
      return handlers;
    }
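The lazy-initialization pattern this helper relies on can be reduced to a small sketch: the expensive setup runs once, and later calls reuse the cached instance. `makeHandlers` and `initCount` are illustrative stand-ins; the real initializer is async and additionally loads config and providers:

```typescript
// Module-level cache, as with the `handlers` variable above.
let cached: { name: string } | undefined;
let initCount = 0;

// Stands in for config loading and provider/handler construction.
function makeHandlers(): { name: string } {
  initCount += 1;
  return { name: "AIToolHandlers" };
}

// Initialize on first call; return the cached instance afterwards.
function getHandlersSketch(): { name: string } {
  if (!cached) {
    cached = makeHandlers();
  }
  return cached;
}

const first = getHandlersSketch();
const second = getHandlersSketch();
// first === second, and makeHandlers ran exactly once (initCount === 1)
```

Deferring setup this way means the server starts quickly and only pays the configuration cost when a tool is actually invoked.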
