
deep-reasoning

Solve complex problems using advanced AI models like GPT-5 and Gemini 2.5 Pro. Leverage deep reasoning, Google Search integration, and custom parameters for precise, context-aware solutions.

Instructions

Use advanced AI models for deep reasoning and complex problem-solving. Supports GPT-5 for OpenAI/Azure and Gemini 2.5 Pro with Google Search.

Input Schema

Name            | Required | Description                                                             | Default
----------------|----------|-------------------------------------------------------------------------|--------
enableSearch    | No       | Enable Google Search for Gemini models                                  | true
maxOutputTokens | No       | Maximum tokens in response                                              |
model           | No       | Specific model to use (optional, will use provider default)            |
prompt          | Yes      | The complex question or problem requiring deep reasoning               |
provider        | No       | AI provider to use (defaults to Azure if configured, otherwise OpenAI) |
reasoningEffort | No       | Reasoning effort level (for certain reasoning models)                  | high
systemPrompt    | No       | System prompt to set context for reasoning                             |
temperature     | No       | Temperature for response generation                                    | 0.7
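
For example, a request that asks Gemini 2.5 Pro to reason with search grounding could pass arguments like the following. This is a minimal sketch: only the field names come from the schema above, while the prompt and token limit are invented for illustration.

    // Illustrative deep-reasoning arguments (values are examples, not required defaults)
    const args = {
      provider: "gemini",      // optional; omit to fall back to Azure (if configured) or OpenAI
      prompt: "Compare event sourcing and CRUD persistence for an audit-heavy billing system.",
      reasoningEffort: "high", // "low" | "medium" | "high"
      enableSearch: true,      // Gemini only; other providers ignore this flag
      maxOutputTokens: 2048,
    };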

Implementation Reference

  • Core handler function that executes the deep-reasoning tool: selects the preferred AI provider, builds a specialized system prompt, calls provider.generateText with reasoning parameters and optional search grounding for Gemini, and formats an MCP-compliant response with text content and usage metadata. The search-grounding decision is sketched in isolation after this list.
    async handleDeepReasoning(params: z.infer<typeof DeepReasoningSchema>) {
      // Use provided provider or get the preferred one (Azure if configured)
      const providerName = params.provider ||
        (await this.providerManager.getPreferredProvider(['openai', 'azure']));
      const provider = await this.providerManager.getProvider(providerName);

      // Build a comprehensive system prompt for deep reasoning
      const systemPrompt = params.systemPrompt ||
        `You are an expert AI assistant specializing in deep reasoning and complex problem-solving. Approach problems systematically, consider multiple perspectives, and provide thorough, well-reasoned responses. Break down complex problems into components, analyze each thoroughly, and synthesize insights.`;

      const response = await provider.generateText({
        prompt: params.prompt,
        model: params.model,
        temperature: params.temperature,
        maxOutputTokens: params.maxOutputTokens,
        systemPrompt,
        reasoningEffort: params.reasoningEffort,
        useSearchGrounding: providerName === "gemini" ? (params.enableSearch !== false) : false,
        toolName: 'deep-reasoning',
      });

      return {
        content: [
          {
            type: "text",
            text: response.text,
          },
        ],
        metadata: {
          provider: params.provider,
          model: response.model,
          usage: response.usage,
          ...response.metadata,
        },
      };
    }
  • src/server.ts:244-252 (registration)
    MCP server tool registration for 'deep-reasoning', defining the tool's metadata and input schema and lazily loading the handler via getHandlers(), which instantiates AIToolHandlers. A hedged example of invoking the registered tool from an MCP client follows this list.
    // Register deep-reasoning tool
    server.registerTool("deep-reasoning", {
      title: "Deep Reasoning",
      description: "Use advanced AI models for deep reasoning and complex problem-solving. Supports GPT-5 for OpenAI/Azure and Gemini 2.5 Pro with Google Search.",
      inputSchema: DeepReasoningSchema.shape,
    }, async (args) => {
      const aiHandlers = await getHandlers();
      return await aiHandlers.handleDeepReasoning(args);
    });
  • Zod input schema definition for the deep-reasoning tool, validating parameters such as provider selection, prompt, model override, temperature, token limit, custom system prompt, reasoning effort level, and Gemini search enablement. An example of the defaults applied at parse time follows this list.
    const DeepReasoningSchema = z.object({
      provider: z.enum(["openai", "gemini", "azure", "grok"]).optional()
        .describe("AI provider to use (defaults to Azure if configured, otherwise OpenAI)"),
      prompt: z.string().describe("The complex question or problem requiring deep reasoning"),
      model: z.string().optional().describe("Specific model to use (optional, will use provider default)"),
      temperature: z.number().min(0).max(2).optional().default(0.7)
        .describe("Temperature for response generation"),
      maxOutputTokens: z.number().positive().optional().describe("Maximum tokens in response"),
      systemPrompt: z.string().optional().describe("System prompt to set context for reasoning"),
      reasoningEffort: z.enum(["low", "medium", "high"]).optional().default("high")
        .describe("Reasoning effort level (for certain reasoning models)"),
      enableSearch: z.boolean().optional().default(true).describe("Enable Google Search for Gemini models"),
    });
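
The search-grounding branch in handleDeepReasoning is easy to misread, so here it is isolated as a standalone function. This is a sketch for illustration only; resolveSearchGrounding is not a function in the repository, and the example calls use invented inputs.

    // Mirrors the handler's useSearchGrounding expression: grounding applies only to Gemini,
    // and stays on unless the caller explicitly passes enableSearch: false.
    function resolveSearchGrounding(providerName: string, enableSearch?: boolean): boolean {
      return providerName === "gemini" ? enableSearch !== false : false;
    }

    resolveSearchGrounding("gemini", undefined); // true  (defaults to grounded)
    resolveSearchGrounding("gemini", false);     // false (explicitly disabled)
    resolveSearchGrounding("azure", true);       // false (non-Gemini providers are never grounded)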
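
Once the tool is registered, any MCP client can invoke it by name. The sketch below uses the official TypeScript SDK and assumes, purely for illustration, that the server can be launched over stdio with npx -y ultra-mcp; the prompt and argument values are likewise invented examples.

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Assumed launch command for the server; substitute however you actually run ultra-mcp.
    const transport = new StdioClientTransport({ command: "npx", args: ["-y", "ultra-mcp"] });
    const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });

    await client.connect(transport);

    // Call the registered tool by name with arguments matching DeepReasoningSchema.
    const result = await client.callTool({
      name: "deep-reasoning",
      arguments: {
        prompt: "Outline a migration plan from REST polling to server-sent events.",
        provider: "openai",
        reasoningEffort: "high",
      },
    });

    console.log(result.content);
    await client.close();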
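
Because temperature, reasoningEffort, and enableSearch carry Zod defaults, parsing a minimal argument object fills them in automatically. A small sketch (the prompt text is just an example):

    // Parsing with the schema above applies the declared defaults for omitted fields.
    const parsed = DeepReasoningSchema.parse({
      prompt: "Why does TCP slow start grow the congestion window exponentially?",
    });
    // parsed.temperature === 0.7
    // parsed.reasoningEffort === "high"
    // parsed.enableSearch === true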

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/RealMikeChong/ultra-mcp'
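
The same request in TypeScript, for Node 18+ or any runtime with the built-in fetch API (a sketch; the response is logged as-is because its exact shape is defined by the Glama API):

    // Equivalent of the curl command above.
    const res = await fetch("https://glama.ai/api/mcp/v1/servers/RealMikeChong/ultra-mcp");
    if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
    const server = await res.json();
    console.log(server);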

If you have feedback or need assistance with the MCP directory API, please join our Discord server.