Glama

ask_gemini

Analyze large codebases or complex projects using Google Gemini's 1M token capacity for architecture design, code review, or comprehensive context analysis.

Instructions

Use Gemini for large context analysis (1M tokens), architecture design, or whole codebase review. Best for tasks requiring understanding of entire projects.

Input Schema

prompt (required): The question or task for Gemini
context (optional): Large codebase, multiple files, or extensive context to analyze
model (optional, default "flash"): 'flash' (default, free, fast) or 'pro' (Gemini 3 Pro, latest model, better quality, paid)
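
A call against this schema follows the standard MCP `tools/call` JSON-RPC shape. The sketch below is illustrative (the argument values are hypothetical); only `prompt` is required, and `model` falls back to "flash" on the server side when omitted:

```javascript
// Hypothetical MCP "tools/call" request body for the ask_gemini tool.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "ask_gemini",
    arguments: {
      prompt: "Review this module for concurrency bugs",
      context: "/* file contents pasted here */",
      model: "flash", // or "pro" for gemini-3-pro-preview
    },
  },
};

// "context" and "model" may be dropped entirely; the server
// defaults model to "flash" per the schema above.
console.log(request.params.arguments.model);
```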

Implementation Reference

  • Handler for executing the 'ask_gemini' tool. Parses arguments, selects Gemini model ('gemini-3-pro-preview' or 'gemini-2.5-flash'), constructs prompt with optional context, generates content, and returns the response.
    if (name === "ask_gemini") {
      const { prompt, context, model = "flash" } = args;

      // Select the model
      const modelName =
        model === "pro"
          ? "gemini-3-pro-preview" // Gemini 3 Pro (latest model, released November 2025)
          : "gemini-2.5-flash"; // 2.5 Flash (free)
      const geminiModel = genAI.getGenerativeModel({ model: modelName });

      // Build the prompt
      const fullPrompt = context
        ? `Context/Codebase:\n\`\`\`\n${context}\n\`\`\`\n\nTask: ${prompt}`
        : prompt;

      // Call Gemini
      const result = await geminiModel.generateContent(fullPrompt);
      const response = await result.response;
      const text = response.text();

      return {
        content: [
          {
            type: "text",
            text: `[Gemini ${model === "pro" ? "3.0 Pro" : "2.5 Flash"}]\n\n${text}`,
          },
        ],
      };
    }
  • index.js:41-67 (registration)
    Registration of the 'ask_gemini' tool in the ListTools response, including name, description, and input schema definition.
    {
      name: "ask_gemini",
      description:
        "Use Gemini for large context analysis (1M tokens), architecture design, or whole codebase review. Best for tasks requiring understanding of entire projects.",
      inputSchema: {
        type: "object",
        properties: {
          prompt: {
            type: "string",
            description: "The question or task for Gemini",
          },
          context: {
            type: "string",
            description: "Optional: Large codebase, multiple files, or extensive context to analyze",
          },
          model: {
            type: "string",
            description: "Model to use: 'flash' (default, free, fast) or 'pro' (3 Pro, latest model, better quality, paid)",
            enum: ["flash", "pro"],
            default: "flash",
          },
        },
        required: ["prompt"],
      },
    },


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Yoon-jongho/claude-to-gemini'
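
The same lookup can be made from Node 18+ (which ships a built-in `fetch`); this is a sketch, and the shape of the returned JSON is not specified here:

```javascript
// Node 18+ sketch: same request as the curl example above.
const url =
  "https://glama.ai/api/mcp/v1/servers/Yoon-jongho/claude-to-gemini";

async function getServerInfo() {
  const res = await fetch(url); // network call
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json(); // response shape depends on the Glama API
}
```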

If you have feedback or need assistance with the MCP directory API, please join our Discord server.