ask_gemini
Analyze large codebases or complex projects using Google Gemini's 1M-token context window for architecture design, code review, or comprehensive context analysis.
Instructions
Use Gemini for large context analysis (1M tokens), architecture design, or whole codebase review. Best for tasks requiring understanding of entire projects.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| prompt | Yes | The question or task for Gemini | |
| context | No | Optional: Large codebase, multiple files, or extensive context to analyze | |
| model | No | Model to use: 'flash' (default, free, fast) or 'pro' (3 Pro, latest model, better quality, paid) | flash |
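
For illustration, here is a minimal sketch of calling this tool from an MCP client, assuming the `@modelcontextprotocol/sdk` client package over a stdio transport; the client name, the `node index.js` launch command, and the example arguments are assumptions for this sketch, not values taken from the repository.

```javascript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Hypothetical client setup; command/args point at this MCP server's entry point.
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(
  new StdioClientTransport({ command: "node", args: ["index.js"] })
);

// Invoke ask_gemini with the parameters defined in the table above.
const result = await client.callTool({
  name: "ask_gemini",
  arguments: {
    prompt: "Review this codebase and describe its architecture.",
    context: "<concatenated source files>", // optional large context
    model: "flash", // or "pro" for higher quality (paid)
  },
});
console.log(result.content[0].text);
```

The response text comes back in `result.content[0].text`, prefixed with the model label added by the handler (see the implementation reference below).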
Implementation Reference
- index.js:152-183 (handler): Handler for executing the `ask_gemini` tool. It parses the arguments, selects the Gemini model (`gemini-3-pro-preview` or `gemini-2.5-flash`), builds the prompt with the optional context, generates content, and returns the response. (See the initialization sketch after this list.)

```javascript
if (name === "ask_gemini") {
  const { prompt, context, model = "flash" } = args;

  // Model selection
  const modelName =
    model === "pro"
      ? "gemini-3-pro-preview" // Gemini 3 Pro (latest model, released November 2025)
      : "gemini-2.5-flash"; // 2.5 Flash (free)

  const geminiModel = genAI.getGenerativeModel({ model: modelName });

  // Build the prompt
  const fullPrompt = context
    ? `Context/Codebase:\n\`\`\`\n${context}\n\`\`\`\n\nTask: ${prompt}`
    : prompt;

  // Call Gemini
  const result = await geminiModel.generateContent(fullPrompt);
  const response = await result.response;
  const text = response.text();

  return {
    content: [
      {
        type: "text",
        text: `[Gemini ${
          model === "pro" ? "3.0 Pro" : "2.5 Flash"
        }]\n\n${text}`,
      },
    ],
  };
}
```
- index.js:41-67 (registration): Registration of the `ask_gemini` tool in the ListTools response, including its name, description, and input schema definition. (See the server wiring sketch after this list.)

```javascript
{
  name: "ask_gemini",
  description:
    "Use Gemini for large context analysis (1M tokens), architecture design, or whole codebase review. Best for tasks requiring understanding of entire projects.",
  inputSchema: {
    type: "object",
    properties: {
      prompt: {
        type: "string",
        description: "The question or task for Gemini",
      },
      context: {
        type: "string",
        description:
          "Optional: Large codebase, multiple files, or extensive context to analyze",
      },
      model: {
        type: "string",
        description:
          "Model to use: 'flash' (default, free, fast) or 'pro' (3 Pro, latest model, better quality, paid)",
        enum: ["flash", "pro"],
        default: "flash",
      },
    },
    required: ["prompt"],
  },
},
```
- index.js:45-66 (schema): Input schema for the `ask_gemini` tool defining its parameters: `prompt` (required), `context` (optional), and `model` (defaults to `'flash'`). The schema appears inline as the `inputSchema` field of the registration snippet above.
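
The handler above references a `genAI` client created elsewhere in index.js. A minimal sketch of that initialization, assuming the `@google/generative-ai` SDK and an API key read from a `GEMINI_API_KEY` environment variable (the actual variable name in this project may differ):

```javascript
import { GoogleGenerativeAI } from "@google/generative-ai";

// Assumed initialization; index.js may read the key differently.
const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY);
```

Similarly, a hedged sketch of how a tool definition like the registration object above is typically exposed by a server built on `@modelcontextprotocol/sdk`; the server metadata and the `askGeminiTool` variable are placeholders, not names taken from index.js:

```javascript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import {
  ListToolsRequestSchema,
  CallToolRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

// Placeholder for the registration object shown above (index.js:41-67).
const askGeminiTool = { name: "ask_gemini" /* description, inputSchema as above */ };

// Hypothetical server metadata; the real values live in index.js.
const server = new Server(
  { name: "gemini-mcp", version: "1.0.0" },
  { capabilities: { tools: {} } }
);

// The registration object is returned from the ListTools handler.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [askGeminiTool],
}));

// The handler snippet (index.js:152-183) runs inside the CallTool handler.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { name, arguments: args } = request.params;
  // ... ask_gemini branch from the handler reference above ...
  throw new Error(`Unknown tool: ${name}`);
});
```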