generateWithGemini
Generate text content using Google's Gemini 2.5 Pro AI model through Claude Desktop, with options for temperature control, token limits, safe mode, and web search integration.
Instructions
Generate content with Gemini 2.5 Pro Experimental (beta API)
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| prompt | Yes | The prompt to send to Gemini | |
| temperature | No | Temperature (0.0 to 1.0) | 0.9 |
| maxTokens | No | Maximum output tokens | 32000 |
| safeMode | No | Enable safe mode for sensitive topics | false |
| useSearch | No | Enable Google Search grounding tool | false |
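
The defaults in the table come from the handler's parameter destructuring (src/index.ts:94). A minimal sketch of how they are applied — the `applyDefaults` helper is illustrative and not part of the project, which destructures inline:

```typescript
interface GenerationParams {
  prompt: string;
  temperature?: number;
  maxTokens?: number;
  safeMode?: boolean;
  useSearch?: boolean;
}

// Mirrors the handler's destructuring defaults: only `prompt` is required.
function applyDefaults(params: GenerationParams): Required<GenerationParams> {
  const {
    prompt,
    temperature = 0.9,
    maxTokens = 32000,
    safeMode = false,
    useSearch = false
  } = params;
  return { prompt, temperature, maxTokens, safeMode, useSearch };
}

const resolved = applyDefaults({ prompt: "Explain quantum entanglement", useSearch: true });
console.log(resolved.temperature, resolved.maxTokens, resolved.useSearch);
// → 0.9 32000 true
```

Any field the caller omits silently takes the handler's default, so a bare `{ prompt }` call is always valid input.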
Implementation Reference
- src/index.ts:94-306 (handler) — The core handler function for the generateWithGemini tool: it sends the prompt to the Gemini 2.5 Pro experimental beta API via fetch, optionally adds Google Search grounding, concatenates all text parts of the response, appends token-usage and search metadata, saves full responses to files for debugging, and returns structured content or an error message.

```typescript
async ({ prompt, temperature = 0.9, maxTokens = 32000, safeMode = false, useSearch = false }: GenerationParams, extra) => {
  console.log("Generating with Gemini 2.5 Pro, prompt:", prompt);
  try {
    log("Sending request to beta API: gemini-2.5-pro-exp-03-25");
    // Create request body with optional search tool
    const requestBody: RequestBody = {
      contents: [
        { role: "user", parts: [{ text: prompt }] }
      ],
      generationConfig: {
        temperature: temperature,
        topP: 1,
        topK: 64,
        maxOutputTokens: maxTokens
      }
    };
    // Add Google Search grounding tool if requested
    if (useSearch) {
      console.log("Adding Google Search grounding tool to request");
      requestBody.tools = [
        {
          googleSearch: {} // Empty config means use default settings
        }
      ];
    }
    // Enhanced logging for debugging
    console.log(`Request to beta API (${new Date().toISOString()}): ${betaApiEndpoint.substring(0, 100)}...`);
    try {
      // Use the non-streaming API
      const response = await fetch(betaApiEndpoint, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(requestBody)
      });
      console.log(`Response received (${new Date().toISOString()}), status: ${response.status}`);
      if (response.ok) {
        // Capture the full raw response text for debugging and to ensure nothing is lost
        const rawResponseText = await response.text();
        console.log(`Raw API response length: ${rawResponseText.length} characters`);
        // Save the raw response to a file for verification
        try {
          const fs = await import('fs');
          const responseDir = '/tmp/gemini-responses';
          if (!fs.existsSync(responseDir)) {
            fs.mkdirSync(responseDir, { recursive: true });
          }
          const timestamp = new Date().toISOString().replace(/:/g, '-');
          const rawFilename = `${responseDir}/raw-response-${timestamp}.json`;
          fs.writeFileSync(rawFilename, rawResponseText);
          console.log(`Raw API response saved to ${rawFilename}`);
        } catch (fsError) {
          console.error("Error saving raw response to file:", fsError);
        }
        if (DEBUG) {
          console.log("First 1000 chars of raw response:", rawResponseText.substring(0, 1000));
        }
        // Parse the JSON from the raw text to avoid any automatic processing/truncation
        const data = JSON.parse(rawResponseText);
        // Log response structure for debugging
        if (DEBUG) {
          console.log("API response structure:", JSON.stringify(data).substring(0, 1000) +
            (JSON.stringify(data).length > 1000 ? "..." : ""));
        }
        if (data.candidates && data.candidates[0]?.content?.parts) {
          // Extract and log the full API response for debugging
          console.log("Full API response structure:", JSON.stringify(Object.keys(data)).substring(0, 500));
          if (DEBUG) {
            console.log("First candidate structure:", JSON.stringify(Object.keys(data.candidates[0])).substring(0, 500));
            console.log("Content parts structure:", JSON.stringify(data.candidates[0].content.parts).substring(0, 500));
          }
          // Ensure we correctly extract all text content
          let text = "";
          // Process all parts that might contain text
          for (const part of data.candidates[0].content.parts) {
            if (part.text) {
              text += part.text;
            }
          }
          log("Success with Gemini 2.5 Pro Experimental!");
          console.log("USING MODEL: gemini-2.5-pro-exp-03-25");
          // Create token usage information if available
          let tokenInfo = "";
          if (data.usageMetadata) {
            const { promptTokenCount, candidatesTokenCount, totalTokenCount } = data.usageMetadata;
            tokenInfo = `\n\n[Token usage: ${promptTokenCount} prompt, ${candidatesTokenCount || 0} response, ${totalTokenCount} total]`;
          }
          // Create search grounding data if available
          let searchInfo = "";
          if (useSearch && data.candidates[0].groundingMetadata?.webSearchQueries) {
            const searchQueries = data.candidates[0].groundingMetadata.webSearchQueries;
            searchInfo = `\n\n[Search queries: ${searchQueries.join(", ")}]`;
          }
          // Check text length for debugging
          console.log(`Response text length: ${text.length} characters`);
          const fullText = text + tokenInfo + searchInfo;
          // Save the full response to a file for verification
          let savedFilename = "";
          try {
            const fs = await import('fs');
            const responseDir = '/tmp/gemini-responses';
            // Create directory if it doesn't exist
            if (!fs.existsSync(responseDir)) {
              fs.mkdirSync(responseDir, { recursive: true });
            }
            // Save the full response to a timestamped file
            const timestamp = new Date().toISOString().replace(/:/g, '-');
            savedFilename = `${responseDir}/response-${timestamp}.txt`;
            fs.writeFileSync(savedFilename, fullText);
            console.log(`Full response saved to ${savedFilename}`);
            // Also save metadata to a JSON file
            const metadataFilename = `${responseDir}/metadata-${timestamp}.json`;
            fs.writeFileSync(metadataFilename, JSON.stringify({
              responseLength: fullText.length,
              promptLength: prompt.length,
              useSearch: useSearch,
              timestamp: new Date().toISOString(),
              tokenInfo: data.usageMetadata || null,
              searchQueries: useSearch ? (data.candidates[0]?.groundingMetadata?.webSearchQueries || null) : null
            }, null, 2));
          } catch (fsError) {
            console.error("Error saving response to file:", fsError);
          }
          // Include the file path info in the response
          const fileInfo = savedFilename ? `\n\n[Complete response saved to: ${savedFilename}]` : "";
          // Return the full text directly without chunking
          // This gives Claude a chance to display the entire response if it can
          // Also includes the file path as a backup
          console.log(`Sending full response (${fullText.length} characters) with file info`);
          return {
            content: [{ type: "text", text: fullText + fileInfo }]
          };
        } else {
          console.error("Invalid API response format:", JSON.stringify(data).substring(0, 500));
          throw new Error("Invalid response format from beta API");
        }
      } else {
        // Try to parse error response
        let errorMessage = `HTTP error ${response.status}`;
        try {
          const errorData = await response.json();
          console.error("API error response:", JSON.stringify(errorData).substring(0, 500));
          errorMessage = `API error: ${JSON.stringify(errorData)}`;
        } catch (e) {
          // If we can't parse JSON, use text
          const errorText = await response.text();
          console.error("API error text:", errorText.substring(0, 500));
          errorMessage = `API error: ${errorText}`;
        }
        throw new Error(errorMessage);
      }
    } catch (fetchError: any) {
      console.error("Fetch error:", fetchError.name, fetchError.message);
      throw fetchError; // Re-throw to be caught by outer catch
    }
  } catch (error: any) {
    // Handle the case where error is undefined
    if (!error) {
      console.error("Undefined error caught in Gemini handler");
      return {
        content: [{
          type: "text",
          text: "The model took too long to respond or the connection was interrupted. This sometimes happens with complex topics. Please try again or rephrase your question."
        }],
        isError: true
      };
    }
    // Normal error handling for defined errors
    console.error("Error with Gemini 2.5 Pro:",
      error.name || "UnknownError",
      error.message || "No message",
      error.stack || "No stack"
    );
    return {
      content: [{
        type: "text",
        text: `Error using Gemini 2.5 Pro Experimental: ${error.name || 'Unknown'} - ${error.message || 'No error message'}`
      }],
      isError: true
    };
  }
}
```
- src/index.ts:88-93 (schema) — Zod schema for input validation of the generateWithGemini tool parameters.

```typescript
{
  prompt: z.string().describe("The prompt to send to Gemini"),
  temperature: z.number().optional().describe("Temperature (0.0 to 1.0)"),
  maxTokens: z.number().optional().describe("Maximum output tokens"),
  safeMode: z.boolean().optional().describe("Enable safe mode for sensitive topics"),
  useSearch: z.boolean().optional().describe("Enable Google Search grounding tool")
},
```
- src/index.ts:84-307 (registration) — Registration of the generateWithGemini tool on the MCP server with its name, description, input schema, and handler function. The handler body is identical to the src/index.ts:94-306 excerpt above and is elided here:

```typescript
server.tool(
  "generateWithGemini",
  "Generate content with Gemini 2.5 Pro Experimental (beta API)",
  {
    prompt: z.string().describe("The prompt to send to Gemini"),
    temperature: z.number().optional().describe("Temperature (0.0 to 1.0)"),
    maxTokens: z.number().optional().describe("Maximum output tokens"),
    safeMode: z.boolean().optional().describe("Enable safe mode for sensitive topics"),
    useSearch: z.boolean().optional().describe("Enable Google Search grounding tool")
  },
  async ({ prompt, temperature = 0.9, maxTokens = 32000, safeMode = false, useSearch = false }: GenerationParams, extra) => {
    // ... handler body as shown in the src/index.ts:94-306 excerpt ...
  }
);
```
- src/types.ts:13-19 (schema) — TypeScript interface for the input parameters of generateWithGemini, matching the Zod schema and used in the handler's type annotation.

```typescript
export interface GenerationParams {
  prompt: string;
  temperature?: number;
  maxTokens?: number;
  safeMode?: boolean;
  useSearch?: boolean;
}
```
- src/types.ts:21-37 (schema) — TypeScript interface for the request body sent to the Gemini API.

```typescript
export interface RequestBody {
  contents: Array<{
    role: string;
    parts: Array<{
      text: string;
    }>;
  }>;
  generationConfig: {
    temperature: number;
    topP: number;
    topK: number;
    maxOutputTokens: number;
  };
  tools?: Array<{
    googleSearch?: Record<string, unknown>;
  }>;
}
```
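
The RequestBody shape can be exercised with a short sketch. The `buildRequestBody` helper is illustrative (the project builds the body inline in the handler); the fixed `topP: 1` and `topK: 64` values mirror the handler's generationConfig:

```typescript
interface RequestBody {
  contents: Array<{ role: string; parts: Array<{ text: string }> }>;
  generationConfig: {
    temperature: number;
    topP: number;
    topK: number;
    maxOutputTokens: number;
  };
  tools?: Array<{ googleSearch?: Record<string, unknown> }>;
}

// Illustrative helper mirroring the handler's request construction.
function buildRequestBody(prompt: string, temperature: number, maxTokens: number, useSearch: boolean): RequestBody {
  const body: RequestBody = {
    contents: [{ role: "user", parts: [{ text: prompt }] }],
    generationConfig: { temperature, topP: 1, topK: 64, maxOutputTokens: maxTokens }
  };
  if (useSearch) {
    // An empty googleSearch config means "use default search settings".
    body.tools = [{ googleSearch: {} }];
  }
  return body;
}

const body = buildRequestBody("What changed in Node 22?", 0.9, 32000, true);
console.log(JSON.stringify(body.tools));
// → [{"googleSearch":{}}]
```

Note that `tools` is only attached when search grounding is requested; a plain generation request omits the key entirely rather than sending an empty array.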
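
On the response side, the handler concatenates every text part of the first candidate and appends a token-usage footer when `usageMetadata` is present. A minimal sketch of that post-processing — the `GeminiResponse` interface and `extractText` name are illustrative, not from the project:

```typescript
interface GeminiResponse {
  candidates?: Array<{
    content?: { parts?: Array<{ text?: string }> };
  }>;
  usageMetadata?: {
    promptTokenCount: number;
    candidatesTokenCount?: number;
    totalTokenCount: number;
  };
}

// Concatenate all text parts, then append the same token-usage footer
// format the handler uses.
function extractText(data: GeminiResponse): string {
  let text = "";
  for (const part of data.candidates?.[0]?.content?.parts ?? []) {
    if (part.text) text += part.text;
  }
  if (data.usageMetadata) {
    const { promptTokenCount, candidatesTokenCount, totalTokenCount } = data.usageMetadata;
    text += `\n\n[Token usage: ${promptTokenCount} prompt, ${candidatesTokenCount || 0} response, ${totalTokenCount} total]`;
  }
  return text;
}

const sample: GeminiResponse = {
  candidates: [{ content: { parts: [{ text: "Hello, " }, { text: "world." }] } }],
  usageMetadata: { promptTokenCount: 4, candidatesTokenCount: 3, totalTokenCount: 7 }
};
console.log(extractText(sample));
// → "Hello, world." followed by the [Token usage: ...] footer
```

Looping over all parts matters because the API may split one reply across several text parts; reading only `parts[0].text` would silently truncate long responses.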