
Node Code Sandbox MCP

by mozicim

ai_generate

Generate text using Google Gemini AI within a secure Node.js sandbox environment. Provide prompts to create content, select models, and control response length for coding and development tasks.

Instructions

Generate text using Google Gemini. Provide a prompt and optional model name.

Input Schema

Name      | Required | Description                    | Default
----------|----------|--------------------------------|----------------------------
prompt    | Yes      | Prompt to send to Gemini       | —
model     | No       | Gemini model name              | models/gemini-2.0-flash-exp
maxTokens | No       | Maximum tokens in the response | —
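As a quick illustration of the schema, a tool call's arguments might look like the following sketch (the prompt text and token limit are made-up example values; the model string is the documented default):

```typescript
// Hypothetical arguments object for an ai_generate tool call,
// matching the input schema above.
const args = {
  prompt: "Write a haiku about TypeScript", // required
  model: "models/gemini-2.0-flash-exp",     // optional; this is the default
  maxTokens: 256,                           // optional
};

console.log(JSON.stringify(args));
```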

Implementation Reference

  • The async handler function that implements the ai_generate tool by calling the Google Gemini API with the provided prompt, model, and maxTokens.
    export default async function aiGenerate({
      prompt,
      model = 'models/gemini-2.0-flash-exp',
      maxTokens,
    }: {
      prompt: string;
      model?: string;
      maxTokens?: number;
    }): Promise<McpResponse> {
      const apiKey = process.env.GEMINI_API_KEY;
      if (!apiKey) {
        logger.error('GEMINI_API_KEY is not set in environment variables');
        return { content: [textContent('Error: Gemini API key not configured.')] };
      }
      try {
        const response = await fetch(
          'https://generativelanguage.googleapis.com/v1beta/models/' +
            encodeURIComponent(model) +
            ':generateContent?key=' +
            apiKey,
          {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({
              contents: [{ parts: [{ text: prompt }] }],
              ...(maxTokens
                ? { generationConfig: { maxOutputTokens: maxTokens } }
                : {}),
            }),
          }
        );
        if (!response.ok) {
          const errorText = await response.text();
          logger.error('Gemini API error', errorText);
          return { content: [textContent('Gemini API error: ' + errorText)] };
        }
        const data = await response.json();
        const text =
          data.candidates?.[0]?.content?.parts?.[0]?.text || '[No response]';
        return { content: [textContent(text)] };
      } catch (error) {
        logger.error('Failed to call Gemini API', error);
        return {
          content: [
            textContent(
              'Error calling Gemini: ' +
                (error instanceof Error ? error.message : String(error))
            ),
          ],
        };
      }
    }
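The handler relies on a few helpers (`McpResponse`, `textContent`, `logger`) that are defined elsewhere in the project and not shown here. Below is a minimal sketch of plausible shapes for the first two, assuming the standard MCP text-content envelope; the actual definitions in the repository may differ:

```typescript
// Assumed shapes for the helpers used by the handler above.
// These are illustrative, not the project's actual definitions.
type TextContent = { type: 'text'; text: string };
type McpResponse = { content: TextContent[] };

// Wrap a plain string in the MCP text-content envelope.
function textContent(text: string): TextContent {
  return { type: 'text', text };
}

const ok: McpResponse = { content: [textContent('hello')] };
console.log(JSON.stringify(ok));
```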
  • Zod schema defining the input arguments for the ai_generate tool: prompt (required), model (optional), maxTokens (optional).
    export const argSchema = {
      prompt: z.string().min(1).describe('Prompt to send to Gemini'),
      model: z
        .string()
        .optional()
        .default('models/gemini-2.0-flash-exp')
        .describe('Gemini model name'),
      maxTokens: z.number().optional().describe('Maximum tokens in the response'),
    };
  • src/server.ts:115-120 (registration)
    Registration of the 'ai_generate' tool on the MCP server, specifying name, description, argSchema, and handler from aiGenerate.ts.
    server.tool(
      'ai_generate',
      'Generate text using Google Gemini. Provide a prompt and optional model name.',
      (await import('./tools/aiGenerate.ts')).argSchema,
      (await import('./tools/aiGenerate.ts')).default
    );

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/mozicim/node-code-sandbox-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.