
Gemini MCP Tool

ask-gemini

Analyze files or entire codebases, or request AI insights, using natural-language commands. Supports model selection and sandbox mode for secure execution through the Gemini CLI.

Instructions

Execute 'gemini -p <prompt>' to get Gemini AI's response. Use when:
1) The user asks for Gemini's opinion or analysis.
2) The user wants to analyze large files with the @file syntax.
3) The user uses the /gemini-cli command.
Supports the -m flag for model selection and the -s flag for sandbox testing.
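The mapping from tool inputs to CLI flags described above can be sketched as follows. `build_gemini_args` is a hypothetical helper written for illustration, not part of the tool's actual source:

```python
def build_gemini_args(prompt, model=None, sandbox=False):
    """Assemble the 'gemini' CLI argument list from the ask-gemini inputs.

    Hypothetical sketch of how the tool's parameters map onto the
    -m, -s, and -p flags; the real implementation may differ.
    """
    args = ["gemini"]
    if model:                      # -m selects a non-default model
        args += ["-m", model]
    if sandbox:                    # -s runs in an isolated sandbox
        args.append("-s")
    args += ["-p", prompt]         # -p carries the prompt (with optional @file refs)
    return args

print(build_gemini_args("@largefile.js explain what this does",
                        model="gemini-2.5-flash", sandbox=True))
# → ['gemini', '-m', 'gemini-2.5-flash', '-s', '-p', '@largefile.js explain what this does']
```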

Input Schema

Name     Required  Default  Description
model    No                 Optional model to use (e.g., 'gemini-2.5-flash'). If not specified, uses the default model (gemini-2.5-pro).
prompt   Yes                Analysis request. Use @ syntax to include files (e.g., '@largefile.js explain what this does') or ask general questions.
sandbox  No        false    Use sandbox mode (-s flag) to safely test code changes, execute scripts, or run potentially risky operations in an isolated environment.

Input Schema (JSON Schema)

{
  "properties": {
    "model": {
      "description": "Optional model to use (e.g., 'gemini-2.5-flash'). If not specified, uses the default model (gemini-2.5-pro).",
      "type": "string"
    },
    "prompt": {
      "description": "Analysis request. Use @ syntax to include files (e.g., '@largefile.js explain what this does') or ask general questions",
      "type": "string"
    },
    "sandbox": {
      "default": false,
      "description": "Use sandbox mode (-s flag) to safely test code changes, execute scripts, or run potentially risky operations in an isolated environment",
      "type": "boolean"
    }
  },
  "required": ["prompt"],
  "type": "object"
}
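A call payload can be checked against this schema before invoking the tool. The sketch below is a minimal hand-rolled check covering only the schema's "required" and primitive "type" keywords, rather than a full JSON Schema validator:

```python
# The ask-gemini input schema, reduced to the parts the sketch checks.
SCHEMA = {
    "type": "object",
    "required": ["prompt"],
    "properties": {
        "model": {"type": "string"},
        "prompt": {"type": "string"},
        "sandbox": {"type": "boolean", "default": False},
    },
}

TYPE_MAP = {"string": str, "boolean": bool}

def validate(payload):
    """Return a list of problems; an empty list means the payload is acceptable.

    Illustration only: checks required keys and primitive types, nothing more.
    """
    errors = [f"missing required property: {k}"
              for k in SCHEMA["required"] if k not in payload]
    for key, value in payload.items():
        spec = SCHEMA["properties"].get(key)
        if spec and not isinstance(value, TYPE_MAP[spec["type"]]):
            errors.append(f"{key}: expected {spec['type']}")
    return errors

print(validate({"prompt": "@largefile.js explain what this does", "sandbox": True}))
# → []
```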
MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/jamubc/gemini-mcp-tool'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.