
gemini-bridge

by eLyiN

consult_gemini

Sends queries directly to the Gemini CLI via the MCP server and returns the AI-generated response. Prompts execute in a specified working directory, keeping interactions simple and stateless.

Instructions

Send a query directly to Gemini CLI. This is the core function: a direct bridge between Claude and Gemini. No caching, no sessions, no complexity; it simply executes the prompt and returns the result.

Args:
  • query: The question or prompt to send to Gemini
  • directory: Working directory (required)
  • model: Optional model name (flash, pro, etc.)

Returns: Gemini's response
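As an illustration, a client can invoke consult_gemini like any other MCP tool. Below is a minimal sketch using the official MCP Python SDK over stdio; the launch command ("gemini-bridge") and the example paths are assumptions for illustration, not taken from this page.

    import asyncio
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main():
        # Launch command is an assumption; use however you normally start the server.
        params = StdioServerParameters(command="gemini-bridge")
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                result = await session.call_tool(
                    "consult_gemini",
                    arguments={
                        "query": "Summarize the build setup in this project",
                        "directory": "/path/to/project",  # required working directory
                        "model": "flash",                 # optional; omit for the default
                    },
                )
                print(result)

    asyncio.run(main())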

Input Schema

Name        Required   Description                                Default
directory   Yes        Working directory for the query            —
model       No         Optional model name (flash, pro, etc.)     null
query       Yes        The question or prompt to send to Gemini   —

Input Schema (JSON Schema)

{ "properties": { "directory": { "title": "Directory", "type": "string" }, "model": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "Model" }, "query": { "title": "Query", "type": "string" } }, "required": [ "query", "directory" ], "title": "consult_geminiArguments", "type": "object" }


Related Tools

  • @jamubc/gemini-mcp-tool
  • @orzcls/gemini-mcp-tool-windows-fixed
  • @InfolabAI/gemini-cli-mcp
  • @fakoli/mcp-ai-bridge

MCP directory API

All of the information about MCP servers in this directory is available via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/eLyiN/gemini-bridge'
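The same endpoint can also be queried from Python. A minimal sketch with the requests library; the response shape is not documented on this page, so the code simply prints whatever JSON comes back:

    import requests

    url = "https://glama.ai/api/mcp/v1/servers/eLyiN/gemini-bridge"
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()  # fail loudly on HTTP errors
    print(resp.json())       # response fields are not assumed here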

If you have feedback or need assistance with the MCP directory API, please join our Discord server.