
gemini-bridge

by eLyiN

consult_gemini_with_files

Query Gemini AI with file attachments so that file content is included directly in the prompt. Specify a working directory and file paths for context, and receive an AI-generated response.

Instructions

Send a query to Gemini CLI with file attachments. Files are read and concatenated into the prompt. Simple and direct.

Args:
    query: The question or prompt to send to Gemini
    directory: Working directory (required)
    files: List of file paths to attach (relative to directory)
    model: Optional model name (flash, pro, etc.)

Returns:
    Gemini's response with file context
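Under the MCP protocol, a client invokes a tool like this with a JSON-RPC `tools/call` request. A minimal sketch of such a payload follows; the query, directory, and file names are illustrative, not taken from the server's documentation:

```python
import json

# Hypothetical tools/call request for consult_gemini_with_files.
# The query, directory, files, and model values are illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "consult_gemini_with_files",
        "arguments": {
            "query": "Summarize what this module does.",
            "directory": "/home/user/project",  # required
            "files": ["src/main.py"],           # relative to directory
            "model": "flash",                   # optional
        },
    },
}

payload = json.dumps(request)
print(payload)
```

The server reads each listed file from the working directory, concatenates its contents into the prompt, and forwards the combined text to the Gemini CLI.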

Input Schema

Name       Required  Description                                           Default
directory  Yes       Working directory                                     —
files      No        List of file paths to attach (relative to directory)  null
model      No        Optional model name (flash, pro, etc.)                null
query      Yes       The question or prompt to send to Gemini              —

Input Schema (JSON Schema)

{
  "properties": {
    "directory": {
      "title": "Directory",
      "type": "string"
    },
    "files": {
      "anyOf": [
        {
          "items": { "type": "string" },
          "type": "array"
        },
        { "type": "null" }
      ],
      "default": null,
      "title": "Files"
    },
    "model": {
      "anyOf": [
        { "type": "string" },
        { "type": "null" }
      ],
      "default": null,
      "title": "Model"
    },
    "query": {
      "title": "Query",
      "type": "string"
    }
  },
  "required": ["query", "directory"],
  "title": "consult_gemini_with_filesArguments",
  "type": "object"
}
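Per the schema, only `query` and `directory` are required, while `files` and `model` default to null. A minimal sketch of checking arguments against the schema's `required` list before sending a call (this is an illustration, not the server's actual validation logic):

```python
# Trimmed copy of the input schema above; only the fields needed for a
# required-property check are kept.
schema = {
    "required": ["query", "directory"],
    "properties": {"directory": {}, "files": {}, "model": {}, "query": {}},
}

def missing_required(arguments, schema):
    """Return the names of required properties absent from the arguments."""
    return [name for name in schema["required"] if name not in arguments]

# A call with only a query is rejected: "directory" is missing.
print(missing_required({"query": "Summarize this file."}, schema))
```

A full JSON Schema validator would also check the declared types (for example, that `files` is an array of strings or null), but the required-field check covers the most common client mistake.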

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/eLyiN/gemini-bridge'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.