
Server Configuration

Describes the environment variables required to run the server.


No arguments

Tools

Functions exposed to the LLM to take actions

consult_gemini

Send a query directly to the Gemini CLI.

Args:
    query: Prompt text forwarded verbatim to the CLI.
    directory: Working directory used for command execution.
    model: Optional model alias (``flash``, ``pro``) or full Gemini model id.
    timeout_seconds: Optional per-call timeout override in seconds.

Returns:
    Gemini's response text or an explanatory error string.
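For orientation, here is a minimal sketch of invoking consult_gemini from the official Python MCP SDK (the ``mcp`` package). The ``gemini-bridge`` launch command, the project path, and the query text are illustrative assumptions; substitute however you actually start this server.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Hypothetical launch command; replace with however you start gemini-bridge.
    params = StdioServerParameters(command="gemini-bridge")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Ask Gemini a one-off question, scoped to a working directory.
            result = await session.call_tool(
                "consult_gemini",
                arguments={
                    "query": "Summarize the failing tests in this project.",
                    "directory": "/path/to/project",
                    "model": "flash",        # alias, or a full Gemini model id
                    "timeout_seconds": 120,  # optional per-call override
                },
            )
            print(result.content)

asyncio.run(main())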
consult_gemini_with_files

Send a query to the Gemini CLI with file context.

Args:
    query: Prompt text forwarded to the CLI.
    directory: Working directory used for resolving relative file paths.
    files: Relative or absolute file paths to include alongside the prompt.
    model: Optional model alias (``flash``, ``pro``) or full Gemini model id.
    timeout_seconds: Optional per-call timeout override in seconds.
    mode: ``"inline"`` streams truncated snippets; ``"at_command"`` emits ``@path`` directives so Gemini CLI resolves files itself.

Returns:
    Gemini's response or an explanatory error string with any warnings.
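And a sketch of the file-aware variant, reusing the initialized session from the previous example; the file paths and mode choice here are illustrative.

# Reusing the initialized session from the sketch above. In "at_command"
# mode the server emits @path directives so Gemini CLI reads the files
# itself rather than receiving inline (possibly truncated) snippets.
result = await session.call_tool(
    "consult_gemini_with_files",
    arguments={
        "query": "Review these modules for error-handling gaps.",
        "directory": "/path/to/project",
        "files": ["src/app.py", "src/utils.py"],  # resolved against directory
        "mode": "at_command",
    },
)
print(result.content)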

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources


MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/eLyiN/gemini-bridge'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.