gemini-bridge
by eLyiN

Server Configuration

Describes the environment variables required to run the server.

No environment variables are required.

Capabilities

Features and capabilities supported by this server

Capability | Details
tools | {"listChanged": false}
prompts | {"listChanged": false}
resources | {"subscribe": false, "listChanged": false}
experimental | {}

Tools

Functions exposed to the LLM to take actions

consult_gemini

Send a query directly to the Gemini CLI.

Args:
    query: Prompt text forwarded verbatim to the CLI.
    directory: Working directory used for command execution.
    model: Optional model alias (``flash``, ``pro``) or full Gemini model id.
    timeout_seconds: Optional per-call timeout override in seconds.

Returns:
    Gemini's response text or an explanatory error string.
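As an MCP tool, consult_gemini is invoked through a standard JSON-RPC 2.0 `tools/call` request. A minimal sketch of the payload a client would send (the query, directory, and helper name are illustrative, not part of this server's code):

```python
import json

def build_consult_gemini_request(query, directory, model=None,
                                 timeout_seconds=None, request_id=1):
    """Build a JSON-RPC 2.0 `tools/call` request for consult_gemini.

    Optional arguments are only included when set, matching the
    tool's optional `model` and `timeout_seconds` parameters.
    """
    arguments = {"query": query, "directory": directory}
    if model is not None:
        arguments["model"] = model
    if timeout_seconds is not None:
        arguments["timeout_seconds"] = timeout_seconds
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": "consult_gemini", "arguments": arguments},
    }

payload = build_consult_gemini_request(
    "Summarize the build errors", "/home/user/project", model="flash"
)
print(json.dumps(payload, indent=2))
```

The client library (or a raw stdio/HTTP transport) would serialize this request to the server; the response carries Gemini's answer as text.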
consult_gemini_with_files

Send a query to the Gemini CLI with file context.

Args:
    query: Prompt text forwarded to the CLI.
    directory: Working directory used for resolving relative file paths.
    files: Relative or absolute file paths to include alongside the prompt.
    model: Optional model alias (``flash``, ``pro``) or full Gemini model id.
    timeout_seconds: Optional per-call timeout override in seconds.
    mode: ``"inline"`` streams truncated snippets; ``"at_command"`` emits
        ``@path`` directives so Gemini CLI resolves files itself.

Returns:
    Gemini's response or an explanatory error string with any warnings.
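The same `tools/call` pattern applies here, with the extra `files` and `mode` arguments. A sketch of the request body, assuming the default `"at_command"` mode (the paths and helper name are illustrative):

```python
def build_files_request(query, directory, files, mode="at_command", request_id=2):
    """Build a `tools/call` request for consult_gemini_with_files.

    In "at_command" mode the server emits @path directives so the
    Gemini CLI resolves the files itself; "inline" streams truncated
    snippets of the file contents instead.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "consult_gemini_with_files",
            "arguments": {
                "query": query,
                "directory": directory,
                "files": list(files),
                "mode": mode,
            },
        },
    }

req = build_files_request("Review this module", "/home/user/project", ["src/app.py"])
```

Relative paths in `files` are resolved against `directory`, per the tool's description above.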
web_search

Ask Gemini queries with web search context.

Note: This uses Gemini CLI's automatic web search capability.
The model determines when to search based on query context.
Web search is best-effort and not guaranteed for every query.

Args:
    query: Search query or question to look up on the web
    directory: Working directory for command execution
    model: Optional model alias (flash, pro, or custom)
    timeout_seconds: Optional per-call timeout override in seconds

Returns:
    Gemini's response with potential web sources
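All three tools return their answer as text. Under the MCP specification, a `tools/call` result wraps that text in a list of content blocks, so a client typically unpacks it like the sketch below (the sample payload is illustrative):

```python
def extract_text(result):
    """Concatenate the text blocks from an MCP tools/call result."""
    return "\n".join(
        block["text"]
        for block in result.get("content", [])
        if block.get("type") == "text"
    )

# Illustrative result as a server might return it:
sample = {
    "content": [{"type": "text", "text": "Gemini's answer with web sources."}],
    "isError": False,
}
print(extract_text(sample))
```

Checking the result's `isError` flag before using the text distinguishes Gemini's answer from the explanatory error strings these tools return on failure.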

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/eLyiN/gemini-bridge'
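The same GET request can be made from Python with only the standard library; a minimal sketch (the response's JSON fields depend on the API and are not assumed here):

```python
import json
import urllib.request

API_URL = "https://glama.ai/api/mcp/v1/servers/eLyiN/gemini-bridge"

def fetch_server_info(url=API_URL):
    """GET the server record from the MCP directory API and decode the JSON body."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    info = fetch_server_info()
    print(json.dumps(info, indent=2))
```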

If you have feedback or need assistance with the MCP directory API, please join our Discord server.