Server Configuration

Describes the environment variables used to run the server.

Name           | Required | Description                                                              | Default
PROJECT_ROOT   | No       | The directory to analyze.                                                |
GEMINI_API_KEY | No       | API key for Gemini CLI authentication (required for Docker deployment). |
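For reference, a minimal sketch of how an MCP client might launch the server with these variables set, assuming the official TypeScript MCP SDK; the launch command and paths below are placeholders, not the server's documented invocation:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio with PROJECT_ROOT and GEMINI_API_KEY set.
// The command and args are placeholders for this sketch.
const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"],
  env: {
    PROJECT_ROOT: "/path/to/your/repo",
    GEMINI_API_KEY: process.env.GEMINI_API_KEY ?? "",
  },
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);
console.log(await client.listTools());

The tool sketches below assume a client connected this way.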

Capabilities

Features and capabilities supported by this server

Capability | Details
tools      | {}
logging    | {}

Tools

Functions exposed to the LLM to take actions

quick_query

Analyze code/files quickly using Gemini's large context window. Preferred when questions mention specific files or require reading repository code. Example: {prompt: 'Explain @src/auth.ts security approach', focus: 'security', responseStyle: 'concise'}
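As a rough illustration (not an official client for this server), a quick_query call using the example arguments above might look like this, with client connected as in the configuration sketch earlier:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Ask quick_query about one file's security approach.
async function explainAuthSecurity(client: Client) {
  const result = await client.callTool({
    name: "quick_query",
    arguments: {
      prompt: "Explain @src/auth.ts security approach",
      focus: "security",
      responseStyle: "concise",
    },
  });
  console.log(JSON.stringify(result, null, 2));
}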

deep_research

Perform comprehensive codebase analysis across multiple files with deep reasoning. Preferred for complex architectural questions or multi-file investigation. Example: {prompt: 'Trace authentication flow from @src/routes to @src/middleware', focus: 'architecture', citationMode: 'paths_only'}

analyze_directory

Map repository structure and understand what each file/module does. Preferred when questions ask about project organization or 'what's in this directory'. Example: {path: './src', depth: 3, maxFiles: 100}
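The same callTool pattern applies to analyze_directory; a sketch reusing the example arguments, where depth and maxFiles bound how much of the tree is scanned:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Map the ./src tree, three levels deep, at most 100 files.
async function mapSrcDirectory(client: Client) {
  return client.callTool({
    name: "analyze_directory",
    arguments: { path: "./src", depth: 3, maxFiles: 100 },
  });
}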

validate_paths

Verify file paths exist and are accessible before analysis. Use when uncertain about path correctness or troubleshooting 'PATH_NOT_ALLOWED' errors. Example: {paths: ['src/auth.ts', 'config/database.js', '../README.md']}
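A sketch of checking paths before a larger analysis, e.g. when debugging a PATH_NOT_ALLOWED error; the helper name is hypothetical:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Confirm the listed paths are visible to the server before analyzing them.
async function checkPaths(client: Client, paths: string[]) {
  return client.callTool({
    name: "validate_paths",
    arguments: { paths },
  });
}

// Example: checkPaths(client, ["src/auth.ts", "config/database.js", "../README.md"]);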

health_check

Verify server status and Gemini CLI configuration. Use for troubleshooting connection issues or confirming setup. Example: {includeDiagnostics: true}

fetch_chunk

Retrieve continuation of a large response. Use when a previous tool response included 'chunks' metadata indicating more content available. Example: {cacheKey: 'cache_abc123', chunkIndex: 2}
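When a tool response reports chunked output, later chunks can be pulled with fetch_chunk. A sketch, assuming the cache key is read from the earlier response's metadata (the exact metadata shape is not documented here):

import { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Fetch one continuation chunk of a previously truncated response.
async function fetchChunk(client: Client, cacheKey: string, chunkIndex: number) {
  return client.callTool({
    name: "fetch_chunk",
    arguments: { cacheKey, chunkIndex },
  });
}

// Example: fetchChunk(client, "cache_abc123", 2);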

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/capyBearista/gemini-researcher'
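The same request in TypeScript (assumes a runtime with the global fetch API, such as Node 18+):

// Fetch this server's directory entry as JSON.
const response = await fetch(
  "https://glama.ai/api/mcp/v1/servers/capyBearista/gemini-researcher"
);
console.log(await response.json());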

If you have feedback or need assistance with the MCP directory API, please join our Discord server.