
Deep Research MCP Server

by hfredrick69

Server Configuration

The environment variables used to configure the server; only GEMINI_API_KEY is required.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| GEMINI_MODEL | No | The Gemini model to use | gemini-2.5-flash |
| GEMINI_API_KEY | Yes | Your Google Gemini API key | (none) |
| CONCURRENCY_LIMIT | No | Concurrency limit for batched API calls | 5 |
| ENABLE_GEMINI_FUNCTIONS | No | Enable function calling | false |
| GEMINI_MAX_OUTPUT_TOKENS | No | Maximum output tokens for Gemini responses | 65536 |
| ENABLE_GEMINI_GOOGLE_SEARCH | No | Enable Google Search Grounding tool | true |
| ENABLE_GEMINI_CODE_EXECUTION | No | Enable code execution tool | false |
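
For reference, the table above maps onto a shell environment like the one below. The values are illustrative, and the launch command at the end is a placeholder, since the actual entry point depends on how the server is installed.

```bash
# Required: your Google Gemini API key
export GEMINI_API_KEY="your-gemini-api-key"

# Optional: the defaults from the table above apply when these are unset
export GEMINI_MODEL="gemini-2.5-flash"
export CONCURRENCY_LIMIT=5
export GEMINI_MAX_OUTPUT_TOKENS=65536
export ENABLE_GEMINI_FUNCTIONS=false
export ENABLE_GEMINI_GOOGLE_SEARCH=true
export ENABLE_GEMINI_CODE_EXECUTION=false

# Hypothetical launch command; substitute the server's real entry point
node build/index.js
```

Many MCP clients also accept these same variables through an env block in their server configuration, though the exact shape of that file depends on the client.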

Tools

Functions exposed to the LLM so it can take actions

No tools

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/hfredrick69/deep-research-mcp-server'
```
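
The endpoint is typically consumed as JSON, so if jq is installed, piping the response through it makes the output easier to read; this is just a convenience, not a requirement of the API.

```bash
# Pretty-print the directory entry for this server (assumes jq is installed)
curl -s 'https://glama.ai/api/mcp/v1/servers/hfredrick69/deep-research-mcp-server' | jq .
```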

If you have feedback or need assistance with the MCP directory API, please join our Discord server.