
Deep Research MCP Server

by Ozamatash
Environment configuration (.env):

OPENAI_API_KEY=              # Your OpenAI API key
FIRECRAWL_KEY=               # Your Firecrawl API key (not needed if using a local instance)

# Firecrawl Configuration
FIRECRAWL_CONCURRENCY=2      # Number of concurrent searches (default: 2)
FIRECRAWL_BASE_URL=          # Local Firecrawl URL (e.g. "http://localhost:3002")

# OpenAI Configuration
CONTEXT_SIZE="128000"
# To use another OpenAI-compatible API, add the following below:
# OPENAI_ENDPOINT="http://localhost:11434/v1"
# OPENAI_MODEL="llama3.1"

# Observability (Optional)
LANGFUSE_PUBLIC_KEY=
LANGFUSE_SECRET_KEY=
LANGFUSE_BASEURL=
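
As an illustration only (not this project's actual source), here is a hedged TypeScript sketch of how the OpenAI-related variables above could be consumed with the official openai npm package; the fallback model name and the example prompt are assumptions:

// Hedged sketch (not this project's actual code): consuming the OpenAI-related
// variables from the configuration above with the official "openai" npm package.
import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  // When OPENAI_ENDPOINT is set, point the client at a local
  // OpenAI-compatible API such as Ollama; otherwise use the default endpoint.
  baseURL: process.env.OPENAI_ENDPOINT || undefined,
});

// OPENAI_MODEL selects the model; the fallback name here is illustrative only.
const model = process.env.OPENAI_MODEL ?? "gpt-4o-mini";

// Example call against the configured endpoint.
const completion = await openai.chat.completions.create({
  model,
  messages: [{ role: "user", content: "Summarize recent research on MCP servers." }],
});
console.log(completion.choices[0].message.content);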

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Ozamatash/deep-research-mcp'
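
The same endpoint can also be queried programmatically; a minimal TypeScript example using the built-in fetch available in Node 18+ (the response is logged as-is, since its exact shape is not documented here):

// Fetch this server's directory entry from the Glama MCP API (Node 18+).
const res = await fetch(
  "https://glama.ai/api/mcp/v1/servers/Ozamatash/deep-research-mcp",
);
if (!res.ok) {
  throw new Error(`Glama MCP API request failed: ${res.status}`);
}
const server = await res.json();
console.log(server);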

If you have feedback or need assistance with the MCP directory API, please join our Discord server.