mcp-open-webresearch
Server Configuration

Describes the environment variables used to configure the server. All variables are optional and fall back to the defaults shown.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| PORT | No | Server port. | 3000 |
| LLM_NAME | No | External LLM model name. | |
| SAMPLING | No | Enable result sampling. | false |
| HTTP_PROXY | No | HTTP proxy URL. | |
| PUBLIC_URL | No | Public URL for download links. | http://localhost:port |
| CORS_ORIGIN | No | Allowed CORS origin. | * |
| ENABLE_CORS | No | Enable CORS. | false |
| HTTPS_PROXY | No | HTTPS proxy URL. | |
| LLM_API_KEY | No | External LLM API key. | |
| ENABLE_PROXY | No | Enable proxy support. | false |
| LLM_BASE_URL | No | External LLM API base URL. | |
| SOCKS5_PROXY | No | SOCKS5 proxy URL (highest priority). | |
| LLM_TIMEOUT_MS | No | Timeout for external LLM calls (milliseconds). | 30000 |
| WRITE_DEBUG_FILE | No | Log debug output to a file. | false |
| SKIP_IDE_SAMPLING | No | Prefer the external API over the IDE. | false |
| WRITE_DEBUG_TERMINAL | No | Log debug output to stdout. | false |
| DEEP_SEARCH_MAX_LOOPS | No | Maximum research iterations. | 20 |
| DEFAULT_SEARCH_ENGINES | No | Default search engines list. | bing,duckduckgo,brave |
| DEEP_SEARCH_MAX_CITATION_URLS | No | Maximum URLs to visit for citations. | 10 |
| DEEP_SEARCH_RESULTS_PER_ENGINE | No | Results per engine per round. | 5 |
| DEEP_SEARCH_SATURATION_THRESHOLD | No | Threshold to stop research early. | 0.6 |
| DEEP_SEARCH_REPORT_RETENTION_MINUTES | No | Download expiration time (minutes). | 10 |
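Since every variable is optional, a minimal setup only overrides what differs from the defaults. The sketch below uses variable names from the table above; the specific values (port, engine list, proxy address) are illustrative assumptions, not recommendations.

```shell
# Illustrative configuration for mcp-open-webresearch.
# Values shown here are examples, not recommended settings.
export PORT=8080
export DEFAULT_SEARCH_ENGINES="bing,duckduckgo"
export DEEP_SEARCH_MAX_LOOPS=10
export WRITE_DEBUG_TERMINAL=true

# Enable proxy support. Per the table, SOCKS5_PROXY takes the highest
# priority when several proxy variables are set; the address below is
# a placeholder for a local SOCKS5 proxy.
export ENABLE_PROXY=true
export SOCKS5_PROXY="socks5://127.0.0.1:1080"

echo "PORT=$PORT"
```

Variables left unset keep their defaults from the table (for example, CORS stays disabled and reports expire after 10 minutes).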

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions

No tools

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/rinaldowouterson/mcp-open-webresearch'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.