
mcp-server-ollama-deep-researcher

MIT License
  • Apple
  • Linux
# API Keys for search services
TAVILY_API_KEY=your_tavily_api_key_here
PERPLEXITY_API_KEY=your_perplexity_api_key_here

# Ollama configuration
# Use this if Ollama is running on a different host or port
# OLLAMA_BASE_URL=http://localhost:11434

# LangSmith configuration (optional)
# LANGSMITH_TRACING=true
# LANGSMITH_API_KEY=your_langsmith_api_key_here
# LANGSMITH_PROJECT=ollama-deep-researcher-mcp-server
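As a minimal sketch of how these settings might be consumed, the snippet below loads the variables above from a .env file in Python using the python-dotenv package. The variable names match the example file; the loading code itself is illustrative and assumed, not the project's actual implementation.

```python
import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

# Load variables from a .env file in the working directory, if present.
load_dotenv()

# API keys for the web search backends.
TAVILY_API_KEY = os.environ.get("TAVILY_API_KEY")
PERPLEXITY_API_KEY = os.environ.get("PERPLEXITY_API_KEY")

# Optional overrides; fall back to the defaults shown in the .env example.
OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
LANGSMITH_TRACING = os.environ.get("LANGSMITH_TRACING", "false").lower() == "true"

if not (TAVILY_API_KEY or PERPLEXITY_API_KEY):
    raise RuntimeError("Set TAVILY_API_KEY or PERPLEXITY_API_KEY for web search.")
```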

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Cam10001110101/mcp-server-ollama-deep-researcher'
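The same lookup can be scripted. Here is a small sketch using Python's requests library; it assumes the endpoint returns JSON, since the exact response fields are not documented here.

```python
import requests

# Fetch this server's entry from the Glama MCP directory API.
url = (
    "https://glama.ai/api/mcp/v1/servers/"
    "Cam10001110101/mcp-server-ollama-deep-researcher"
)
response = requests.get(url, timeout=10)
response.raise_for_status()

# The response body is assumed to be JSON; print it for inspection.
print(response.json())
```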

If you have feedback or need assistance with the MCP directory API, please join our Discord server.