mcp-server-ollama-deep-researcher

by Cam10001110101
Verified

Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| PYTHONPATH | No | Path to the Python source code directory | |
| TAVILY_API_KEY | No | Your Tavily API key (including the `tvly-` prefix) for web search functionality | |
| OLLAMA_BASE_URL | No | URL for connecting to the Ollama service (used in the Docker configuration) | |
| PYTHONUNBUFFERED | No | Disable Python output buffering for better logging | `1` |
| LANGSMITH_API_KEY | No | Your LangSmith API key for tracing and monitoring | |
| LANGSMITH_PROJECT | No | LangSmith project name | `ollama-deep-researcher-mcp-server` |
| LANGSMITH_TRACING | No | Enable LangSmith tracing | `true` |
| LANGSMITH_ENDPOINT | No | LangSmith API endpoint | `https://api.smith.langchain.com` |
| PERPLEXITY_API_KEY | No | Your Perplexity API key for web search functionality | |
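As a minimal sketch of how these variables fit together, the snippet below assembles an environment dictionary for launching the server, using the defaults from the table. The helper name `build_server_env` and the prefix check on the Tavily key are assumptions for illustration, not part of the server's API.

```python
def build_server_env(tavily_key, ollama_url="http://localhost:11434"):
    """Assemble environment variables for the server process.

    Hypothetical helper: validates the documented 'tvly-' prefix and
    fills in the defaults listed in the configuration table above.
    """
    if not tavily_key.startswith("tvly-"):
        raise ValueError("TAVILY_API_KEY must include the 'tvly-' prefix")
    return {
        "TAVILY_API_KEY": tavily_key,
        "OLLAMA_BASE_URL": ollama_url,          # Docker: point at the Ollama container
        "PYTHONUNBUFFERED": "1",                # default: unbuffered output for logging
        "LANGSMITH_TRACING": "true",            # default: tracing enabled
        "LANGSMITH_ENDPOINT": "https://api.smith.langchain.com",
        "LANGSMITH_PROJECT": "ollama-deep-researcher-mcp-server",
    }

env = build_server_env("tvly-example-key")
```

This dictionary could then be passed as the `env` of whatever process launcher (Docker, an MCP client config, or `subprocess`) starts the server.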

Schema

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| --- | --- |
| research | Research a topic using web search and LLM synthesis |
| get_status | Get the current status of any ongoing research |
| configure | Configure the research parameters (max loops, LLM model, search API) |
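The three tools naturally compose into a configure → research → get_status workflow. The sketch below is a local stand-in for that flow; the tool names come from the table above, but the parameter names (`max_loops`, `llm_model`, `search_api`, `topic`) and the stubbed research body are assumptions standing in for the real web-search/LLM loop.

```python
class ResearcherStub:
    """Local sketch of the server's three tools; not the real implementation."""

    def __init__(self):
        # Assumed default parameters; the real defaults may differ.
        self.config = {"max_loops": 3, "llm_model": "llama3.2", "search_api": "tavily"}
        self.status = "idle"

    def configure(self, **params):
        # 'configure' tool: update research parameters and return the result.
        self.config.update(params)
        return self.config

    def research(self, topic):
        # 'research' tool: the real server iterates web search + LLM
        # synthesis up to max_loops times; here we just return a stub summary.
        self.status = f"researching: {topic}"
        summary = f"[stub summary of '{topic}' after {self.config['max_loops']} loops]"
        self.status = "complete"
        return summary

    def get_status(self):
        # 'get_status' tool: report the current research state.
        return self.status
```

An MCP client would invoke the real tools by name over the protocol (e.g. a `call_tool("research", {...})` request); the stub only illustrates the expected call order and state transitions.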