mcp-server-ollama-deep-researcher
by Cam10001110101
Server Configuration
Describes the environment variables used to configure the server. A sketch showing how a process might read them follows the table.
| Name | Required | Description | Default |
|------|----------|-------------|---------|
| PYTHONPATH | No | Path to the Python source code directory | |
| TAVILY_API_KEY | No | Your Tavily API key (including the `tvly-` prefix) for web search functionality | |
| OLLAMA_BASE_URL | No | URL for connecting to the Ollama service (used in the Docker configuration) | |
| PYTHONUNBUFFERED | No | Disable Python output buffering for better logging | 1 |
| LANGSMITH_API_KEY | No | Your LangSmith API key for tracing and monitoring | |
| LANGSMITH_PROJECT | No | LangSmith project name | ollama-deep-researcher-mcp-server |
| LANGSMITH_TRACING | No | Enable LangSmith tracing | true |
| LANGSMITH_ENDPOINT | No | LangSmith API endpoint | https://api.smith.langchain.com |
| PERPLEXITY_API_KEY | No | Your Perplexity API key for web search functionality | |
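As a quick illustration, here is a minimal Python sketch of how a process might resolve these variables with the documented defaults applied. This is not the server's actual startup code, and the Ollama base URL default shown is an assumption (the table leaves it blank; `http://localhost:11434` is Ollama's usual local address).

```python
import os

# Hypothetical config resolution; mirrors the table above, not the server's real code.
config = {
    "tavily_api_key": os.environ.get("TAVILY_API_KEY"),          # includes the tvly- prefix
    "perplexity_api_key": os.environ.get("PERPLEXITY_API_KEY"),
    # Assumed default: the table leaves this blank; localhost:11434 is Ollama's standard port.
    "ollama_base_url": os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434"),
    "langsmith": {
        "api_key": os.environ.get("LANGSMITH_API_KEY"),
        "project": os.environ.get("LANGSMITH_PROJECT", "ollama-deep-researcher-mcp-server"),
        "tracing": os.environ.get("LANGSMITH_TRACING", "true").lower() == "true",
        "endpoint": os.environ.get("LANGSMITH_ENDPOINT", "https://api.smith.langchain.com"),
    },
}
```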
Schema
Prompts
Interactive templates invoked by user choice
No prompts
Resources
Contextual data attached and managed by the client
No resources
Tools
Functions exposed to the LLM to take actions
| Name | Description |
|------|-------------|
| research | Research a topic using web search and LLM synthesis |
| get_status | Get the current status of any ongoing research |
| configure | Configure the research parameters (max loops, LLM model, search API) |
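For context, the sketch below shows how an MCP client could invoke these tools over stdio using the official `mcp` Python SDK. The launch command, module name, and tool argument names (`max_loops`, `llm_model`, `search_api`, `topic`) are assumptions for illustration; consult the repository for the exact tool input schemas.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch command and module path are hypothetical; adjust to the repo's layout.
    server = StdioServerParameters(
        command="python",
        args=["-m", "mcp_server_ollama_deep_researcher"],  # assumed module name
        env={"TAVILY_API_KEY": "tvly-..."},                # see the configuration table above
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Argument names below are assumed; check each tool's declared schema.
            await session.call_tool(
                "configure",
                arguments={"max_loops": 3, "llm_model": "llama3.2", "search_api": "tavily"},
            )
            result = await session.call_tool(
                "research", arguments={"topic": "quantum error correction"}
            )
            status = await session.call_tool("get_status", arguments={})
            print(result, status)


asyncio.run(main())
```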