mcp-server-ollama-deep-researcher
by Cam10001110101
Verified
Server Configuration
Describes the environment variables used to configure the server.
Name | Required | Description | Default |
---|---|---|---|
PYTHONPATH | No | Path to the src directory of the MCP server | |
TAVILY_API_KEY | No | Your Tavily API key, including the tvly- prefix | |
PYTHONUNBUFFERED | No | Optional setting for macOS/Linux to prevent Python output buffering | 1 |
LANGSMITH_API_KEY | No | Your LangSmith API key for tracing and monitoring | |
LANGSMITH_PROJECT | No | Project name for LangSmith tracing | ollama-deep-researcher-mcp-server |
LANGSMITH_TRACING | No | Enable tracing with LangSmith | true |
LANGSMITH_ENDPOINT | No | The endpoint for the LangSmith API | https://api.smith.langchain.com |
PERPLEXITY_API_KEY | No | Your Perplexity API key | |
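As a minimal sketch of how these variables might be set before launching the server (the src path and key values below are placeholders, not taken from this listing):

```shell
# Hypothetical setup: adjust the path and keys for your installation.
export PYTHONPATH="/path/to/mcp-server-ollama-deep-researcher/src"
export TAVILY_API_KEY="tvly-your-key-here"   # note the tvly- prefix
export PYTHONUNBUFFERED=1                    # macOS/Linux: avoid Python output buffering

# Optional LangSmith tracing (defaults shown where the table gives them)
export LANGSMITH_TRACING=true
export LANGSMITH_PROJECT="ollama-deep-researcher-mcp-server"
export LANGSMITH_ENDPOINT="https://api.smith.langchain.com"
```

If you launch the server through an MCP client's configuration file instead, the same names and values would go in that client's environment section.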
Schema
Prompts
Interactive templates invoked by user choice
Name | Description |
---|---|
No prompts | |
Resources
Contextual data attached and managed by the client
Name | Description |
---|---|
No resources | |
Tools
Functions exposed to the LLM to take actions
Name | Description |
---|---|
No tools | |