mcp-server-ollama-deep-researcher

by Cam10001110101
Verified

Server Configuration

Describes the environment variables used to configure the server (all are optional).

Name | Required | Description | Default
PYTHONPATH | No | Path to the src directory of the MCP server | (none)
TAVILY_API_KEY | No | Your Tavily API key, including the tvly- prefix | (none)
PYTHONUNBUFFERED | No | Optional setting for macOS/Linux to prevent Python output buffering | 1
LANGSMITH_API_KEY | No | Your LangSmith API key for tracing and monitoring | (none)
LANGSMITH_PROJECT | No | Project name for LangSmith tracing | ollama-deep-researcher-mcp-server
LANGSMITH_TRACING | No | Enable tracing with LangSmith | true
LANGSMITH_ENDPOINT | No | The endpoint for the LangSmith API | https://api.smith.langchain.com
PERPLEXITY_API_KEY | No | Your Perplexity API key | (none)
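These variables are typically supplied through the MCP client's server configuration rather than exported globally. A minimal sketch for a Claude Desktop-style config is below; the server key, command, arguments, and install path are assumptions that will vary by setup, and the API key values are placeholders:

```json
{
  "mcpServers": {
    "ollama-deep-researcher": {
      "command": "python",
      "args": ["-m", "assistant.server"],
      "env": {
        "PYTHONPATH": "/path/to/mcp-server-ollama-deep-researcher/src",
        "PYTHONUNBUFFERED": "1",
        "TAVILY_API_KEY": "tvly-...",
        "LANGSMITH_TRACING": "true",
        "LANGSMITH_PROJECT": "ollama-deep-researcher-mcp-server"
      }
    }
  }
}
```

Variables left out of the `env` block simply fall back to their defaults from the table above.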

Schema

Prompts

Interactive templates invoked by user choice

Name | Description

No prompts

Resources

Contextual data attached and managed by the client

Name | Description

No resources

Tools

Functions exposed to the LLM to take actions

Name | Description

No tools