# Ollama Deep Researcher DXT Extension

## Overview

Ollama Deep Researcher is a Desktop Extension (DXT) that enables advanced topic research using web search and LLM synthesis, powered by a local MCP server. It supports configurable research parameters, status tracking, and resource access, and is designed for seamless integration with the DXT ecosystem.
- Research any topic using web search APIs and LLMs (Ollama, DeepSeek, etc.)
- Configure max research loops, LLM model, and search API
- Track status of ongoing research
- Access research results as resources via MCP protocol
## Features
- Implements the MCP protocol over stdio for local, secure operation
- Defensive programming: error handling, timeouts, and validation
- Logging and debugging via stderr
- Compatible with DXT host environments
## Directory Structure
## Installation & Setup

- Clone the repository and install dependencies.
- Install Python dependencies for the assistant.
- Set the required environment variable for your web search API:
  - For Tavily: `TAVILY_API_KEY`
  - For Perplexity: `PERPLEXITY_API_KEY`
- Build the TypeScript server (if needed).
- Run the extension locally for testing.
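In practice, the steps above might look like the following. The repository URL, file paths, and npm script names here are illustrative assumptions, not taken from this project:

```shell
# Clone the repository and install Node dependencies (URL is a placeholder)
git clone https://github.com/your-org/ollama-deep-researcher-dxt.git
cd ollama-deep-researcher-dxt
npm install

# Install Python dependencies for the research assistant (path is an assumption)
pip install -r requirements.txt

# Set the API key for your chosen search provider
export TAVILY_API_KEY="tvly-..."        # for Tavily
# export PERPLEXITY_API_KEY="pplx-..."  # or, for Perplexity

# Build the TypeScript server and run it locally over stdio
npm run build
node dist/index.js
```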
## Usage

- Research a topic: use the `research` tool with `{ "topic": "Your subject" }`.
- Get research status: use the `get_status` tool.
- Configure research parameters: use the `configure` tool with any of `maxLoops`, `llmModel`, `searchApi`.
## Manifest

See `manifest.json` for the full DXT manifest, including tool schemas and resource templates. It follows the DXT MANIFEST.md specification.
## Logging & Debugging

- All server logs and errors are written to `stderr` for debugging.
- Research subprocesses are killed after 5 minutes to prevent hangs.
- Invalid requests and configuration errors return clear, structured error messages.
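The 5-minute subprocess limit can be sketched as follows. This is an illustrative Python version, not the extension's actual TypeScript implementation:

```python
import subprocess

RESEARCH_TIMEOUT_S = 300  # research subprocesses are killed after 5 minutes

def run_research_subprocess(cmd, timeout=RESEARCH_TIMEOUT_S):
    """Run a research subprocess, killing it if it exceeds the timeout."""
    try:
        result = subprocess.run(cmd, capture_output=True, text=True,
                                timeout=timeout)
        return {"ok": True, "stdout": result.stdout, "code": result.returncode}
    except subprocess.TimeoutExpired:
        # subprocess.run kills the child process before raising
        return {"ok": False, "error": f"research timed out after {timeout}s"}
```

Returning a structured error object rather than raising mirrors the "clear, structured error messages" behavior described above.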
## Security & Best Practices
- All tool schemas are validated before execution.
- API keys are required for web search APIs and are never logged.
- MCP protocol is used over stdio for local, secure communication.
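A minimal sketch of pre-execution argument validation, assuming the `configure` tool accepts the three optional parameters listed under Usage (the real server validates against the schemas in `manifest.json`):

```python
# Parameter names come from this README; types are assumptions.
ALLOWED_CONFIG_KEYS = {"maxLoops": int, "llmModel": str, "searchApi": str}

def validate_configure_args(arguments):
    """Reject unknown keys and wrong types before executing the tool."""
    errors = []
    for key, value in arguments.items():
        expected = ALLOWED_CONFIG_KEYS.get(key)
        if expected is None:
            errors.append(f"unknown parameter: {key}")
        elif not isinstance(value, expected):
            errors.append(f"{key} must be {expected.__name__}")
    return errors
```

An empty list means the arguments passed validation; otherwise the errors are returned to the caller as a structured message instead of executing the tool.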
## Testing & Validation
- Validate the extension by loading it in a DXT-compatible host.
- Ensure all tool calls return valid, structured JSON responses.
- Check that the manifest loads and the extension registers as a DXT.
## Troubleshooting

- Missing API key: ensure `TAVILY_API_KEY` or `PERPLEXITY_API_KEY` is set in your environment.
- Python errors: check the Python dependencies and the logs on `stderr`.
- Timeouts: research subprocesses are limited to 5 minutes.
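The missing-API-key case can be caught up front with a small check like this sketch:

```python
import os

def check_search_api_keys():
    """Return the supported search APIs that have keys configured."""
    keys = {
        "tavily": os.environ.get("TAVILY_API_KEY"),
        "perplexity": os.environ.get("PERPLEXITY_API_KEY"),
    }
    available = [name for name, key in keys.items() if key]
    if not available:
        raise RuntimeError(
            "Set TAVILY_API_KEY or PERPLEXITY_API_KEY in your environment"
        )
    return available
```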
© 2025 Your Name or Organization. Licensed under MIT.
## Hybrid Server

The server can function both locally and remotely, depending on the configuration or use case.

This is an adaptation of LangChain's Ollama Deep Researcher to a Model Context Protocol (MCP) server. It exposes deep-research capability as MCP tools within the Model Context Protocol ecosystem, allowing AI assistants to perform in-depth research on a topic locally via Ollama.