
mcp-server-ollama-deep-researcher

MIT License

Ollama Deep Researcher DXT Extension

Overview

Ollama Deep Researcher is a Desktop Extension (DXT) that enables advanced topic research using web search and LLM synthesis, powered by a local MCP server. It supports configurable research parameters, status tracking, and resource access, and is designed for seamless integration with the DXT ecosystem.

  • Research any topic using web search APIs and LLMs (Ollama, DeepSeek, etc.)
  • Configure max research loops, LLM model, and search API
  • Track status of ongoing research
  • Access research results as resources via MCP protocol

Features

  • Implements the MCP protocol over stdio for local, secure operation
  • Defensive programming: error handling, timeouts, and validation
  • Logging and debugging via stderr
  • Compatible with DXT host environments

Directory Structure

.
├── manifest.json          # DXT manifest (see MANIFEST.md for spec)
├── src/
│   ├── index.ts           # MCP server entrypoint (Node.js, stdio transport)
│   └── assistant/         # Python research logic
│       └── run_research.py
├── README.md              # This documentation
└── ...

Installation & Setup

  1. Clone the repository and install dependencies:
    git clone <your-repo-url>
    cd mcp-server-ollama-deep-researcher
    npm install
  2. Install Python dependencies for the assistant:
    cd src/assistant
    pip install -r requirements.txt  # or use pyproject.toml/uv if preferred
  3. Set required environment variables for web search APIs:
    • For Tavily: TAVILY_API_KEY
    • For Perplexity: PERPLEXITY_API_KEY
    • Example:
      export TAVILY_API_KEY=your_tavily_key
      export PERPLEXITY_API_KEY=your_perplexity_key
  4. Build the TypeScript server (if needed):
    npm run build
  5. Run the extension locally for testing:
    node dist/index.js
    # Or use the DXT host to load the extension per DXT documentation

Usage

  • Research a topic:
    • Use the research tool with { "topic": "Your subject" }
  • Get research status:
    • Use the get_status tool
  • Configure research parameters:
    • Use the configure tool with any of: maxLoops, llmModel, searchApi
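As a sketch only (the exact wire format depends on your MCP client library), the three tool calls above map to MCP `tools/call` JSON-RPC requests. The tool names come from this README; the helper below is hypothetical:

```typescript
// Hypothetical sketch: constructing MCP JSON-RPC requests for this server's tools.
// Tool names (research, get_status, configure) are from this README; the helper
// itself is illustrative, not part of the extension.

interface ToolCall {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function makeToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCall {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// Research a topic
const research = makeToolCall(1, "research", { topic: "Your subject" });
// Check progress of an ongoing run
const status = makeToolCall(2, "get_status", {});
// Reconfigure research parameters (names as listed above)
const configure = makeToolCall(3, "configure", {
  maxLoops: 3,
  llmModel: "llama3",
  searchApi: "tavily",
});

// Each request would be written to the server's stdin as one line of JSON.
console.log(JSON.stringify(research));
```

In practice an MCP client SDK handles framing and `id` bookkeeping for you; the sketch only shows the shape of the payloads.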

Manifest

See manifest.json for the full DXT manifest, including tool schemas and resource templates. It follows the DXT MANIFEST.md specification.

Logging & Debugging

  • All server logs and errors are output to stderr for debugging.
  • Research subprocesses are killed after 5 minutes to prevent hangs.
  • Invalid requests and configuration errors return clear, structured error messages.
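The subprocess timeout can be sketched as follows. This is an illustrative pattern, not the server's actual implementation; the 5-minute constant matches the behavior described above:

```typescript
// Minimal sketch of killing a long-running research subprocess after a timeout,
// mirroring this server's 5-minute limit. Logs go to stderr, never stdout
// (stdout is reserved for the MCP stdio transport).
import { spawn } from "node:child_process";

const RESEARCH_TIMEOUT_MS = 5 * 60 * 1000; // 5 minutes, as described above

function runWithTimeout(
  cmd: string,
  args: string[],
  timeoutMs: number
): Promise<number | null> {
  return new Promise((resolve) => {
    const child = spawn(cmd, args, { stdio: ["ignore", "pipe", "inherit"] });
    const timer = setTimeout(() => {
      console.error(`Timeout: killing ${cmd} after ${timeoutMs} ms`);
      child.kill("SIGKILL");
    }, timeoutMs);
    child.on("exit", (code) => {
      clearTimeout(timer);
      resolve(code); // null when the process was killed by a signal
    });
  });
}

// Demo with a short timeout: a process that would run 10 s is killed after 200 ms.
runWithTimeout("node", ["-e", "setTimeout(() => {}, 10000)"], 200).then((code) => {
  console.error("exit code:", code);
});
```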

Security & Best Practices

  • All tool schemas are validated before execution.
  • API keys are required for web search APIs and are never logged.
  • MCP protocol is used over stdio for local, secure communication.
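To illustrate pre-execution validation, here is a hand-rolled check for the `configure` tool's arguments. The real server validates against the JSON schemas in manifest.json; the function and its error messages below are assumptions for the sketch:

```typescript
// Hypothetical sketch of validating `configure` arguments before execution.
// The actual server checks arguments against the tool schemas in manifest.json;
// this hand-rolled version only illustrates the principle.
type ConfigureArgs = { maxLoops?: number; llmModel?: string; searchApi?: string };

function validateConfigure(args: unknown): ConfigureArgs {
  if (typeof args !== "object" || args === null) {
    throw new Error("arguments must be an object");
  }
  const a = args as Record<string, unknown>;
  if (a.maxLoops !== undefined && (!Number.isInteger(a.maxLoops) || (a.maxLoops as number) < 1)) {
    throw new Error("maxLoops must be a positive integer");
  }
  if (a.llmModel !== undefined && typeof a.llmModel !== "string") {
    throw new Error("llmModel must be a string");
  }
  if (a.searchApi !== undefined && !["tavily", "perplexity"].includes(a.searchApi as string)) {
    throw new Error("searchApi must be 'tavily' or 'perplexity'");
  }
  return a as ConfigureArgs; // returned only after every field passed its check
}
```

Rejecting bad input before it reaches the research subprocess is what makes the structured error messages mentioned above possible.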

Testing & Validation

  • Validate the extension by loading it in a DXT-compatible host.
  • Ensure all tool calls return valid, structured JSON responses.
  • Check that the manifest loads and the extension registers as a DXT.

Troubleshooting

  • Missing API key: Ensure TAVILY_API_KEY or PERPLEXITY_API_KEY is set in your environment.
  • Python errors: Check Python dependencies and logs in stderr.
  • Timeouts: Research subprocesses are limited to 5 minutes.

© 2025 Your Name or Organization. Licensed under MIT.


This is an adaptation of LangChain's Ollama Deep Researcher to a Model Context Protocol (MCP) server. It exposes deep-research capabilities as MCP tools within the Model Context Protocol ecosystem, allowing AI assistants to perform in-depth research on a topic locally via Ollama.

  1. Core Features
    1. Research Process
  2. Prerequisites
    1. Installation
      1. Option 1: Standard Installation
      2. Option 2: Docker Installation
    2. Client Configuration
      1. Option 1: Standard Installation Configuration
      2. Option 2: Docker Installation Configuration
    3. Tracking and Monitoring
      1. MCP Resources
        1. Available Tools
          1. Configure
          2. Research
          3. Get Status
        2. Prompts
          1. Using the default search API, model, and max iterations (loops)
          2. Changing the default configuration before starting research
        3. Ollama Research Workflow
          1. Output
          2. System Integration Overview
          3. Troubleshooting
          4. Error Handling
          5. Required Enhancements
          6. Architecture
          7. Glama.ai Badge
        4. Example Prompt and Output Transcripts
          1. Prompt
          2. Configuration Output
          3. Ollama Researcher Output
        5. Claude Final Output
