Ollama Deep Researcher DXT Extension

Overview

Ollama Deep Researcher is a Desktop Extension (DXT) that enables advanced topic research using web search and LLM synthesis, powered by a local MCP server. It is a Model Context Protocol (MCP) adaptation of LangChain's Ollama Deep Researcher, exposing the deep research workflow as MCP tools so that AI assistants can perform in-depth research on topics locally via Ollama. The extension supports configurable research parameters, status tracking, and resource access, and is designed for seamless integration with the DXT ecosystem.

  • Research any topic using web search APIs and LLMs (Ollama, DeepSeek, etc.)
  • Configure max research loops, LLM model, and search API
  • Track status of ongoing research
  • Access research results as resources via MCP protocol

Features

  • Implements the MCP protocol over stdio for local, secure operation
  • Defensive programming: error handling, timeouts, and validation
  • Logging and debugging via stderr
  • Compatible with DXT host environments

Directory Structure

  .
  ├── manifest.json            # DXT manifest (see MANIFEST.md for spec)
  ├── src/
  │   ├── index.ts             # MCP server entrypoint (Node.js, stdio transport)
  │   └── assistant/           # Python research logic
  │       └── run_research.py
  ├── README.md                # This documentation
  └── ...

Installation & Setup

  1. Clone the repository and install dependencies:
    git clone <your-repo-url>
    cd mcp-server-ollama-deep-researcher
    npm install
  2. Install Python dependencies for the assistant:
    cd src/assistant
    pip install -r requirements.txt   # or use pyproject.toml/uv if preferred
  3. Set required environment variables for web search APIs:
    • For Tavily: TAVILY_API_KEY
    • For Perplexity: PERPLEXITY_API_KEY
    • Example:
      export TAVILY_API_KEY=your_tavily_key
      export PERPLEXITY_API_KEY=your_perplexity_key
  4. Build the TypeScript server (if needed):
    npm run build
  5. Run the extension locally for testing:
    node dist/index.js
    # Or use the DXT host to load the extension per DXT documentation
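
If you run the built server from a generic MCP client rather than a DXT host, a configuration along these lines should work. This is a minimal sketch, not a file shipped with the repo: the mcpServers layout follows the common MCP client convention (e.g. Claude Desktop), the path is a placeholder for your checkout, and the environment keys are the ones from step 3:

  {
    "mcpServers": {
      "ollama-deep-researcher": {
        "command": "node",
        "args": ["/absolute/path/to/mcp-server-ollama-deep-researcher/dist/index.js"],
        "env": {
          "TAVILY_API_KEY": "your_tavily_key",
          "PERPLEXITY_API_KEY": "your_perplexity_key"
        }
      }
    }
  }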

Usage

  • Research a topic:
    • Use the research tool with { "topic": "Your subject" }
  • Get research status:
    • Use the get_status tool
  • Configure research parameters:
    • Use the configure tool with any of: maxLoops, llmModel, searchApi
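
Under the hood these are standard MCP tools/call requests sent over stdio. The messages below are illustrative JSON-RPC 2.0 payloads: the tool and parameter names match the list above, while the model, search API, and topic values are placeholders:

  { "jsonrpc": "2.0", "id": 1, "method": "tools/call",
    "params": { "name": "configure",
                "arguments": { "maxLoops": 3, "llmModel": "deepseek-r1:8b", "searchApi": "tavily" } } }

  { "jsonrpc": "2.0", "id": 2, "method": "tools/call",
    "params": { "name": "research",
                "arguments": { "topic": "Your subject" } } }

Completed research can then be discovered and fetched with the protocol's resources/list and resources/read methods, per the resource templates declared in manifest.json.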

Manifest

See manifest.json for the full DXT manifest, including tool schemas and resource templates; it follows the DXT MANIFEST.md specification.
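
For orientation, an abridged manifest has roughly this shape. This is a hedged sketch rather than the repository's actual file: the field names follow the public DXT manifest spec, and the values shown are illustrative:

  {
    "dxt_version": "0.1",
    "name": "ollama-deep-researcher",
    "version": "1.0.0",
    "server": {
      "type": "node",
      "entry_point": "dist/index.js"
    },
    "tools": [
      { "name": "research", "description": "Research a topic via web search and LLM synthesis" },
      { "name": "get_status", "description": "Report the status of ongoing research" },
      { "name": "configure", "description": "Set maxLoops, llmModel, and searchApi" }
    ]
  }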

Logging & Debugging

  • All server logs and errors are output to stderr for debugging.
  • Research subprocesses are killed after 5 minutes to prevent hangs.
  • Invalid requests and configuration errors return clear, structured error messages.
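
The timeout-and-stderr pattern fits in a few lines. The helper below is a hypothetical sketch of the approach, not the exact code in src/index.ts:

  import { spawn } from "node:child_process";

  const RESEARCH_TIMEOUT_MS = 5 * 60 * 1000; // kill research subprocesses after 5 minutes

  // Hypothetical helper: runs the Python assistant and enforces the time budget.
  function runResearch(topic: string): Promise<string> {
    return new Promise((resolve, reject) => {
      const child = spawn("python", ["src/assistant/run_research.py", topic]);

      // Abort the subprocess if it exceeds the time budget.
      const timer = setTimeout(() => {
        child.kill();
        reject(new Error("Research timed out after 5 minutes"));
      }, RESEARCH_TIMEOUT_MS);

      let output = "";
      child.stdout.on("data", (chunk) => { output += chunk; });
      // Diagnostics go to stderr so stdout stays free for MCP traffic.
      child.stderr.on("data", (chunk) => process.stderr.write(chunk));

      child.on("close", (code) => {
        clearTimeout(timer);
        if (code === 0) {
          resolve(output);
        } else {
          reject(new Error(`Research exited with code ${code}`));
        }
      });
      child.on("error", (err) => {
        clearTimeout(timer);
        reject(err);
      });
    });
  }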

Security & Best Practices

  • All tool schemas are validated before execution.
  • API keys are required for web search APIs and are never logged.
  • MCP protocol is used over stdio for local, secure communication.
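
As an illustration of the validation step, a defensive check on the research tool's arguments might look like the sketch below (a hand-rolled check for illustration; the actual server may rely on JSON Schema validation instead):

  // Hypothetical validator for the research tool's arguments.
  function validateResearchArgs(args: unknown): { topic: string } {
    if (typeof args !== "object" || args === null) {
      throw new Error("Invalid arguments: expected an object");
    }
    const { topic } = args as Record<string, unknown>;
    if (typeof topic !== "string" || topic.trim().length === 0) {
      throw new Error('Invalid arguments: "topic" must be a non-empty string');
    }
    return { topic: topic.trim() };
  }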

Testing & Validation

  • Validate the extension by loading it in a DXT-compatible host.
  • Ensure all tool calls return valid, structured JSON responses.
  • Check that the manifest loads and the extension registers as a DXT.
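
For reference, a structurally valid tool response follows the MCP CallToolResult shape shown below; the payload text is a placeholder, as the actual result format is defined by the server:

  {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
      "content": [
        { "type": "text", "text": "...research summary and sources..." }
      ],
      "isError": false
    }
  }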

Troubleshooting

  • Missing API key: Ensure TAVILY_API_KEY or PERPLEXITY_API_KEY is set in your environment.
  • Python errors: Check Python dependencies and logs in stderr.
  • Timeouts: Research subprocesses are limited to 5 minutes.

© 2025 Your Name or Organization. Licensed under MIT.
