AskDocs MCP Server

by dymk


A Model Context Protocol (MCP) server that provides RAG-powered semantic search over technical documentation PDFs using Ollama.

Features

  • Semantic search with natural language queries

  • Multiple PDF documents with page citations

  • Docker support with persistent caching

  • TOML-based configuration
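
Semantic search works by embedding both the query and the document chunks as vectors and ranking chunks by similarity. A minimal illustration of the ranking step with made-up vectors (not the server's actual code; the server delegates embedding to Ollama):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical embeddings for a query and two document chunks.
query = [0.9, 0.1, 0.0]
chunks = {
    "page 3: installation steps": [0.8, 0.2, 0.1],
    "page 17: warranty terms": [0.1, 0.1, 0.9],
}

# Rank chunks by similarity to the query; the best match comes first.
ranked = sorted(chunks, key=lambda c: cosine_similarity(query, chunks[c]), reverse=True)
print(ranked[0])  # the installation chunk is the closer match
```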

Quick Start

1. Create docs/askdocs-mcp.toml:

```toml
[[doc]]
name = "my_manual"
description = "My Product Manual"
path = "pdf/manual.pdf"
```

2. Run with Docker:

```shell
docker run -it --rm --network=host -v ./docs:/docs askdocs-mcp:latest
```

askdocs-mcp expects an Ollama server to be running on http://localhost:11434.

3. Directory structure:

```
docs/
├── askdocs-mcp.toml    # Configuration
├── .askdocs-cache/     # Vector store (auto-created)
└── pdf/
    └── manual.pdf
```

Add `.askdocs-cache/` to your `.gitignore` file.

Configuration

```toml
# Optional: Configure models
embedding_model = "snowflake-arctic-embed:latest"
llm_model = "qwen3:14b"

[[doc]]
name = "unique_identifier"
description = "Human description"
path = "pdf/document.pdf"
```

Using the MCP Server

Cursor (~/.cursor/mcp.json or <project-root>/.cursor/mcp.json)

```json
{
  "mcpServers": {
    "askdocs-mcp": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "--network=host",
        "--volume=${workspaceFolder}/docs:/docs",
        "ghcr.io/dymk/askdocs-mcp:latest"
      ]
    }
  }
}
```

Codex (~/.codex/config.toml)

```toml
[mcp_servers.askdocs-mcp]
command = "docker"
args = [
  "run", "-i", "--rm",
  "--network=host",
  "--volume=/your/workspace/folder/docs:/docs",
  "ghcr.io/dymk/askdocs-mcp:latest",
]
```

Environment variable:

  • `ASKDOCS_OLLAMA_URL`: Ollama server URL (default: `http://localhost:11434`)
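
When running in Docker, the variable can be passed with a `-e` flag (e.g. `-e ASKDOCS_OLLAMA_URL=http://host.docker.internal:11434`). The default-and-override behavior amounts to the following (a sketch, not the server's actual code):

```python
import os

def ollama_url() -> str:
    """Resolve the Ollama server URL, falling back to the documented default."""
    return os.environ.get("ASKDOCS_OLLAMA_URL", "http://localhost:11434")

print(ollama_url())
```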

Available Tools

list_docs()

List all documentation sources.

ask_docs(source_name: str, query: str)

Search documentation with natural language.

get_doc_page(source_name: str, page_start: int, page_end: int = None)

Retrieve full text from specific pages.

Requirements

Ollama must be running with the required models:

```shell
ollama pull snowflake-arctic-embed:latest
ollama pull qwen3:14b
```

Building

```shell
# Docker
docker build -t askdocs-mcp:latest .

# Local
uv sync
uv run askdocs-mcp --docs-dir /path/to/docs
```

License

MIT

