AskDocs MCP Server

A Model Context Protocol (MCP) server that provides RAG-powered semantic search over technical documentation PDFs using Ollama.

Features

  • Semantic search with natural language queries

  • Multiple PDF documents with page citations

  • Docker support with persistent caching

  • TOML-based configuration

Quick Start

1. Create askdocs-mcp.toml in your project's docs directory:

[[doc]]
name = "my_manual"
description = "My Product Manual"
path = "pdf/manual.pdf"

2. Run with Docker:

docker run -it --rm --network=host -v ./docs:/docs askdocs-mcp:latest

askdocs-mcp expects an Ollama server to be reachable at http://localhost:11434.
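To confirm that Ollama is reachable before starting the server, you can query its model list (assuming the default port):

curl http://localhost:11434/api/tags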

3. Directory structure:

docs/
├── askdocs-mcp.toml    # Configuration
├── .askdocs-cache/     # Vector store (auto-created)
└── pdf/
    └── manual.pdf

Add **/.askdocs-cache/** to your .gitignore file.

Configuration

# Optional: Configure models
embedding_model = "snowflake-arctic-embed:latest"
llm_model = "qwen3:14b"

[[doc]]
name = "unique_identifier"
description = "Human description"
path = "pdf/document.pdf"
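Multiple PDFs can be indexed by repeating the [[doc]] table; each entry needs a unique name. The names and paths below are illustrative:

[[doc]]
name = "user_guide"
description = "User Guide"
path = "pdf/user_guide.pdf"

[[doc]]
name = "api_reference"
description = "API Reference"
path = "pdf/api_reference.pdf"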

Using the MCP Server

Cursor (~/.cursor/mcp.json or <project-root>/.cursor/mcp.json)

{
    "mcpServers": {
        "askdocs-mcp": {
            "command": "docker",
            "args": [
                "run", "-i", "--rm",
                "--network=host",
                "--volume=${workspaceFolder}/docs:/docs",
                "ghcr.io/dymk/askdocs-mcp:latest"
            ]
        }
    }
}

Codex (~/.codex/config.toml)

[mcp_servers.askdocs-mcp]
command = "docker"
args = [
    "run", "-i", "--rm",
    "--network=host",
    "--volume=/your/workspace/folder/docs:/docs",
    "ghcr.io/dymk/askdocs-mcp:latest"
]

Environment variable:

  • ASKDOCS_OLLAMA_URL: Ollama server URL (default: http://localhost:11434)
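For example, to point the server at an Ollama instance on another host, the variable can be passed through Docker (the hostname below is illustrative):

docker run -it --rm -v ./docs:/docs \
  -e ASKDOCS_OLLAMA_URL=http://ollama.internal:11434 \
  askdocs-mcp:latest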

Available Tools

list_docs()

List all documentation sources.

ask_docs(source_name: str, query: str)

Search documentation with natural language.

get_doc_page(source_name: str, page_start: int, page_end: int = None)

Retrieve full text from specific pages.
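MCP clients invoke these tools with a standard JSON-RPC tools/call request. A search against the example configuration from the Quick Start might look like this (the query text is illustrative):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask_docs",
    "arguments": {
      "source_name": "my_manual",
      "query": "How do I reset the device to factory defaults?"
    }
  }
}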

Requirements

Ollama must be running with the required models:

ollama pull snowflake-arctic-embed:latest
ollama pull qwen3:14b
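You can verify that both models are available locally with:

ollama list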

Building

# Docker
docker build -t askdocs-mcp:latest .

# Local
uv sync
uv run askdocs-mcp --docs-dir /path/to/docs

License

MIT
