AskDocs MCP Server
A Model Context Protocol (MCP) server that provides RAG-powered semantic search over technical documentation PDFs using Ollama.
Features
Semantic search with natural language queries
Multiple PDF documents with page citations
Docker support with persistent caching
TOML-based configuration
Quick Start
1. Create askdocs-mcp.toml in your project's docs directory:
[[doc]]
name = "my_manual"
description = "My Product Manual"
path = "pdf/manual.pdf"2. Run with Docker:
docker run -it --rm --network=host -v ./docs:/docs askdocs-mcp:latest
askdocs-mcp expects an Ollama server to be running on http://localhost:11434.
3. Directory structure:
docs/
├── askdocs-mcp.toml # Configuration
├── .askdocs-cache/ # Vector store (auto-created)
└── pdf/
    └── manual.pdf
Add **/.askdocs-cache/** to your .gitignore file.
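For example, from the project root (a one-liner; adjust if you already maintain a .gitignore):
echo "**/.askdocs-cache/**" >> .gitignore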
Configuration
# Optional: Configure models
embedding_model = "snowflake-arctic-embed:latest"
llm_model = "qwen3:14b"
[[doc]]
name = "unique_identifier"
description = "Human description"
path = "pdf/document.pdf"Using the MCP Server:
Using the MCP Server:
Cursor (~/.cursor/mcp.json or <project-root>/.cursor/mcp.json)
{
"mcpServers": {
"askdocs-mcp": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"--network=host",
"--volume=${workspaceFolder}/docs:/docs",
"ghcr.io/dymk/askdocs-mcp:latest"
]
}
}
}
Codex (~/.codex/config.toml)
[mcp_servers.askdocs-mcp]
command = "docker"
args = [
"run", "-i", "--rm",
"--network=host",
"--volume=/your/workspace/folder/docs:/docs",
"ghcr.io/dymk/askdocs-mcp:latest"
]
Environment variable:
ASKDOCS_OLLAMA_URL: Ollama server URL (default: http://localhost:11434)
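If the container cannot reach Ollama on localhost (for example under Docker Desktop, where --network=host may not expose host ports), point the server at it explicitly. A sketch using Docker Desktop's host.docker.internal alias:
docker run -i --rm \
  -e ASKDOCS_OLLAMA_URL=http://host.docker.internal:11434 \
  --volume=./docs:/docs \
  ghcr.io/dymk/askdocs-mcp:latest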
Available Tools
list_docs()
List all documentation sources.
ask_docs(source_name: str, query: str)
Search documentation with natural language.
get_doc_page(source_name: str, page_start: int, page_end: int = None)
Retrieve full text from specific pages.
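A typical session chains the three tools, shown here in the same notation as above (argument values are illustrative and assume the Quick Start config):
list_docs()                                                        # discover configured sources, e.g. "my_manual"
ask_docs(source_name="my_manual", query="How do I configure authentication?")  # semantic answer with page citations
get_doc_page(source_name="my_manual", page_start=12, page_end=14)  # retrieve the cited pages verbatim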
Requirements
Ollama must be running with the required models:
ollama pull snowflake-arctic-embed:latest
ollama pull qwen3:14b
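To verify that Ollama is running and both models are available, query its tags endpoint (standard Ollama HTTP API; the jq filter is optional):
curl -s http://localhost:11434/api/tags | jq '.models[].name'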
Building
# Docker
docker build -t askdocs-mcp:latest .
# Local
uv sync
uv run askdocs-mcp --docs-dir /path/to/docs
License
MIT