# AskDocs MCP Server
A Model Context Protocol (MCP) server that provides RAG-powered semantic search over technical documentation PDFs using Ollama.
## Features

- Semantic search with natural language queries
- Multiple PDF documents with page citations
- Docker support with persistent caching
- TOML-based configuration
## Quick Start
1. Create a TOML configuration file describing your documentation sources:
2. Run with Docker:
`askdocs-mcp` expects an Ollama server to be running at `http://localhost:11434`.
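As a sketch, a Docker invocation might look like the following (the image name and volume layout are assumptions, not taken from the project):

```
# Hypothetical image name; mount the project directory so the server can
# see the PDFs and config, and use host networking so the container can
# reach Ollama on localhost:11434.
docker run --rm -i \
  --network host \
  -v "$(pwd)":/data \
  askdocs-mcp
```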
3. Directory structure:
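A layout along these lines is assumed (the config file name and PDF names are illustrative):

```
project/
├── askdocs-mcp.toml      # TOML configuration (file name is an assumption)
├── docs/
│   ├── manual-a.pdf
│   └── manual-b.pdf
└── .askdocs-cache/       # persistent embedding cache (generated)
```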
Add `.askdocs-cache/` to your `.gitignore` file.
## Configuration
Using the MCP server, register it in your client's configuration:

- **Cursor:** `~/.cursor/mcp.json` or `<project-root>/.cursor/mcp.json`
- **Codex:** `~/.codex/config.toml`
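For orientation, a Cursor entry might look like this (the command name is an assumption based on the package name; only `ASKDOCS_OLLAMA_URL` is documented below):

```json
{
  "mcpServers": {
    "askdocs": {
      "command": "askdocs-mcp",
      "env": {
        "ASKDOCS_OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```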
Environment variable:

- `ASKDOCS_OLLAMA_URL`: Ollama server URL (default: `http://localhost:11434`)
## Available Tools
### `list_docs()`

List all configured documentation sources.

### `ask_docs(source_name: str, query: str)`

Search a documentation source with a natural language query.

### `get_doc_page(source_name: str, page_start: int, page_end: int = None)`

Retrieve the full text of specific pages.
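For orientation, a `tools/call` request for `ask_docs` over MCP's JSON-RPC framing would have roughly this shape (the source name and query values are illustrative):

```python
import json

# MCP tool invocations are JSON-RPC 2.0 requests with method "tools/call";
# the tool name and its arguments go in "params".
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "ask_docs",
        "arguments": {
            "source_name": "my-manual",          # a source listed by list_docs()
            "query": "How do I configure the serial port?",
        },
    },
}
print(json.dumps(request, indent=2))
```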
## Requirements
Ollama must be running with the required embedding and LLM models available.
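The exact models depend on your configuration; as an example, pulling one embedding model and one chat model might look like this (the model names here are assumptions, not taken from the project):

```
ollama pull nomic-embed-text
ollama pull llama3.1
```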
## Building
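Assuming a Dockerfile at the repository root, a local image build might look like this (the image tag is an assumption):

```
docker build -t askdocs-mcp .
```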
## License
MIT