
Dedalus MCP Documentation Server

by kitan23

ask_docs

Answers questions about documentation using AI, drawing context from the server's documents, and returns AI-generated answers together with their sources.

Instructions

Answer questions about documentation using AI

Args:
    question: The question to answer
    context_docs: Optional list of document paths to use as context
    max_context_length: Maximum characters of context to include
    user_id: Optional user identifier for rate limiting

Returns:
    AI-generated answer with sources

Input Schema

Name                Required  Description                                        Default
question            Yes       The question to answer                             -
context_docs        No        Optional list of document paths to use as context  None
max_context_length  No        Maximum characters of context to include           4000
user_id             No        Optional user identifier for rate limiting         None
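Calling the tool over MCP amounts to a JSON-RPC tools/call request. A sketch of such a payload, built as a Python dict, with illustrative argument values (the question and max_context_length here are made up for the example):

```python
import json

# Illustrative MCP tools/call request for the ask_docs tool.
# Only "question" is required; the other arguments have defaults.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "ask_docs",
        "arguments": {
            "question": "How do I enable rate limiting?",
            "max_context_length": 2000,
        },
    },
}
print(json.dumps(request, indent=2))
```

An MCP client library normally builds this envelope for you; the dict is shown only to make the wire shape concrete.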

Implementation Reference

  • The primary handler function for the 'ask_docs' MCP tool. It is decorated with @mcp.tool() for automatic schema generation and registration, and implements rate limiting, automatic context retrieval via search_docs when no context documents are given, context truncation, answer generation with OpenAI's gpt-4o-mini (falling back to returning the raw context when no API key is set), source tracking, and error handling.
    @mcp.tool()
    def ask_docs(
        question: str,
        context_docs: Optional[List[str]] = None,
        max_context_length: int = 4000,
        user_id: Optional[str] = None,
    ) -> Dict[str, Any]:
        """
        Answer questions about documentation using AI
    
        Args:
            question: The question to answer
            context_docs: Optional list of document paths to use as context
            max_context_length: Maximum characters of context to include
            user_id: Optional user identifier for rate limiting
    
        Returns:
            AI-generated answer with sources
        """
        # Rate limiting check
        identifier = user_id or 'default'
        if not rate_limiter.is_allowed(identifier):
            reset_time = rate_limiter.get_reset_time(identifier)
            return {
                'error': 'Rate limit exceeded',
                'message': f'Too many requests. Please wait {reset_time} seconds before trying again.',
                'reset_in_seconds': reset_time,
                'limit': '10 requests per minute',
            }
        # If no context docs specified, search for relevant ones
        if not context_docs:
            search_results = search_docs(question, max_results=3)
            context_docs = [result['path'] for result in search_results]
    
        # Gather context from documents
        context_parts = []
        sources = []
        total_length = 0
    
        for doc_path in context_docs:
            if total_length >= max_context_length:
                break
    
            try:
                file_path = DOCS_DIR / doc_path
                content = file_path.read_text()
    
                # Truncate if needed
                remaining = max_context_length - total_length
                if len(content) > remaining:
                    content = content[:remaining] + '...'
    
                context_parts.append(f'--- {doc_path} ---\n{content}')
                sources.append(doc_path)
                total_length += len(content)
            except (OSError, UnicodeDecodeError):
                continue
    
        if not context_parts:
            return {
                'answer': "I couldn't find relevant documentation to answer your question.",
                'sources': [],
                'confidence': 'low',
            }
    
        full_context = '\n\n'.join(context_parts)
    
        # Try to use OpenAI if API key is available
        api_key = os.getenv('OPENAI_API_KEY')
        if api_key:
            try:
                from openai import OpenAI
    
                client = OpenAI(api_key=api_key)
    
                response = client.chat.completions.create(
                    model='gpt-4o-mini',
                    messages=[
                        {
                            'role': 'system',
                            'content': 'You are a helpful assistant that answers questions based on provided documentation. Only use information from the provided context.',
                        },
                        {
                            'role': 'user',
                            'content': f"""Based on the following documentation, please answer this question: {question}
    
    Documentation:
    {full_context}
    
    Please provide a clear, concise answer based only on the provided documentation.""",
                        },
                    ],
                    temperature=0.7,
                    max_tokens=500,
                )
    
                return {
                    'answer': response.choices[0].message.content,
                    'sources': sources,
                    'context_length': total_length,
                    'model': 'gpt-4o-mini',
                    'confidence': 'high',
                }
            except Exception as e:
                # Fall back to context-only response if OpenAI fails
                return {
                    'answer': f'Error using OpenAI: {str(e)}',
                    'context': full_context[:500] + '...'
                    if len(full_context) > 500
                    else full_context,
                    'sources': sources,
                    'context_length': total_length,
                    'error': str(e),
                }
    
        # If no API key, return context for Dedalus deployment
        return {
            'question': question,
            'context': full_context[:500] + '...'
            if len(full_context) > 500
            else full_context,
            'sources': sources,
            'context_length': total_length,
            'note': "No API key found. When deployed to Dedalus, this will use the platform's LLM integration via BYOK",
        }
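    The truncation loop in the handler can be exercised in isolation. A standalone sketch (the real handler reads files from DOCS_DIR; here a plain dict stands in for the filesystem):

    ```python
    def gather_context(docs: dict, max_context_length: int = 20):
        # docs maps path -> content, standing in for files under DOCS_DIR
        context_parts, sources, total_length = [], [], 0
        for doc_path, content in docs.items():
            if total_length >= max_context_length:
                break
            remaining = max_context_length - total_length
            if len(content) > remaining:
                # Same truncation as the handler: cut to the remaining budget
                # and mark the cut with an ellipsis
                content = content[:remaining] + '...'
            context_parts.append(f'--- {doc_path} ---\n{content}')
            sources.append(doc_path)
            total_length += len(content)
        return context_parts, sources, total_length

    parts, sources, total = gather_context(
        {'a.md': 'x' * 15, 'b.md': 'y' * 15}, max_context_length=20
    )
    print(sources, total)
    ```

    Note that, as in the handler, total_length is counted after the '...' marker is appended, so it can slightly exceed max_context_length.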
  • Function signature and docstring defining the input schema (question: str, context_docs: Optional[List[str]], etc.) and output format (Dict with answer, sources, etc.). Used by MCP framework for tool schema validation.
    def ask_docs(
        question: str,
        context_docs: Optional[List[str]] = None,
        max_context_length: int = 4000,
        user_id: Optional[str] = None,
    ) -> Dict[str, Any]:
        """
        Answer questions about documentation using AI
    
        Args:
            question: The question to answer
            context_docs: Optional list of document paths to use as context
            max_context_length: Maximum characters of context to include
            user_id: Optional user identifier for rate limiting
    
        Returns:
            AI-generated answer with sources
        """
  • src/main.py:42-54 (registration)
    The 'ask_docs' tool is documented and listed as available in the MCP server's instructions string, confirming its registration among available tools.
        instructions="""This MCP server provides access to documentation files with AI-powered search and Q&A capabilities.
        
    Available tools:
    - list_docs(): List all documentation files
    - search_docs(query): Search documentation with keywords
    - ask_docs(question): Get AI-powered answers from documentation
    - index_docs(): Index documents for better search
    - analyze_docs(task): Analyze documentation for specific tasks
    
    Resources:
    - docs://{path}: Access any markdown documentation file directly
    
    This server includes rate limiting (10 requests/minute) to protect API keys.""",
  • RateLimiter class used by ask_docs for API protection (10 req/min).
    class RateLimiter:
        """Simple rate limiter to protect API keys from abuse"""
    
        def __init__(self, max_requests: int = 10, window_seconds: int = 60):
            self.max_requests = max_requests
            self.window_seconds = window_seconds
            self.requests = defaultdict(list)
    
        def is_allowed(self, identifier: str) -> bool:
            """Check if request is allowed for this identifier"""
            now = time.time()
            # Clean old requests outside window
            self.requests[identifier] = [
                req_time
                for req_time in self.requests[identifier]
                if now - req_time < self.window_seconds
            ]
    
            # Check if under limit
            if len(self.requests[identifier]) < self.max_requests:
                self.requests[identifier].append(now)
                return True
            return False
    
        def get_reset_time(self, identifier: str) -> int:
            """Get seconds until rate limit resets"""
            if not self.requests[identifier]:
                return 0
            oldest = min(self.requests[identifier])
            return max(0, int(self.window_seconds - (time.time() - oldest)))
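    A quick standalone check of the sliding-window behavior (the relevant method is reproduced so the snippet runs on its own, with a smaller limit for brevity):

    ```python
    import time
    from collections import defaultdict

    class RateLimiter:
        def __init__(self, max_requests: int = 10, window_seconds: int = 60):
            self.max_requests = max_requests
            self.window_seconds = window_seconds
            self.requests = defaultdict(list)

        def is_allowed(self, identifier: str) -> bool:
            now = time.time()
            # Drop timestamps that have aged out of the window
            self.requests[identifier] = [
                t for t in self.requests[identifier]
                if now - t < self.window_seconds
            ]
            if len(self.requests[identifier]) < self.max_requests:
                self.requests[identifier].append(now)
                return True
            return False

    limiter = RateLimiter(max_requests=3, window_seconds=60)
    results = [limiter.is_allowed('alice') for _ in range(4)]
    print(results)  # first three allowed, fourth rejected
    ```

    Because each identifier keeps its own timestamp list, one caller hitting the limit does not block others.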
  • The get_doc_metadata helper, used indirectly (via search_docs) to supply document information.
    def get_doc_metadata(file_path: Path) -> Dict[str, Any]:
        """Extract metadata from markdown files"""
        if file_path in METADATA_CACHE:
            return METADATA_CACHE[file_path]
    
        metadata = {
            'title': file_path.stem.replace('-', ' ').title(),
            'path': str(file_path.relative_to(DOCS_DIR)),
            'modified': datetime.fromtimestamp(file_path.stat().st_mtime).isoformat(),
            'size': file_path.stat().st_size,
            'hash': hashlib.md5(file_path.read_bytes()).hexdigest(),
        }
    
        # Try to extract title from first # heading
        try:
            content = file_path.read_text()
            lines = content.split('\n')
            for line in lines[:10]:  # Check first 10 lines
                if line.startswith('# '):
                    metadata['title'] = line[2:].strip()
                    break
        except (OSError, UnicodeDecodeError):
            pass
    
        METADATA_CACHE[file_path] = metadata
        return metadata
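    A self-contained sketch of the title-extraction step above (the metadata cache and DOCS_DIR-relative path handling are omitted for brevity):

    ```python
    import tempfile
    from pathlib import Path

    def extract_title(file_path: Path) -> str:
        # Default: derive a title from the file name, as get_doc_metadata does
        title = file_path.stem.replace('-', ' ').title()
        try:
            # Prefer the first '# ' heading within the first 10 lines
            for line in file_path.read_text().split('\n')[:10]:
                if line.startswith('# '):
                    return line[2:].strip()
        except (OSError, UnicodeDecodeError):
            pass
        return title

    with tempfile.TemporaryDirectory() as d:
        doc = Path(d) / 'getting-started.md'
        doc.write_text('# Getting Started Guide\n\nWelcome.')
        title_with_heading = extract_title(doc)   # heading wins

        doc2 = Path(d) / 'api-reference.md'
        doc2.write_text('No heading here.')
        title_fallback = extract_title(doc2)      # falls back to file name

    print(title_with_heading, '|', title_fallback)
    ```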
