Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Dedalus MCP Documentation Server search for rate limiting configuration in the docs".
That's it! The server will respond to your query, and you can continue using it as needed.
Dedalus MCP Documentation Server
An MCP server for serving and querying documentation with AI capabilities. Built for the YC Agents Hackathon.
Quick Start (Local Development)
```bash
# Install uv package manager (same as Dedalus uses)
brew install uv  # or pip install uv

# Install dependencies
uv sync --no-dev

# Configure API keys for AI features
cp config/.env.example .env.local
# Edit .env.local and add your OpenAI API key

# Test
uv run python tests/test_server.py

# Run
uv run main
```

Deploy to Dedalus
What Dedalus Needs
- `pyproject.toml` - Package configuration with dependencies
- `main.py` (root) - Entry point that Dedalus expects
- `src/main.py` - The actual MCP server code
- `docs/` - Your documentation files
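For orientation, the layout above might correspond to a `pyproject.toml` roughly like the following. This is a hedged sketch: the package name, version, and exact dependency list are assumptions for illustration, not taken from the repository.

```toml
[project]
name = "docs-mcp-server"        # hypothetical name
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
    "mcp",                      # assumed MCP SDK dependency
    "openai",                   # assumed, for the AI-powered Q&A features
]

# A script entry like this is what lets `uv run main` start the server.
[project.scripts]
main = "main:main"              # assumed mapping to the root main.py entry point
```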
Deployment Steps
Set Environment Variables in Dedalus UI:
- `OPENAI_API_KEY` - Your OpenAI API key (required for AI features)
Deploy:
```bash
dedalus deploy . --name "your-docs-server"
```

How Dedalus Runs Your Server
1. Installs dependencies from `pyproject.toml` using `uv sync`
2. Runs `uv run main` to start the server
3. The server runs in the `/app` directory in the container
4. Docs are served from `/app/docs`
Features
Serve markdown documentation
Search across docs
AI-powered Q&A (with OpenAI)
Rate limiting (10 requests/minute) to protect API keys
Ready for agent handoffs
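The 10 requests/minute limit mentioned above could be implemented with a simple sliding-window counter. This is a hedged sketch of the general technique, assuming an in-process limiter; the class name, API, and window size here are illustrative, not the server's actual code.

```python
import time
from collections import deque


class SlidingWindowLimiter:
    """Allow at most `limit` calls per `window` seconds (hypothetical helper)."""

    def __init__(self, limit=10, window=60.0):
        self.limit = limit
        self.window = window
        self.calls = deque()  # timestamps of recent accepted calls

    def allow(self, now=None):
        """Return True if a call is permitted right now, else False."""
        now = time.monotonic() if now is None else now
        # Evict timestamps that have aged out of the window
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) < self.limit:
            self.calls.append(now)
            return True
        return False


limiter = SlidingWindowLimiter(limit=10, window=60.0)
results = [limiter.allow(now=float(i)) for i in range(12)]  # 12 calls in 12 s
print(results.count(True))  # first 10 allowed, the rest rejected
```

A limiter like this protects the upstream API key: once the window fills, further calls are refused until old timestamps expire, rather than being forwarded to OpenAI.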
Tools Available
- `list_docs()` - List documentation files
- `search_docs()` - Search with keywords
- `ask_docs()` - AI answers from docs
- `index_docs()` - Index documents
- `analyze_docs()` - Analyze for tasks
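As an illustration, a keyword tool like `search_docs()` might boil down to scanning the markdown files under `docs/`. This is a hedged sketch under that assumption; the function body and the shape of the returned results are hypothetical, not the server's actual implementation.

```python
from pathlib import Path


def search_docs(query, docs_dir="docs"):
    """Return lines matching `query` (case-insensitive) from *.md files."""
    hits = []
    for path in sorted(Path(docs_dir).glob("**/*.md")):
        text = path.read_text(encoding="utf-8")
        for lineno, line in enumerate(text.splitlines(), start=1):
            if query.lower() in line.lower():
                hits.append({"file": str(path), "line": lineno, "text": line.strip()})
    return hits
```

In an MCP server, a function like this would be registered as a tool so an agent can call it with a query string and receive structured matches back.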
Documentation
See the `docs/` directory for additional documentation.
License
MIT