Skills Registry MCP Server
Intelligent discovery and management of Claude Skills using MCP (Model Context Protocol).
Features
🔍 Semantic Search - Find skills using natural language
⭐ Ratings & Reviews - Community-curated skill quality
💾 Favorites - Save your most-used skills
📈 Trending - Discover popular skills
🤖 Upload - Add custom skills
🏷️ Categories & Tags - Organized skill library
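Semantic search works by embedding skill descriptions and the user's query, then ranking skills by vector similarity. A minimal pure-Python sketch of the ranking step (the vectors below are toy stand-ins; real embeddings come from the OpenAI API and are stored in pgvector):

```python
from math import sqrt

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_skills(query_vec: list[float], skills: list[dict]) -> list[dict]:
    """Return skills sorted by similarity to the query embedding, best first."""
    return sorted(
        skills,
        key=lambda s: cosine_similarity(query_vec, s["embedding"]),
        reverse=True,
    )

# Toy embeddings for illustration only.
skills = [
    {"name": "pdf-master", "embedding": [0.9, 0.1, 0.0]},
    {"name": "xlsx-wizard", "embedding": [0.1, 0.9, 0.2]},
]
print(rank_skills([0.8, 0.2, 0.1], skills)[0]["name"])  # most similar first
```

In production the same ordering is done in the database with pgvector's distance operators rather than in Python.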
Quick Start (Docker)
Prerequisites
Docker and Docker Compose
OpenAI API key (for semantic search) or Anthropic API key
1. Clone and Configure
```bash
# Copy environment template
cp .env.example .env

# Edit .env and add your API key (use your preferred editor)
# On Mac/Linux: vi .env or code .env
# On Windows: notepad .env

# Or just echo it directly:
echo "OPENAI_API_KEY=sk-your-key-here" >> .env
```

2. Start Services
```bash
# Start PostgreSQL, Redis, and MCP server
docker-compose up -d

# View logs
docker-compose logs -f mcp-server
```

3. Verify and Import Skills
```bash
# Check services are running
docker-compose ps

# Import skills from GitHub (60+ skills from multiple repos)
./scripts/import_github_skills.sh

# Verify import
docker-compose exec postgres psql -U skills -d skills_registry -c "SELECT COUNT(*) FROM skills;"
```

This will import skills from:
Anthropic Official Skills (docx, pdf, pptx, xlsx, theme-factory, etc.)
Obra's Superpowers (test-driven-development, git workflows, etc.)
Composio Community Skills (changelog-generator, content-research-writer, etc.)
Other Community Skills (epub, ffuf, tapestry, etc.)
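Each imported skill is a SKILL.md file whose frontmatter carries its metadata. A minimal sketch of pulling name and description out of such a file (the exact frontmatter keys the importer reads are an assumption here):

```python
def parse_skill_md(text: str) -> dict:
    """Extract simple key: value pairs from a SKILL.md frontmatter block."""
    meta = {}
    lines = text.splitlines()
    # Frontmatter is delimited by a leading and trailing "---" line.
    if not lines or lines[0].strip() != "---":
        return meta
    for line in lines[1:]:
        if line.strip() == "---":
            break
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta

sample = """---
name: pdf
description: Extract text and tables from PDF files
---
# PDF Skill
Instructions go here.
"""
print(parse_skill_md(sample)["name"])
```

The real import script also stores the full markdown body and computes an embedding for search.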
Usage with NCP
Install NCP
```bash
npm install -g @portel/ncp
```

Add Skills Registry MCP
```bash
# Add to NCP configuration
ncp add skills-registry npx @your-org/skills-registry-mcp

# Or connect to local Docker instance
ncp add skills-registry http://localhost:8000
```

Test MCP Tools
```bash
# Search for skills
ncp find "pdf extraction"

# List categories
ncp run skills-registry:skill_list_categories

# Get trending skills
ncp run skills-registry:skill_trending --params '{"timeframe":"week"}'
```

MCP Tools Available
skill_search
Search for skills using natural language or filters.
```json
{
  "query": "create presentations with charts",
  "category": "documents",
  "min_rating": 4.0,
  "limit": 10
}
```

skill_get
Fetch complete skill content and metadata.
```json
{
  "skill_id": "pdf-master-v2",
  "user_id": "user-123"
}
```

skill_favorite_add
Add skill to favorites.
```json
{
  "skill_id": "docx-advanced",
  "user_id": "user-123"
}
```

skill_rate
Rate a skill 1-5 stars.
```json
{
  "skill_id": "xlsx-wizard",
  "user_id": "user-123",
  "rating": 5,
  "review": "Excellent for data analysis!"
}
```

skill_trending
Get popular skills.
```json
{
  "limit": 10,
  "timeframe": "week"
}
```

skill_upload
Add a custom skill.
```json
{
  "name": "API Documentation Generator",
  "description": "Generate OpenAPI specs from code",
  "skill_md_content": "# Skill Content Here...",
  "category": "development",
  "tags": ["api", "documentation"],
  "author_id": "user-123",
  "visibility": "private"
}
```

Development
Project Structure
```
skills-registry-mcp/
├── docker-compose.yml     # Service orchestration
├── Dockerfile             # MCP server container
├── init.sql               # Database schema
├── requirements.txt       # Python dependencies
├── src/
│   ├── __init__.py
│   ├── server.py          # FastMCP server
│   ├── database.py        # PostgreSQL operations
│   ├── search.py          # Semantic search
│   └── models.py          # Data models
└── skills_storage/        # Local skill files
```

Local Development
```bash
# Install dependencies
pip install -r requirements.txt

# Set environment variables
export DATABASE_URL=postgresql://skills:skills_dev_password@localhost:5432/skills_registry
export OPENAI_API_KEY=your-key-here

# Run server directly
python -m src.server
```

Import Existing Skills
```bash
# Import skills from /mnt/skills/
python scripts/import_skills.py --source /mnt/skills/ --category core
```

Database Schema
See init.sql for complete schema. Key tables:
skills - Skill metadata and content
skill_ratings - User ratings and reviews
skill_favorites - User favorites
skill_usage - Analytics tracking
skill_stats - Computed statistics view
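Conceptually, skill_stats aggregates the per-skill rows in skill_ratings. A pure-Python sketch of that aggregation (the actual view is SQL, defined in init.sql):

```python
from collections import defaultdict

def compute_stats(ratings: list[dict]) -> dict:
    """Group ratings by skill_id and compute count and average,
    mirroring what the skill_stats view does in SQL."""
    grouped = defaultdict(list)
    for r in ratings:
        grouped[r["skill_id"]].append(r["rating"])
    return {
        skill_id: {"rating_count": len(vals), "avg_rating": sum(vals) / len(vals)}
        for skill_id, vals in grouped.items()
    }

ratings = [
    {"skill_id": "xlsx-wizard", "rating": 5},
    {"skill_id": "xlsx-wizard", "rating": 4},
    {"skill_id": "pdf-master-v2", "rating": 5},
]
print(compute_stats(ratings)["xlsx-wizard"])
```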
Configuration
Environment Variables
DATABASE_URL - PostgreSQL connection string
REDIS_URL - Redis connection string
SKILLS_STORAGE_PATH - Local filesystem path for SKILL.md files
OPENAI_API_KEY - For semantic search (optional)
ANTHROPIC_API_KEY - Alternative for semantic search (optional)
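The server reads these variables at startup. A minimal sketch of that pattern with local-dev fallbacks (the default values here are illustrative, not necessarily the server's actual defaults):

```python
import os

def load_config() -> dict:
    """Read configuration from the environment, falling back to local-dev defaults."""
    return {
        "database_url": os.environ.get(
            "DATABASE_URL",
            "postgresql://skills:skills_dev_password@localhost:5432/skills_registry",
        ),
        "redis_url": os.environ.get("REDIS_URL", "redis://localhost:6379/0"),
        "skills_storage_path": os.environ.get("SKILLS_STORAGE_PATH", "./skills_storage"),
        # Semantic search is optional: with no key set, keyword search still works.
        "openai_api_key": os.environ.get("OPENAI_API_KEY"),
    }

config = load_config()
print(sorted(config.keys()))
```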
Docker Compose Services
postgres - PostgreSQL 15 with pgvector
redis - Redis 7 for caching
mcp-server - FastMCP server
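A docker-compose.yml wiring these three services together looks roughly like the following sketch (image tags, volumes, and port mappings are illustrative; the repository's own file is authoritative):

```yaml
services:
  postgres:
    image: pgvector/pgvector:pg15   # PostgreSQL 15 with the pgvector extension
    environment:
      POSTGRES_USER: skills
      POSTGRES_DB: skills_registry
      POSTGRES_PASSWORD: skills_dev_password
    volumes:
      - ./init.sql:/docker-entrypoint-initdb.d/init.sql
  redis:
    image: redis:7
  mcp-server:
    build: .
    env_file: .env
    ports:
      - "8000:8000"
    depends_on:
      - postgres
      - redis
```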
Roadmap
Phase 1: MVP with local search ✅
Phase 2: Semantic search with embeddings ✅
Phase 3: Import existing skills from /mnt/skills/
Phase 4: Cloud-hosted registry option
Phase 5: Web UI for browsing
Phase 6: Skill versioning system
License
MIT