
Claude Skills MCP Server


Use Anthropic's Agent Skills from any MCP-compatible AI application - including Cursor, Codex, GPT-5, Gemini, and more. This MCP server brings Anthropic's Agent Skills framework to the entire AI ecosystem through the Model Context Protocol.

A Model Context Protocol (MCP) server that provides intelligent search capabilities for discovering relevant Claude Agent Skills using vector embeddings and semantic similarity. This server implements the same progressive disclosure architecture that Anthropic describes in their Agent Skills engineering blog, making specialized skills available to any MCP-compatible AI application.

An open-source project by K-Dense AI - creators of autonomous AI scientists for scientific research.

This MCP server enables any MCP-compatible AI assistant to intelligently search and retrieve skills from our curated Claude Scientific Skills repository and other skill sources like the Official Claude Skills. If you want substantially more advanced capabilities, compute infrastructure, and enterprise-ready AI scientist offerings, check out K-Dense AI's commercial platform.

Features

  • 🔍 Semantic Search: Vector embeddings for intelligent skill discovery

  • 📚 Progressive Disclosure: Multi-level skill loading (metadata → full content → files)

  • 🚀 Zero Configuration: Works out of the box with curated skills

  • 🌐 Multi-Source: Load from GitHub repositories and local directories

  • ⚡ Fast & Local: No API keys needed, with automatic GitHub caching

  • 🔧 Configurable: Customize sources, models, and content limits

Quick Start

Using uvx (Recommended)

Run the server with default configuration (no installation required):

uvx claude-skills-mcp

This loads ~90 skills from Anthropic's official skills repository and K-Dense AI's scientific skills collection.

With Custom Configuration

To customize skill sources or search parameters:

# 1. Print the default configuration
uvx claude-skills-mcp --example-config > config.json

# 2. Edit config.json to your needs

# 3. Run with your custom configuration
uvx claude-skills-mcp --config config.json

Setup for Your AI Assistant

Cursor

Add to your MCP settings (~/.cursor/mcp.json):

{
  "mcpServers": {
    "claude-skills": {
      "command": "uvx",
      "args": ["claude-skills-mcp"]
    }
  }
}

Restart Cursor and the skills will be available to the AI assistant.

Claude Desktop

Add to your MCP settings:

{
  "mcpServers": {
    "claude-skills": {
      "command": "uvx",
      "args": ["claude-skills-mcp"]
    }
  }
}

Restart Claude Desktop to activate.

Other MCP-Compatible Tools

Any tool supporting the Model Context Protocol can use this server via uvx claude-skills-mcp. Consult your tool's MCP configuration documentation.

Architecture

Built on five core components: Configuration (JSON-based config loading), Skill Loader (GitHub + local with automatic caching), Search Engine (sentence-transformers vector search), MCP Server (three tools with stdio transport), and CLI Entry Point (argument parsing and lifecycle management).

See Architecture Guide for detailed design, data flow, and extension points.

Configuration

The server uses a JSON configuration file to specify skill sources and search parameters.

Default Configuration

If no config file is specified, the server uses these defaults:

{
  "skill_sources": [
    { "type": "github", "url": "https://github.com/anthropics/skills" },
    { "type": "github", "url": "https://github.com/K-Dense-AI/claude-scientific-skills" },
    { "type": "local", "path": "~/.claude/skills" }
  ],
  "embedding_model": "all-MiniLM-L6-v2",
  "default_top_k": 3,
  "max_skill_content_chars": null
}

This loads ~90 skills by default: 15 from Anthropic (document tools, web artifacts, etc.) + 78 from K-Dense AI (scientific analysis tools) + any custom local skills.

Configuration Options

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| skill_sources | Array | Anthropic repo | GitHub repos or local paths |
| embedding_model | String | all-MiniLM-L6-v2 | Sentence-transformers model |
| default_top_k | Integer | 3 | Number of results to return |
| max_skill_content_chars | Integer/null | null | Content truncation limit |
| load_skill_documents | Boolean | true | Load additional skill files |
| max_image_size_bytes | Integer | 5242880 | Max image size (5 MB) |

To customize, run uvx claude-skills-mcp --example-config > config.json to see all options, or check Usage Guide for advanced patterns.
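
For example, a minimal custom configuration that adds your own skills repository alongside a local directory might look like the fragment below. The example-org URL is a placeholder; the option names match the defaults shown above.

```json
{
  "skill_sources": [
    { "type": "github", "url": "https://github.com/example-org/my-skills" },
    { "type": "local", "path": "~/.claude/skills" }
  ],
  "embedding_model": "all-MiniLM-L6-v2",
  "default_top_k": 5
}
```

Any option you omit falls back to its default value.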

MCP Tools

The server provides three tools for working with Claude Agent Skills:

  1. search_skills - Semantic search for relevant skills based on task description

  2. read_skill_document - Retrieve specific files (scripts, data, references) from skills

  3. list_skills - View complete inventory of all loaded skills (for exploration/debugging)

See API Documentation for detailed parameters, examples, and best practices.

Quick Examples

Find skills: "I need to analyze RNA sequencing data"
Access files: "Show me Python scripts from the scanpy skill"
List all: "What skills are available?"

For task-oriented queries, prefer search_skills over list_skills.

Skill Format

The server searches for SKILL.md files with the following format:

---
name: Skill Name
description: Brief description of what this skill does
---

# Skill Name

[Full skill content in Markdown...]
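
As a rough illustration (not the server's actual implementation), a SKILL.md document in this format can be split into frontmatter metadata and Markdown body with a few lines of Python. This sketch assumes simple `key: value` frontmatter delimited by `---` lines:

```python
import re


def parse_skill(text: str) -> tuple[dict, str]:
    """Split a SKILL.md document into (metadata, body)."""
    # Frontmatter is everything between the first two `---` lines.
    match = re.match(r"^---\n(.*?)\n---\n(.*)$", text, re.DOTALL)
    if not match:
        return {}, text
    meta = {}
    for line in match.group(1).splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta, match.group(2).strip()


doc = """---
name: Skill Name
description: Brief description of what this skill does
---
# Skill Name

Full skill content in Markdown...
"""
meta, body = parse_skill(doc)
# meta["name"] == "Skill Name"; body starts with "# Skill Name"
```

The `name` and `description` fields are what the search engine embeds; the body is only returned on demand.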

Technical Details

Dependencies

  • mcp>=1.0.0 - Model Context Protocol

  • sentence-transformers>=2.2.0 - Vector embeddings (uses CPU-only PyTorch on Linux)

  • numpy>=1.24.0 - Numerical operations

  • httpx>=0.24.0 - HTTP client for GitHub API

Note on PyTorch: This project uses CPU-only PyTorch on Linux systems to avoid unnecessary CUDA dependencies (~3-4 GB). This significantly reduces Docker image size and build time while maintaining full functionality for semantic search.

Python Version

  • Requires: Python 3.12 (not 3.13)

  • Dependencies are automatically managed by uv/uvx

Performance

  • Startup time: ~10-20 seconds (loads SKILL.md files only with lazy document loading)

  • Query time: <1 second for vector search

  • Document access: On-demand with automatic disk caching

  • Memory usage: ~500MB (embedding model + indexed skills)

  • First run: Downloads ~100MB embedding model (cached thereafter)

  • Docker image size: ~1-2 GB (uses CPU-only PyTorch, no CUDA dependencies)

How It Works

This server implements Anthropic's progressive disclosure architecture:

  1. Startup: Load SKILL.md files from GitHub/local sources, generate vector embeddings

  2. Search: Match task queries against skill descriptions using cosine similarity

  3. Progressive Loading: Return metadata → full content → referenced files as needed

  4. Lazy Document Loading: Additional skill documents fetched on-demand with automatic disk caching

  5. Two-Level Caching: GitHub API responses (24h) + individual documents (permanent)

This enables any MCP-compatible AI assistant to intelligently discover and load relevant skills with minimal context overhead and fast startup. See Architecture Guide for details.
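
The core of step 2 is plain cosine similarity between the query embedding and each skill-description embedding. A minimal sketch, using toy 3-dimensional vectors in place of real all-MiniLM-L6-v2 output (the skill names here are illustrative, not actual entries):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def search(query_vec, skill_vecs, top_k=3):
    """Return the top_k skill names ranked by similarity to the query."""
    scored = [(cosine_similarity(query_vec, v), name) for name, v in skill_vecs.items()]
    return [name for _, name in sorted(scored, reverse=True)[:top_k]]


# Toy "embeddings" standing in for encoded skill descriptions.
skills = {
    "rna-seq-analysis": [0.9, 0.1, 0.0],
    "pdf-tools": [0.0, 0.2, 0.9],
    "web-artifacts": [0.1, 0.8, 0.3],
}
results = search([1.0, 0.0, 0.1], skills, top_k=2)
# "rna-seq-analysis" ranks first for this query vector
```

In the real server the vectors are 384-dimensional sentence-transformers embeddings, but the ranking logic is the same.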

Skill Sources

Load skills from GitHub repositories (direct skills or Claude Code plugins) or local directories.

By default, it loads from Anthropic's official skills repository, K-Dense AI's claude-scientific-skills repository, and any local skills in ~/.claude/skills (see Default Configuration above).

Error Handling

The server is designed to be resilient:

  • If a local folder is inaccessible, it logs a warning and continues

  • If a GitHub repo fails to load, it tries alternate branches and continues

  • If no skills are loaded, the server exits with an error message
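
That policy can be sketched as a load loop that degrades gracefully per source but fails hard only when nothing loads at all. `load_source` here is a hypothetical per-source loader, not the server's actual function:

```python
import logging

logger = logging.getLogger("claude-skills-mcp")


def load_all_skills(sources, load_source):
    """Load skills from every source, skipping sources that fail.

    Exits with an error only if *no* skills could be loaded.
    """
    skills = []
    for source in sources:
        try:
            skills.extend(load_source(source))
        except Exception as exc:
            # One bad source should not take down the whole server.
            logger.warning("Skipping source %r: %s", source, exc)
    if not skills:
        raise SystemExit("No skills loaded from any source")
    return skills


def demo_loader(source):
    """Stand-in loader: fails for one source, succeeds for the other."""
    if source == "bad-repo":
        raise OSError("clone failed")
    return [f"{source}/skill"]


loaded = load_all_skills(["good-repo", "bad-repo"], demo_loader)
# loaded == ["good-repo/skill"]
```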

Docker Deployment

Building Docker Image

docker build -t claude-skills-mcp -f Dockerfile.glama .

Running with Docker

docker run -it claude-skills-mcp

The optimized Dockerfile uses CPU-only PyTorch to minimize image size and build time while maintaining full functionality.

Development

Installation from Source

git clone https://github.com/your-org/claude-skills-mcp.git
cd claude-skills-mcp
uv sync

Running in Development

uv run claude-skills-mcp

Running with Verbose Logging

uvx claude-skills-mcp --verbose

Running Tests

# Run all tests (coverage runs automatically)
uv run pytest tests/

# Run only unit tests (fast)
uv run pytest tests/ -m "not integration"

# Run local demo (creates temporary skills)
uv run pytest tests/test_integration.py::test_local_demo -v -s

# Run repository demo (loads from K-Dense-AI scientific skills)
uv run pytest tests/test_integration.py::test_repo_demo -v -s

# Generate HTML coverage report
uv run pytest tests/ --cov-report=html
open htmlcov/index.html

Note: Coverage reporting is enabled by default. All test runs show coverage statistics.

See Testing Guide for more details.

Command Line Options

uvx claude-skills-mcp [OPTIONS]

Options:
  --config PATH       Path to configuration JSON file
  --example-config    Print default configuration (with comments) and exit
  --verbose, -v       Enable verbose logging
  --help              Show help message

Contributing

Contributions are welcome! To contribute:

  1. Report issues: Open an issue for bugs or feature requests

  2. Submit PRs: Fork, create a feature branch, ensure tests pass (uv run pytest tests/), then submit

  3. Code style: Run uvx ruff check src/ before committing

  4. Add tests: New features should include tests

For questions, email orion.li@k-dense.ai

Documentation

  • Usage Examples - Advanced configuration, real-world use cases, and custom skill creation

  • Testing Guide - Complete testing instructions, CI/CD, and coverage analysis

  • Roadmap - Future features and planned enhancements

Roadmap

We're working on MCP Sampling, sandboxed execution, binary support, and skill workflows. See our detailed roadmap for technical specifications.

License

This project is licensed under the Apache License 2.0.

Copyright 2025 K-Dense AI (https://k-dense.ai)


