Claude Skills MCP Server

Tests | Python 3.12 | License | Code style: ruff | PyPI version

Use Claude Agent Skills from any MCP-compatible AI application - including Cursor, Codex, GPT-5, Gemini, and more. This MCP server brings Anthropic's Agent Skills framework to the entire AI ecosystem through the Model Context Protocol.

A Model Context Protocol (MCP) server that provides intelligent search capabilities for discovering relevant Claude Agent Skills using vector embeddings and semantic similarity. This server implements the same progressive disclosure architecture that Anthropic describes in their Agent Skills engineering blog, making specialized skills available to any MCP-compatible AI application.

An open-source project by K-Dense AI - creators of autonomous AI scientists for scientific research.

This MCP server enables any MCP-compatible AI assistant to intelligently search and retrieve skills from our curated Claude Scientific Skills repository and other skill sources like the Official Claude Skills.

Demo

Claude Skills MCP in Action

Semantic search and progressive loading of Claude Agent Skills in Cursor

Highlights

  • Two-Package Architecture: Lightweight frontend (~15 MB) starts instantly; backend (~250 MB) downloads in background

  • No Cursor Timeout: Frontend responds in <5 seconds, avoiding Cursor's MCP startup timeout

  • Semantic Search: Vector embeddings for intelligent skill discovery

  • Progressive Disclosure: Multi-level skill loading (metadata → full content → files)

  • Zero Configuration: Works out of the box with curated skills

  • Multi-Source: Load from GitHub repositories and local directories

  • Fast & Local: No API keys needed, with automatic GitHub caching

  • Configurable: Customize sources, models, and content limits

Quick Start

For Cursor Users

Add through the Cursor Directory, or add to your Cursor config (~/.cursor/mcp.json):

{ "mcpServers": { "claude-skills": { "command": "uvx", "args": ["claude-skills-mcp"] } } }

The frontend starts instantly and displays tools, automatically downloading and starting the backend in the background (~60-120s due to RAG dependencies, one-time). Subsequent uses are instant.

Using uvx (Standalone)

Run the server with default configuration:

uvx claude-skills-mcp

This starts the lightweight frontend which auto-downloads the backend and loads ~90 skills from Anthropic's official skills repository and K-Dense AI's scientific skills collection.

With Custom Configuration

# 1. Print the default configuration
uvx claude-skills-mcp --example-config > config.json

# 2. Edit config.json to your needs

# 3. Run with your custom configuration
uvx claude-skills-mcp --config config.json
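
As a rough illustration of the kind of settings you can customize (skill sources, embedding models, and content limits), the sketch below shows what an edited config.json might contain. Every key and value here is a hypothetical placeholder, not the server's documented schema; treat the file produced by --example-config as the authoritative template.

{
  "skill_sources": [
    { "type": "github", "url": "https://github.com/your-org/your-skills" },
    { "type": "local", "path": "~/my-skills" }
  ],
  "embedding_model": "sentence-transformers/all-MiniLM-L6-v2",
  "max_skill_content_chars": 20000
}

Pass the edited file back to the server with --config config.json as shown above.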

Documentation

MCP Tools

The server provides three tools for working with Claude Agent Skills:

  1. find_helpful_skills - Semantic search for relevant skills based on task description

  2. read_skill_document - Retrieve specific files (scripts, data, references) from skills

  3. list_skills - View complete inventory of all loaded skills (for exploration/debugging)

See API Documentation for detailed parameters, examples, and best practices.
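
Under the hood these are ordinary MCP tools, so any MCP client invokes them with a standard JSON-RPC tools/call request. The request below is only a sketch: the argument name task_description and the query text are illustrative placeholders, and the actual parameter names are listed in the API documentation.

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "find_helpful_skills",
    "arguments": {
      "task_description": "analyze single-cell RNA-seq data and generate a UMAP plot"
    }
  }
}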

Architecture (v1.0.0)

The system uses a two-package architecture for optimal performance:

  • Frontend (claude-skills-mcp): Lightweight proxy (~15 MB)

    • Starts instantly (<5 seconds) ✅ No Cursor timeout!

    • Auto-downloads backend on first use

    • MCP server (stdio) for Cursor

  • Backend (claude-skills-mcp-backend): Heavy server (~250 MB)

    • Vector search with PyTorch & sentence-transformers

    • MCP server (streamable HTTP)

    • Auto-installed by frontend OR deployable standalone

Benefits:

  • ✅ Solves Cursor timeout issue (frontend starts instantly)

  • ✅ Same simple user experience (uvx claude-skills-mcp)

  • ✅ Backend downloads in background (doesn't block Cursor)

  • ✅ Can connect to remote hosted backend (no local install needed; see the sketch below)

See Architecture Guide for detailed design and data flow.
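
Because the backend speaks MCP over streamable HTTP, a client that supports URL-based MCP servers can connect to a hosted backend directly instead of going through the stdio frontend. A minimal sketch of such an entry, assuming your client accepts a url field; the host and the /mcp path are placeholders, not a real deployment:

{
  "mcpServers": {
    "claude-skills": {
      "url": "https://your-backend-host.example.com/mcp"
    }
  }
}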

Skill Sources

Load skills from GitHub repositories (direct skills or Claude Code plugins) or local directories.

By default, skills are loaded from Anthropic's official Claude Skills repository and K-Dense AI's Claude Scientific Skills collection.

Contributing

Contributions are welcome! To contribute:

  1. Report issues: Open an issue for bugs or feature requests

  2. Submit PRs: Fork, create a feature branch, ensure tests pass (uv run pytest tests/), then submit

  3. Code style: Run uvx ruff check src/ before committing

  4. Add tests: New features should include tests

Development

Version Management: This monorepo uses a centralized version system:

  • Edit the VERSION file at the repo root to bump the version

  • Run python3 scripts/sync-version.py to sync all references (or use --check to verify)

  • The scripts/build-all.sh script automatically syncs versions before building

For questions, email orion.li@k-dense.ai

Join Our Community! 🚀

We'd love to have you in our Slack community! Connect with other users, share tips and tricks, get help with your skills, and be the first to know about new features and updates.

👉 Join the K-Dense Community on Slack 👈

Whether you're building custom skills, integrating with different AI models, or just exploring the possibilities of Agent Skills, our community is here to support you!

Learn More

License

This project is licensed under the Apache License 2.0.

Copyright 2025 K-Dense AI (https://k-dense.ai)

Star History

Star History Chart
