System Prompts MCP Server
Access system prompts from AI tools in your workflow. Browse and fetch prompts from Devin, Cursor, Claude, GPT, and more. Model-aware suggestions help you find the perfect prompt for your LLM.
An MCP (Model Context Protocol) server that exposes a collection of system prompts, summaries, and tool definitions from popular AI tools as MCP tools for AI coding environments like Cursor and Claude Desktop.
Why Use System Prompts MCP?
Automatic Discovery – Every prompt in prompts/ is automatically exposed as an MCP tool
Model-Aware Suggestions – Get prompt recommendations based on your LLM (Claude, GPT, Gemini, etc.)
Comprehensive Collection – Access prompts from Devin, Cursor, Claude, GPT, and more
Easy Setup – One-click install in Cursor or simple manual setup
Extensible – Add your own prompts and they're automatically available
Quick Start
Ready to explore system prompts? Install in seconds:
Install in Cursor (Recommended):
Or install manually (see the Installation section below).
Features
Core Tools
list_prompts – Browse available prompts with filters (service, flavor, provider)
get_prompt_suggestion – Get ranked prompt suggestions for your LLM and keywords
<service>-<variant>-<flavor> – Direct access to any prompt (e.g., cursor-agent-system, devin-summary)
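As an illustration of the <service>-<variant>-<flavor> naming convention, a minimal sketch could derive a tool name from a prompt file's relative path. The helper below is hypothetical, not the server's actual code:

```typescript
// Hypothetical helper: maps a prompt file's relative path to an MCP tool name
// following the <service>-<variant>-<flavor> convention described above.
function toToolName(relativePath: string): string {
  const withoutExt = relativePath.replace(/\.[^.]+$/, ""); // strip the file extension
  return withoutExt.split("/").join("-").toLowerCase();
}

console.log(toToolName("cursor/agent-system.txt")); // "cursor-agent-system"
console.log(toToolName("devin/summary.md"));        // "devin-summary"
```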
Automatic Discovery
Scans the prompts/ directory for .txt, .md, .yaml, .yml, and .json files
Each file becomes a dedicated MCP tool
Infers metadata (service, variant, LLM family, persona hints)
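The extension filter in the discovery step can be sketched as follows; this predicate is an illustration, not the server's actual implementation:

```typescript
// Extensions the discovery scan accepts, per the list above.
const SUPPORTED = new Set([".txt", ".md", ".yaml", ".yml", ".json"]);

// Hypothetical predicate: does this filename qualify as a prompt file?
function isPromptFile(name: string): boolean {
  const dot = name.lastIndexOf(".");
  return dot > 0 && SUPPORTED.has(name.slice(dot).toLowerCase());
}

console.log(isPromptFile("agent-system.txt")); // true
console.log(isPromptFile("README"));           // false
```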
Persona Activation
Each tool call includes a reminder for the model to embody the loaded prompt
Helps models behave like the original service (Devin, Cursor, etc.)
Installation
Cursor (One-Click)
Click the install link above or use:
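For a manual Cursor setup, an entry in .cursor/mcp.json along these lines should work; the server name and the path to the compiled entry point are assumptions you should adjust to your checkout:

```json
{
  "mcpServers": {
    "system-prompts": {
      "command": "node",
      "args": ["/path/to/system-prompts-mcp/dist/index.js"]
    }
  }
}
```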
Manual Installation
Requirements: Node.js 18+ and npm
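A typical manual setup for a TypeScript MCP server looks like the following; the repository URL and directory name are placeholders:

```shell
# Clone and build the server (URL and folder name are placeholders)
git clone <repository-url>
cd system-prompts-mcp
npm install
npm run build   # compiles src/ into dist/
```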
Claude Desktop
Add to claude_desktop_config.json:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
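A minimal entry might look like this; the server name and entry-point path are assumptions to adapt to your install location:

```json
{
  "mcpServers": {
    "system-prompts": {
      "command": "node",
      "args": ["/path/to/system-prompts-mcp/dist/index.js"]
    }
  }
}
```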
Restart Claude Desktop after configuration.
Usage Examples
List Available Prompts
Browse prompts with optional filters:
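The exact request shape depends on your MCP client; as a sketch, the arguments of a tools/call request for list_prompts might look like this (filter names taken from the Core Tools list above, filter values illustrative):

```json
{
  "name": "list_prompts",
  "arguments": {
    "service": "cursor",
    "flavor": "system"
  }
}
```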
Get Prompt Suggestions
Find the best prompt for your LLM and use case:
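A sketch of a suggestion request follows; the argument names "model" and "keywords" are hypothetical, so check list_prompts output or the tool schema for the actual parameter names:

```json
{
  "name": "get_prompt_suggestion",
  "arguments": {
    "model": "claude-3-5-sonnet",
    "keywords": ["coding", "agent"]
  }
}
```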
Access a Specific Prompt
Call a prompt directly by its tool name:
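Since each prompt is its own tool, a direct call needs no arguments; for example, for the cursor-agent-system tool mentioned above:

```json
{
  "name": "cursor-agent-system",
  "arguments": {}
}
```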
Get structured metadata only:
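A metadata-only request might pass a flag like the following; the parameter name "metadata_only" is purely hypothetical, so consult the tool's schema for the real option:

```json
{
  "name": "cursor-agent-system",
  "arguments": {
    "metadata_only": true
  }
}
```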
Adding Your Own Prompts
Add prompts by placing files in the prompts/ directory:
Supported formats: .txt, .md, .yaml, .yml, .json
Directory structure:
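A hypothetical layout illustrating these conventions:

```
prompts/
  cursor/
    agent-system.txt
    tools.json
  devin/
    summary.md
```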
Directory names become the service name
File names create tool variants
Files are automatically classified as system prompts, tools, or summaries
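The classification step could look roughly like this; the keyword heuristics are assumptions for illustration, not the server's actual rules:

```typescript
type PromptKind = "system" | "tools" | "summary";

// Hypothetical classifier: infers a prompt file's kind from its filename.
function classify(filename: string): PromptKind {
  const base = filename.toLowerCase();
  if (base.includes("tool")) return "tools";
  if (base.includes("summary")) return "summary";
  return "system"; // default: treat as a system prompt
}

console.log(classify("tools.json"));  // "tools"
console.log(classify("summary.md")); // "summary"
console.log(classify("prompt.txt")); // "system"
```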
After adding prompts, restart the MCP server. Use list_prompts to find your custom prompts.
Custom directory: Set the PROMPT_LIBRARY_ROOT environment variable to use a different location.
Use Cases
AI Tool Developers β Reference and adapt prompts from successful AI tools
Researchers β Study how different tools structure their system prompts
Developers β Find the perfect prompt for your LLM and use case
Prompt Engineers β Compare and learn from proven prompt patterns
Technical Details
Built with: Node.js, TypeScript, MCP SDK
Dependencies: @modelcontextprotocol/sdk, zod
Platforms: macOS, Windows, Linux
Environment Variables:
PROMPT_LIBRARY_ROOT (optional): Override the prompt root directory (defaults to prompts/)
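For example, the variable can be set inline when launching the server; the path and entry point below are illustrative:

```shell
# Point the server at a custom prompt library (path is a placeholder)
PROMPT_LIBRARY_ROOT=/path/to/my-prompts node dist/index.js
```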
Project Structure:
src/ – TypeScript MCP server implementation
dist/ – Compiled JavaScript
prompts/ – Prompt library and original documentation
Contributing
⭐ If this project helps you, please star it on GitHub! ⭐
Contributions welcome! Feel free to adapt the discovery logic, add tests, or extend metadata inference for new prompt formats.
License
See the original repository for license information.
Support
If you find this project useful, consider supporting it:
⚡ Lightning Network
₿ Bitcoin: bc1ptzvr93pn959xq4et6sqzpfnkk2args22ewv5u2th4ps7hshfaqrshe0xtp
Ξ Ethereum/EVM: 0x42ea529282DDE0AA87B42d9E83316eb23FE62c3f