System Prompts MCP Server
Expose the prompt collection in this repository as a Model Context Protocol (MCP) server. Each prompt, summary, or tool definition maps to a dedicated MCP tool so your client can fetch the exact configuration it needs (e.g. Devin system prompt, Cursor summary) on demand.
The original prompt archive README now lives under `prompts/README.md`.
Features
- Automatic discovery – every text/YAML/JSON prompt in `prompts/` (or any directory you point to) is scanned and exposed as an MCP tool.
- Model-aware suggestions – `get_prompt_suggestion` ranks prompts against the LLM you're using (Claude, GPT, Gemini, etc.) and the keywords you provide.
- Quick browsing – `list_prompts` filters by service, flavor (`summary`, `system`, `tools`), or provider hints.
- Persona activation – each tool call includes a reminder for the model to embody the loaded prompt so it behaves like the original service.
- Structured responses – tool calls return both raw file contents and metadata (service, variant, path, inferred LLM family, persona hint).
Project Layout
- `src/` – TypeScript MCP server implementation
  - `index.ts` registers tools (`list_prompts`, `get_prompt_suggestion`, plus one tool per prompt file)
  - `config/prompts.ts` discovers prompt files and infers metadata
  - `lib/` – helpers for slugging, LLM detection, and ranking
- `dist/` – compiled JavaScript (created by the build step)
- `prompts/` – full prompt library and original documentation
Getting Started
`npm install` automatically registers this server with Claude Desktop (if present) by updating `~/Library/Application Support/Claude/claude_desktop_config.json`. You can opt out by removing the `postinstall` script from `package.json`.
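To install and register in one step:

```bash
npm install
```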
Start the server on stdio (suitable for Claude Desktop, Cursor MCP, etc.):
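A typical invocation, assuming the usual `build` and `start` scripts are defined in `package.json` (script names are an assumption; adjust to match the repository):

```bash
npm run build   # compile TypeScript into dist/
npm start       # launch the MCP server on stdio
```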
Run in watch/dev mode:
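```bash
npm run dev
```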
Environment variables
- `PROMPT_LIBRARY_ROOT` (optional) – overrides the prompt root. If unset, the server automatically prefers `prompts/` (when available) and falls back to the repository root.
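For example (shown with `npm start`, whose script name is an assumption as noted above):

```bash
PROMPT_LIBRARY_ROOT=/path/to/your/prompt-library npm start
```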
MCP Tools
| Tool | Description |
| --- | --- |
| `list_prompts` | Lists available prompts with optional filters (service, flavor, provider hints). |
| `get_prompt_suggestion` | Suggests the best prompt for a given LLM/service/keywords, returning ranked alternatives. |
| Per-prompt tools | One tool per prompt resource (e.g. `cursor-agent-system`). Returns the file contents plus a persona activation hint. |
Example:
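A sketch of a `get_prompt_suggestion` call; the argument names below (`llm`, `keywords`) are illustrative assumptions, so check the input schema the server advertises:

```json
{
  "llm": "claude",
  "keywords": ["coding", "agent"]
}
```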
Once you have a tool name (e.g. `cursor-agent-system`), call it, optionally passing `format: "json"` to receive structured metadata only.
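The arguments for such a call might be as small as the following (shape shown as a sketch; only the `format` field is documented above):

```json
{
  "format": "json"
}
```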
Claude Desktop Integration
Add the server to `~/Library/Application Support/Claude/claude_desktop_config.json`:
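A minimal sketch of the entry, assuming the compiled server lives at `dist/index.js` and using `system-prompts` as an arbitrary key (adjust the path to wherever you cloned the repository):

```json
{
  "mcpServers": {
    "system-prompts": {
      "command": "node",
      "args": ["/absolute/path/to/this-repo/dist/index.js"]
    }
  }
}
```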
Restart Claude Desktop to load the new MCP server, then ask for prompts by name or use the suggestion tool.
Development
- `npm run dev` – run with `ts-node` for quick iteration
- `npm run lint` – type-check without emitting files
Contributions welcome—feel free to adapt the discovery logic, add tests, or extend metadata inference for new prompt formats.