A lightweight MCP server that provides a unified interface to various LLM providers including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama.
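To illustrate the pattern (this is a hedged sketch, not code from the server itself), a unified gateway like this might expose a single MCP tool that routes a prompt to a chosen provider. `FastMCP` is from the official Python MCP SDK; the provider registry and routing below are placeholder assumptions.

```python
# Hypothetical sketch of a unified LLM gateway exposed as one MCP tool.
# FastMCP comes from the official Python MCP SDK; the provider registry
# and routing here are placeholders, not this server's implementation.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("llm-gateway")

# Placeholder registry mapping provider names to (base URL, default model).
PROVIDERS = {
    "openai": ("https://api.openai.com/v1", "gpt-4o-mini"),
    "ollama": ("http://localhost:11434/v1", "llama3"),
}

@mcp.tool()
def chat(provider: str, prompt: str) -> str:
    """Send a prompt to the chosen provider and return its reply."""
    if provider not in PROVIDERS:
        raise ValueError(f"unknown provider: {provider}")
    base_url, model = PROVIDERS[provider]
    # A real server would call the provider's OpenAI-compatible endpoint here.
    return f"[{model} via {base_url}] response to: {prompt}"

if __name__ == "__main__":
    mcp.run()
```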
A Model Context Protocol server that provides specialized prompt suggestions for backend development, frontend development, and general tasks to help LLMs generate better content.
Enables cleaning and sanitizing prompts through an LLM-powered tool that removes sensitive information, provides structured feedback with notes and risks, and normalizes prompt formatting. Supports configurable local or remote OpenAI-compatible APIs with automatic secret redaction.
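As a rough illustration of the kind of structured result such a sanitization tool could return (the field names and redaction rules below are assumptions, not the server's actual schema), a minimal redaction pass might look like this:

```python
# Illustrative sketch only: field names and redaction patterns are
# assumptions, not the schema used by the server described above.
import re

SECRET_PATTERNS = [
    (re.compile(r"sk-[A-Za-z0-9]{20,}"), "[REDACTED_API_KEY]"),      # OpenAI-style keys
    (re.compile(r"(?i)password\s*[:=]\s*\S+"), "password=[REDACTED]"),
]

def sanitize(prompt: str) -> dict:
    """Return the cleaned prompt plus notes and risks, mirroring the
    'structured feedback' idea in the description above."""
    notes, risks = [], []
    cleaned = prompt
    for pattern, replacement in SECRET_PATTERNS:
        if pattern.search(cleaned):
            risks.append(f"matched secret pattern: {pattern.pattern}")
            cleaned = pattern.sub(replacement, cleaned)
            notes.append(f"replaced match with {replacement}")
    # Whitespace normalization as a stand-in for prompt-formatting cleanup.
    cleaned = " ".join(cleaned.split())
    return {"prompt": cleaned, "notes": notes, "risks": risks}

print(sanitize("Use key sk-abcdefghijklmnopqrstuv and password: hunter2"))
```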
A Model Context Protocol server that provides predefined prompt templates for tasks like code review and API documentation generation, enabling more efficient workflows in Cursor and Windsurf editors.
MCP server for managing and serving dynamic prompt templates using an elegant and powerful text template engine. Create reusable, logic-driven prompts with variables, partials, and conditionals that can be served to any compatible MCP client, such as Claude Code, Claude Desktop, or Gemini CLI. A hedged sketch of the idea follows below.
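The sketch below shows what a templated, logic-driven prompt served over MCP can look like; it is an assumption-laden illustration, not this server's code. Jinja2 stands in for the template engine (the actual engine is not specified here), with an include acting as a partial and a conditional controlling optional text, and `FastMCP` from the official Python MCP SDK serving the result as a prompt.

```python
# Sketch under assumptions: Jinja2 is used here only as a stand-in template
# engine, and the prompt/template names are hypothetical.
from jinja2 import Environment, DictLoader
from mcp.server.fastmcp import FastMCP

templates = {
    # A reusable "partial" shared across prompts.
    "style.j2": "Answer concisely and cite sources where possible.",
    # Main template: variables, an include (partial), and a conditional.
    "review.j2": (
        "{% include 'style.j2' %}\n"
        "Review the following {{ language }} code."
        "{% if focus %} Focus on {{ focus }}.{% endif %}\n{{ code }}"
    ),
}
env = Environment(loader=DictLoader(templates))
mcp = FastMCP("prompt-templates")

@mcp.prompt()
def code_review(language: str, code: str, focus: str = "") -> str:
    """Render the code-review template and serve it as an MCP prompt."""
    return env.get_template("review.j2").render(
        language=language, code=code, focus=focus
    )

if __name__ == "__main__":
    mcp.run()
```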