Why this server?
Enables AI models to interact with Jira through a standardized protocol, offering full Jira REST API integration with connection pooling for performance, robust error handling, and request monitoring.
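For a sense of what connection pooling and request monitoring look like on the client side, here is a minimal Python sketch of a pooled Jira REST call. The base URL, token, and bearer-style auth are placeholders for illustration, not details taken from this server.

```python
import logging
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("jira-client")

# Placeholder instance and credential -- not taken from the server above.
JIRA_BASE_URL = "https://jira.example.com"
API_TOKEN = "your-api-token"

# A single Session reuses TCP connections (pooling) and retries transient failures.
session = requests.Session()
retry = Retry(total=3, backoff_factor=0.5, status_forcelist=[429, 500, 502, 503])
session.mount("https://", HTTPAdapter(pool_connections=10, pool_maxsize=10, max_retries=retry))
session.headers.update({"Authorization": f"Bearer {API_TOKEN}", "Accept": "application/json"})

def get_issue(key: str) -> dict:
    """Fetch one issue and log the request/response for basic monitoring."""
    url = f"{JIRA_BASE_URL}/rest/api/2/issue/{key}"
    resp = session.get(url, timeout=10)
    log.info("GET %s -> %s", url, resp.status_code)
    resp.raise_for_status()  # turn HTTP errors into exceptions instead of bad data
    return resp.json()

print(get_issue("PROJ-123")["fields"]["summary"])
```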
Why this server?
Provides integration with Jira's REST API, allowing AI assistants to manage Jira issues programmatically.
Why this server?
Enables AI agents to manage issues, projects, and teams on the Linear platform programmatically.
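Linear exposes a GraphQL API, so a server like this would issue queries of roughly the following shape to list issues. This is a hedged sketch: the personal API key is a placeholder, and the selected fields are just one reasonable choice.

```python
import requests

LINEAR_API_KEY = "lin_api_..."  # placeholder personal API key

# List a handful of issues with their human-readable identifiers and workflow states.
query = """
query {
  issues(first: 5) {
    nodes { identifier title state { name } }
  }
}
"""

resp = requests.post(
    "https://api.linear.app/graphql",
    headers={"Authorization": LINEAR_API_KEY, "Content-Type": "application/json"},
    json={"query": query},
    timeout=10,
)
resp.raise_for_status()
for issue in resp.json()["data"]["issues"]["nodes"]:
    print(issue["identifier"], issue["title"], "-", issue["state"]["name"])
```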
Why this server?
Enables AI agents to work with GitHub issues by exposing issue details as tasks, supporting task management directly through GitHub's platform.
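As an illustration of turning GitHub issues into tasks, here is a small Python sketch against GitHub's public REST API. The repository and token are placeholders, and the task shape is just one possible mapping, not this server's actual schema.

```python
import requests

# Placeholder repository and token -- adjust for your own setup.
OWNER, REPO = "octocat", "hello-world"
TOKEN = "ghp_your_token_here"

def list_open_issues() -> list[dict]:
    """Return open issues in a shape an agent could treat as tasks."""
    resp = requests.get(
        f"https://api.github.com/repos/{OWNER}/{REPO}/issues",
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Accept": "application/vnd.github+json",
        },
        params={"state": "open"},
        timeout=10,
    )
    resp.raise_for_status()
    return [
        {"task": issue["title"], "details": issue.get("body") or "", "url": issue["html_url"]}
        for issue in resp.json()
        if "pull_request" not in issue  # the issues endpoint also returns pull requests
    ]

for task in list_open_issues():
    print(task["task"], "->", task["url"])
```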
Why this server?
A Python-based server that integrates with Jira, enabling project management and interaction through custom APIs.
Why this server?
Enables integration between Home Assistant and large language models (LLMs), allowing natural language interaction for smart home control and automation management.
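Under the hood, a natural-language request such as "turn on the living room lights" ends up as a Home Assistant service call. A minimal sketch of that underlying call, assuming a local instance, a long-lived access token, and a light.living_room entity (all placeholders):

```python
import requests

# Assumed local Home Assistant instance and a long-lived access token.
HA_URL = "http://homeassistant.local:8123"
HA_TOKEN = "your-long-lived-access-token"

def call_service(domain: str, service: str, entity_id: str) -> None:
    """Invoke a Home Assistant service, e.g. turning a light on."""
    resp = requests.post(
        f"{HA_URL}/api/services/{domain}/{service}",
        headers={"Authorization": f"Bearer {HA_TOKEN}"},
        json={"entity_id": entity_id},
        timeout=10,
    )
    resp.raise_for_status()

# "Turn on the living room lights" would resolve to a call like this:
call_service("light", "turn_on", "light.living_room")
```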
Why this server?
An MCP server that enables AI agents to interact with Atlassian products (Confluence and Jira) for content management, issue tracking, and project management through a standardized interface.
Why this server?
Helps you analyze chess positions and get professional evaluations using Stockfish.
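For comparison, this is roughly what a Stockfish evaluation looks like when driven directly from Python with the python-chess package. A Stockfish binary on PATH is assumed, and the position and search depth are arbitrary examples.

```python
import chess
import chess.engine

# Position after 1. e4 e5 -- replace with any FEN you want evaluated.
board = chess.Board()
board.push_san("e4")
board.push_san("e5")

engine = chess.engine.SimpleEngine.popen_uci("stockfish")  # assumes "stockfish" is on PATH
info = engine.analyse(board, chess.engine.Limit(depth=18))
print("Best line starts with:", info["pv"][0])
print("Score (from White's point of view):", info["score"].white())
engine.quit()
```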
Why this server?
A Model Context Protocol (MCP) server for Atlassian Cloud products (Confluence and Jira). This integration is designed specifically for Atlassian Cloud instances and does not support Atlassian Server or Data Center deployments.
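The Cloud-only restriction mostly shows up in the host name and authentication: Cloud sites live at your-site.atlassian.net and authenticate with an account email plus API token rather than Server/Data Center credentials. A sketch of a JQL search under those assumptions (the site, credentials, and JQL below are placeholders):

```python
import requests
from requests.auth import HTTPBasicAuth

# Placeholder Atlassian Cloud site and credentials (Cloud uses email + API token).
SITE = "https://your-site.atlassian.net"
AUTH = HTTPBasicAuth("you@example.com", "your-api-token")

resp = requests.get(
    f"{SITE}/rest/api/3/search",
    params={"jql": "project = PROJ AND statusCategory != Done", "maxResults": 5},
    auth=AUTH,
    headers={"Accept": "application/json"},
    timeout=10,
)
resp.raise_for_status()
for issue in resp.json()["issues"]:
    print(issue["key"], issue["fields"]["summary"])
```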
Why this server?
Enables Claude (or any other LLM) to interactively debug your code (set breakpoints and evaluate expressions in a stack frame). It's language-agnostic, assuming debugger console support and a valid launch.json for debugging in VSCode.