Why this server?
Enables AI assistants to interact with the Plane project management platform, allowing them to manage workspaces, projects, issues, and comments through a structured API.
Why this server?
Facilitates interactive software development planning by managing tasks, tracking progress, and creating detailed implementation plans through the Model Context Protocol.
Why this server?
Enables management of development projects with GitHub integration, facilitating project tracking, repository linking, and metadata maintenance through the Model Context Protocol.
Why this server?
A collection of tools for interacting with Jira via the Model Context Protocol, providing core functionalities like fetching and analyzing issues, plus a guided Issue Creation Wizard.
Why this server?
The Fibery MCP server integrates Fibery with any LLM provider that supports the Model Context Protocol, allowing you to explore the Fibery Workspace Schema, query databases, and create and update entities using natural language.
Why this server?
Enables AI models to interact with Linear for issue tracking and project management through the Model Context Protocol, supporting capabilities like creating issues, searching, managing sprints, and bulk updating statuses.
Why this server?
A Model Context Protocol (MCP) server that integrates with OmniFocus to enable Claude (or other MCP-compatible AI assistants) to interact with your tasks and projects.
Why this server?
A Model Context Protocol server that enables Claude and other AI assistants to access and update Kintone data through natural language commands, supporting operations like record management, file handling, app administration, and space collaboration.
Why this server?
Enables LLMs to interact with GitHub issues by exposing issue details as tasks, allowing for seamless integration and task management through GitHub's platform.
Why this server?
An MCP server that enables AI assistants like Claude to help users manage their GitHub notifications through natural language commands.
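Every server in this list speaks the Model Context Protocol, so a client connects to each of them the same way: launch the server, list its tools, and call one by name. Below is a minimal sketch using the official @modelcontextprotocol/sdk TypeScript client; the server command ("example-mcp-server") and the "create_issue" tool name are hypothetical placeholders, not the actual interface of any particular server above.

```typescript
// Minimal MCP client sketch: connect to a server over stdio, list its tools,
// and call one. The server package and tool name below are placeholders.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server as a subprocess and talk to it over stdio.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "example-mcp-server"], // placeholder: substitute the server's real package
  });

  const client = new Client(
    { name: "example-client", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Discover the tools the server exposes (issue creation, search, etc.).
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Invoke a tool by name with JSON arguments matching its input schema.
  const result = await client.callTool({
    name: "create_issue", // hypothetical tool name
    arguments: { title: "Fix login bug", description: "Steps to reproduce…" },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```

The same client code works regardless of which server is launched; only the tool names and argument schemas returned by listTools() differ from server to server.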