Task Manager MCP Server
A template implementation of a Model Context Protocol (MCP) server for managing tasks and projects. This server provides a comprehensive task management system with support for project organization, task tracking, and PRD parsing.
Overview
This project demonstrates how to build an MCP server that enables AI agents to manage tasks, track project progress, and break down Product Requirements Documents (PRDs) into actionable tasks. It serves as a practical template for creating your own MCP servers with task management capabilities.
The implementation follows the best practices laid out by Anthropic for building MCP servers, allowing seamless integration with any MCP-compatible client.
Features
The server provides several essential task management tools:
- Task Management
  - create_task_file: Create new project task files
  - add_task: Add tasks to projects with descriptions and subtasks
  - update_task_status: Update the status of tasks and subtasks
  - get_next_task: Get the next uncompleted task from a project
- Project Planning
  - parse_prd: Convert PRDs into structured tasks automatically
  - expand_task: Break down tasks into smaller, manageable subtasks
  - estimate_task_complexity: Estimate task complexity and time requirements
  - get_task_dependencies: Track task dependencies
- Development Support
  - generate_task_file: Generate file templates based on task descriptions
  - suggest_next_actions: Get AI-powered suggestions for next steps
Prerequisites
- Python 3.12+
- An API key for your chosen LLM provider (OpenAI or OpenRouter; a local Ollama instance needs no key)
- Docker if running the MCP server as a container (recommended)
Installation
Using uv
- Install uv if you don't have it
- Clone this repository
- Install dependencies
- Create a .env file based on .env.example
- Configure your environment variables in the .env file (see the Configuration section)
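A typical command sequence for these steps, assuming a uv-based project layout (the repository URL and directory name are placeholders, not the project's actual location):

```shell
# Install uv (see https://docs.astral.sh/uv/ for platform-specific options)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Clone the repository (URL is a placeholder)
git clone https://github.com/<your-user>/mcp-task-manager.git
cd mcp-task-manager

# Install dependencies into a project-local environment
uv sync

# Create your environment file from the template
cp .env.example .env
```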
Using Docker (Recommended)
- Build the Docker image
- Create a .env file based on .env.example and configure your environment variables
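These steps might look like the following; the image tag is an illustrative choice, not mandated by the project:

```shell
# Build the image from the repository root
docker build -t task-manager-mcp .

# Create and edit your environment file
cp .env.example .env
```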
Configuration
The following environment variables can be configured in your .env file:
| Variable | Description | Example |
|---|---|---|
| TRANSPORT | Transport protocol (sse or stdio) | sse |
| HOST | Host to bind to when using SSE transport | 0.0.0.0 |
| PORT | Port to listen on when using SSE transport | 8050 |
| LLM_PROVIDER | LLM provider (openai, openrouter, or ollama) | openai |
| LLM_BASE_URL | Base URL for the LLM API | https://api.openai.com/v1 |
| LLM_API_KEY | API key for the LLM provider | sk-... |
| LLM_CHOICE | LLM model to use for task analysis | gpt-4 |
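Put together, a minimal .env for the OpenAI provider might look like this (all values are illustrative):

```
TRANSPORT=sse
HOST=0.0.0.0
PORT=8050
LLM_PROVIDER=openai
LLM_BASE_URL=https://api.openai.com/v1
LLM_API_KEY=sk-...
LLM_CHOICE=gpt-4
```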
Running the Server
Using Python 3
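A typical invocation, assuming the entry-point script is named main.py (check the repository for the actual name):

```shell
uv run python main.py
```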
The server will start on the configured host and port (default: http://0.0.0.0:8050).
Using Docker
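A typical invocation, reusing the image tag and .env file from the installation steps:

```shell
# Publish the SSE port configured in .env (8050 by default)
docker run --rm --env-file .env -p 8050:8050 task-manager-mcp
```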
Using the Task Manager
Creating a New Project
- Create a task file for your project:
- Add tasks to your project:
- Parse a PRD to create tasks automatically:
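At the protocol level, an MCP client invokes these tools with JSON-RPC tools/call requests. The sketch below shows the shape of such a request for create_task_file; the argument names are assumptions for illustration, and the server's published tool schema is authoritative:

```python
import json

# Sketch of the JSON-RPC 2.0 message an MCP client sends to invoke a tool.
# The argument names ("project", "filename") are illustrative, not the
# server's actual schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_task_file",
        "arguments": {"project": "my-app", "filename": "tasks.md"},
    },
}

print(json.dumps(request, indent=2))
```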
Managing Tasks
- Update task status:
- Get the next task to work on:
- Expand a task into subtasks:
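The intended semantics of update_task_status and get_next_task can be sketched with plain data structures; this is an illustration of the behavior, not the server's actual code:

```python
# Illustrative model of a project's task list; not the server's real storage.
tasks = [
    {"id": 1, "title": "Design schema", "status": "done"},
    {"id": 2, "title": "Implement API", "status": "pending"},
    {"id": 3, "title": "Write tests", "status": "pending"},
]

def update_task_status(task_id: int, status: str) -> None:
    """Set the status of the task with the given id."""
    for task in tasks:
        if task["id"] == task_id:
            task["status"] = status

def get_next_task():
    """Return the first task that is not yet done, or None."""
    return next((t for t in tasks if t["status"] != "done"), None)

update_task_status(2, "done")
print(get_next_task()["title"])  # the first remaining uncompleted task
```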
Development Workflow
- Generate a file template for a task:
- Get task complexity estimate:
- Get suggestions for next actions:
Integration with MCP Clients
SSE Configuration
To connect to the server using SSE transport, use this configuration:
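A typical SSE entry in an MCP client's configuration looks like this; the server name and URL path are illustrative, and the exact schema varies by client:

```json
{
  "mcpServers": {
    "task-manager": {
      "transport": "sse",
      "url": "http://localhost:8050/sse"
    }
  }
}
```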
Stdio Configuration
For stdio transport, use this configuration:
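With stdio, the client launches the server process itself; the command and entry-point name below are assumptions about the repository layout:

```json
{
  "mcpServers": {
    "task-manager": {
      "command": "python",
      "args": ["main.py"],
      "env": {
        "TRANSPORT": "stdio",
        "LLM_PROVIDER": "openai",
        "LLM_API_KEY": "sk-..."
      }
    }
  }
}
```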
Building Your Own Server
This template provides a foundation for building more complex task management MCP servers. To extend it:
- Add new task management tools using the @mcp.tool() decorator
- Implement custom task analysis and automation features
- Add project-specific task templates and workflows
- Integrate with your existing development tools and processes
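As a self-contained illustration of the decorator pattern, the sketch below stands in for the SDK's @mcp.tool(); the registry, tool name, and return value are invented for the example, and in the real server you would decorate the function with @mcp.tool() instead:

```python
# Stand-in for the @mcp.tool() registration mechanism, for illustration only.
TOOLS: dict = {}

def tool():
    def register(fn):
        TOOLS[fn.__name__] = fn  # expose the function under its own name
        return fn
    return register

@tool()
def archive_task(project: str, task_id: int) -> str:
    """Mark a task as archived in the given project file (illustrative)."""
    return f"archived task {task_id} in {project}"

print(archive_task("demo", 7))
```

The real decorator additionally derives the tool's input schema from the function signature and docstring, which is why typed parameters and a clear docstring matter when adding tools.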