# Taiga MCP Server
A production-ready Model Context Protocol (MCP) server for Taiga Project Management
Getting Started • Features • Tools • Architecture • Contributing
## 📋 Overview
Taiga MCP Server enables seamless integration between Large Language Models (LLMs) and the Taiga project management platform through the Model Context Protocol. Built with Python's async/await patterns and type-safe Pydantic models, it provides a robust, production-ready solution for AI-powered project management automation.
### Why Taiga MCP?
- **🤖 Natural Language Interface**: Interact with Taiga using conversational commands
- **🔄 Async-First**: Built on modern async/await for high performance
- **🛡️ Type-Safe**: Full Pydantic validation for reliability
- **🎯 Production Ready**: Comprehensive error handling and logging
- **🔌 Extensible**: Clean architecture for easy feature additions
- **📦 Zero Config**: Works out of the box with Claude Desktop, Cursor, and Windsurf
## ✨ Features

### Core Capabilities
| Feature | Description |
|---------|-------------|
| 🔐 Authentication | Token-based auth with automatic refresh |
| 📊 Project Management | List, view, and search projects by ID or slug |
| 📝 User Stories | Full CRUD operations with pagination support |
| ✅ Task Management | Create and organize tasks within stories |
| 👥 Team Collaboration | View members and assign work |
| 🏷️ Rich Metadata | Tags, story points, due dates, custom fields |
| 🔍 Flexible Queries | Support for IDs, slugs, and reference numbers (#42) |
### Technical Features
- **Async Architecture**: Non-blocking I/O for optimal performance
- **Smart Caching**: Token management with auto-refresh
- **Intelligent Pagination**: Auto-fetch all results or page through them
- **Optimistic Locking**: Version-based updates prevent conflicting writes
- **Role-Based Points**: Automatic detection and handling of per-role story points
- **Flexible Identifiers**: Use IDs, slugs, or #ref numbers interchangeably
## 🚀 Quick Start

### Prerequisites
- **Python**: 3.10 or higher
- **Taiga Account**: [taiga.io](https://taiga.io) or a self-hosted instance
- **MCP Client**: Claude Desktop, Cursor, Windsurf, or any MCP-compatible client
### Installation
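A minimal sketch of an install from source (the repository URL is a placeholder, and the editable-install layout is an assumption about the project's packaging):

```bash
# Clone the repository (URL is a placeholder) and enter it
git clone <repository-url>
cd taiga-mcp-server

# Create an isolated environment and install the package
python -m venv .venv && source .venv/bin/activate
pip install -e .
```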
### Configuration
Create a `.env` file in the project root:
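A minimal sketch; the variable names below are illustrative assumptions, not the project's documented settings:

```env
# Taiga instance to authenticate against (taiga.io or self-hosted)
TAIGA_API_URL=https://api.taiga.io/api/v1
# Credentials used for token-based authentication
TAIGA_USERNAME=your-username
TAIGA_PASSWORD=your-password
```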
See RUN.md for detailed setup and MCP client configuration guides.
## 🛠️ Available Tools
The server exposes 10 tools through the MCP protocol:
### Authentication

| Tool | Description | Parameters |
|------|-------------|------------|
| `authenticate` | Authenticate with the Taiga API | `username` (optional), `password` (optional) |
### Project Management

| Tool | Description | Parameters |
|------|-------------|------------|
| `list_projects` | List all accessible projects | None |
| `get_project` | Get project details | `project_id` (ID or slug) |
| `list_project_members` | List project team members | `project_id` |
### User Story Management

| Tool | Description | Parameters |
|------|-------------|------------|
| `create_user_story` | Create a new user story | `project_id`, `subject`, `description`, `tags`, `points`* |
| `list_user_stories` | List stories with pagination | `project_id`, `page`, `page_size`, `fetch_all`* |
| `get_user_story` | Get story details | `story_id`, `project_id`* |
| `update_user_story` | Update an existing story | `story_id`, `version`, `subject`, `description`, `status`, `tags`, `points`, `due_date`, `assigned_to` |
### Task Management

| Tool | Description | Parameters |
|------|-------------|------------|
| `create_task` | Create a task in a story | `project_id`, `story_id`, `subject`, `description`, `status`, `assigned_to`* |
| `list_tasks` | List tasks for a story | `story_id`, `project_id`* |
\* = optional parameter
## 💬 Example Usage
Once configured with your LLM client, use natural language:
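A few illustrative prompts (exact phrasing is up to you; each maps onto one of the tools above):

```text
"List all my Taiga projects"
"Show me the team members of the mobile-app project"
"Create a user story 'Add dark mode' in mobile-app with 3 story points and the tag 'ui'"
"Update story #42: set its due date to 2025-07-01"
"Add a task 'Write unit tests' to story #42"
```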
## 🏗️ Architecture

### Tech Stack
| Component | Technology | Purpose |
|-----------|------------|---------|
| Protocol | MCP (Model Context Protocol) | LLM-tool communication |
| Language | Python 3.10+ | Core implementation |
| HTTP Client | `httpx` | Async Taiga API calls |
| Validation | Pydantic | Type-safe data models |
| Config | `pydantic-settings` | Environment management |
| Testing | `pytest` + `pytest-asyncio` | Test framework |
### Project Structure
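A plausible layout, inferred from the `app/` and `tests/` paths used by the quality-check commands below; the individual file and package names are assumptions:

```text
taiga-mcp-server/
├── app/
│   ├── server.py       # MCP server entry point and tool definitions
│   ├── services/       # Service layer wrapping the Taiga API
│   ├── models/         # Pydantic request/response models
│   └── exceptions.py   # Custom exception hierarchy
├── tests/              # Pytest test suite
├── .env                # Local configuration (not committed)
└── pyproject.toml
```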
### Design Patterns
#### 1. Async/Await Throughout

All I/O operations use Python's `async`/`await` for non-blocking execution:
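A minimal sketch of the pattern, assuming `httpx` as the async HTTP client; the function name and endpoint wiring are illustrative, not the project's actual API:

```python
import httpx

async def list_projects(token: str) -> list[dict]:
    """Fetch the projects visible to the authenticated user (illustrative)."""
    async with httpx.AsyncClient(base_url="https://api.taiga.io/api/v1") as client:
        # The event loop stays free while the request is in flight.
        response = await client.get(
            "/projects",
            headers={"Authorization": f"Bearer {token}"},
        )
        response.raise_for_status()
        return response.json()
```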
#### 2. Service Layer Pattern

Business logic is encapsulated in service classes:
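A sketch of what such a service might look like; the class name, the model's fields, and the endpoints shown are assumptions for illustration (Pydantic v2 API):

```python
import httpx
from pydantic import BaseModel

class Project(BaseModel):
    id: int
    name: str
    slug: str

class ProjectService:
    """Encapsulates project operations behind a small, testable interface."""

    def __init__(self, client: httpx.AsyncClient) -> None:
        self._client = client

    async def get_project(self, identifier: int | str) -> Project:
        # Flexible identifiers: accept a numeric ID or a project slug.
        if isinstance(identifier, int):
            response = await self._client.get(f"/projects/{identifier}")
        else:
            response = await self._client.get(
                "/projects/by_slug", params={"slug": identifier}
            )
        response.raise_for_status()
        return Project.model_validate(response.json())
```

Keeping Taiga-specific logic in services like this leaves the MCP tool handlers as thin adapters that are easy to test in isolation.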
#### 3. Pydantic Validation

All data is validated using Pydantic models:
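For example, an input model along these lines (the field set is illustrative; the syntax assumes Pydantic v2):

```python
from datetime import date
from pydantic import BaseModel, Field

class UserStoryCreate(BaseModel):
    """Validated input for the create-story tool (illustrative fields)."""

    project_id: int
    subject: str = Field(min_length=1, max_length=500)
    description: str | None = None
    tags: list[str] = Field(default_factory=list)
    points: float | None = Field(default=None, ge=0)
    due_date: date | None = None

# Invalid input raises a ValidationError before any API call is made:
story = UserStoryCreate(project_id=12, subject="Add dark mode", tags=["ui"])
```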
#### 4. Error Handling

A custom exception hierarchy enables precise error handling:
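A sketch of such a hierarchy; the class names and the status-code mapping are assumptions, not the project's actual exceptions:

```python
class TaigaError(Exception):
    """Base class for all Taiga API failures."""

class AuthenticationError(TaigaError):
    """Raised when login fails or a token cannot be refreshed."""

class ResourceNotFoundError(TaigaError):
    """Raised when a project, story, or task lookup returns 404."""

def raise_for_taiga_status(status_code: int) -> None:
    """Map HTTP status codes from the Taiga API onto the hierarchy."""
    if status_code == 401:
        raise AuthenticationError("Invalid or expired credentials")
    if status_code == 404:
        raise ResourceNotFoundError("Resource does not exist")
    if status_code >= 400:
        raise TaigaError(f"Unexpected API error: HTTP {status_code}")
```

Tool handlers can then catch `TaigaError` as a single fallback while handling the specific subclasses wherever a friendlier message is possible.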
## 🔧 Development
### Setup Development Environment
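A plausible setup, assuming development dependencies are declared as a `dev` extra (the extra's name and the fork URL are placeholders):

```bash
git clone <your-fork-url>
cd taiga-mcp-server
python -m venv .venv && source .venv/bin/activate
# Editable install with the development toolchain (black, ruff, mypy, pytest)
pip install -e ".[dev]"
```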
### Running Tests
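Typical invocations; the coverage flag assumes `pytest-cov` is installed:

```bash
pytest              # run the full suite
pytest -v           # verbose output
pytest --cov=app    # measure coverage for the app package
```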
### Code Quality Tools
| Tool | Purpose | Command |
|------|---------|---------|
| Black | Code formatting | `black app/ tests/` |
| Ruff | Fast linting | `ruff check app/ tests/` |
| Mypy | Type checking | `mypy app/` |
| Pytest | Testing | `pytest` |
## 🗺️ Roadmap

### Phase 1: Core Features ✅

- Authentication & token management
- Project listing and details
- User story CRUD operations
- Task management
- Team member listing
- Smart pagination
- Flexible identifiers (ID/slug/#ref)
### Phase 2: Enhanced Features 🚧

- Caching layer (Redis/in-memory)
- Rate limiting
- Bulk operations
- Epic support
- Sprint/milestone management
- Issue/bug tracking
- Wiki page integration
- File attachments
- Comments on stories/tasks
- Custom field support
- Activity history tracking
### Phase 3: Advanced Features 🎯

- Standalone CLI tool
- Analytics & reporting
- Data export/import
- Webhook support
- Notification integrations (Slack, email)
- Project templates
- Burndown charts
- Time tracking
## 🤝 Contributing
Contributions are welcome! Here's how to get started:
1. **Fork** the repository
2. **Create a feature branch**: `git checkout -b feature/amazing-feature`
3. **Make your changes**
4. **Add tests**: ensure coverage for new code
5. **Run quality checks**:
   ```bash
   black app/ tests/
   ruff check app/ tests/
   mypy app/
   pytest
   ```
6. **Commit your changes**: `git commit -m 'Add amazing feature'`
7. **Push to the branch**: `git push origin feature/amazing-feature`
8. **Open a Pull Request**
### Development Guidelines
- Follow the existing code style (Black formatting)
- Add type hints to all functions
- Write docstrings for public APIs
- Include tests for new features
- Update documentation as needed
## 📝 License

This project is licensed under the GNU General Public License v3.0; see the LICENSE file for details.
## 🙏 Acknowledgments

- **[Model Context Protocol](https://modelcontextprotocol.io)** - For the excellent LLM-tool integration standard
- **[Taiga](https://taiga.io)** - For the powerful open-source project management platform
- **Anthropic** - For Claude and the MCP SDK
- **Community Contributors** - For feedback and improvements
## 📞 Support

- **Documentation**: See RUN.md for setup guides
- **Issues**: GitHub Issues
- **Discussions**: GitHub Discussions
- **Taiga API Docs**: https://docs.taiga.io/api.html
Built with ❤️ for the AI-powered project management community
⭐ Star this repo if you find it useful!
## ⚠️ Disclaimer
This project is not officially affiliated with Taiga. It's a community-driven MCP server implementation for integrating Taiga with LLM applications.