taskbeacon-mcp
A Model Context Protocol (MCP) server for taskbeacon.
Overview
taskbeacon-mcp is a lightweight FastMCP server that lets a language model clone, transform, download, and localize taskbeacon task templates through a single entry-point tool.
This README provides instructions for setting up and using taskbeacon-mcp in different environments.
1 · Quick Start (Recommended)
The easiest way to use taskbeacon-mcp is with uvx. This tool automatically downloads the package from PyPI, installs it and its dependencies into a temporary virtual environment, and runs it in a single step. No manual cloning or setup is required.
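For example, assuming the package is published on PyPI as `taskbeacon-mcp`, the server can be started with a single command:

```bash
uvx taskbeacon-mcp
```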
1.1 · Prerequisites
Ensure you have uvx installed. If not, you can install it with pip:
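uvx is distributed with the uv package, so a standard pip install is enough:

```bash
pip install uv
```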
1.2 · LLM Tool Configuration (JSON)
To integrate taskbeacon-mcp with your LLM tool (like Gemini CLI or Cursor), use the following JSON configuration. This tells the tool how to run the server using uvx.
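A configuration along the following lines should work. The field layout mirrors the development example in section 2.2, and the PyPI package name (`taskbeacon-mcp`) is assumed here:

```json
{
  "name": "taskbeacon-mcp",
  "type": "stdio",
  "description": "taskbeacon task operations (template discovery, transformation, localization).",
  "isActive": true,
  "command": "uvx",
  "args": ["taskbeacon-mcp"]
}
```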
With this setup, the LLM can now use the taskbeacon-mcp tools.
2 · Manual Setup (For Developers)
This method is for developers who want to modify or contribute to the taskbeacon-mcp source code.
2.1 · Environment Setup
Create a virtual environment and install dependencies. This project uses uv; make sure you are in the project root directory.

```bash
# Create and activate the virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows, use: .venv\Scripts\activate

# Install dependencies in editable mode
pip install -e .
```
2.2 · Running Locally (StdIO)
This is the standard mode for local development, where the server communicates over STDIN/STDOUT.
Launch the server:
```bash
python taskbeacon_mcp/main.py
```

LLM Tool Configuration (JSON): To use your local development server with an LLM tool, use the following configuration. Note that you should replace the example path in `args` with the absolute path to the `main.py` file on your machine.

```json
{
  "name": "taskbeacon-mcp_dev",
  "type": "stdio",
  "description": "Local development server for taskbeacon task operations.",
  "isActive": true,
  "command": "python",
  "args": ["path\\to\\taskbeacon_mcp\\main.py"]
}
```
2.3 · Running as a Persistent Server (SSE)
For a persistent, stateful server, you can run taskbeacon-mcp using Server-Sent Events (SSE). This is ideal for production or when multiple clients need to interact with the same server instance.
Modify the transport: In `taskbeacon-mcp/main.py`, change the last line from `mcp.run(transport="stdio")` to:

```python
mcp.run(transport="sse", port=8000)
```
Run the server:
```bash
python taskbeacon-mcp/main.py
```

The server will now be accessible at `http://localhost:8000/mcp`.

LLM Tool Configuration (JSON): To connect an LLM tool to the running SSE server, use a configuration like this:

```json
{
  "name": "taskbeacon-mcp_sse",
  "type": "http",
  "description": "Persistent SSE server for taskbeacon task operations.",
  "isActive": true,
  "endpoint": "http://localhost:8000/mcp"
}
```
3 · Conceptual Workflow
1. User describes the task they want (e.g. “Make a Stroop out of Flanker”).
2. LLM calls the `build_task` tool (a script-level sketch of this call follows the list):
   - If the model already knows the best starting template, it passes `source_task`.
   - Otherwise it omits `source_task`, receives a menu created by `choose_template_prompt`, picks a repo, then calls `build_task` again with that repo.
3. The server clones the chosen template and returns a Stage 0→5 instruction prompt (`transform_prompt`) plus the local template path.
4. The LLM edits files locally, optionally invokes `localize` to translate and adapt `config.yaml`, then zips / commits the new task.
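For reference, the `build_task` call in step 2 can also be exercised directly from a script. The snippet below is a minimal sketch, assuming FastMCP's Python client and illustrative template names; the server path follows the local-development layout from section 2.2.

```python
# Minimal sketch: drive build_task over stdio with the FastMCP Python client.
# The template names "flanker" and "stroop" are illustrative placeholders.
import asyncio
from fastmcp import Client

async def main():
    # Spawn the local server script over stdio and open a session.
    async with Client("taskbeacon_mcp/main.py") as client:
        result = await client.call_tool(
            "build_task",
            {"source_task": "flanker", "target_task": "stroop"},
        )
        # Expected to contain the Stage 0→5 transform prompt and the clone path.
        print(result)

asyncio.run(main())
```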
4 · Exposed Tools
| Tool | Arguments | Purpose / Return |
| --- | --- | --- |
| `build_task` | `source_task`, `target_task` | Main entry-point. • With `source_task` → clones the repo and returns the `transform_prompt` (Stage 0→5) plus the local clone path. • Without `source_task` → returns the template menu from `choose_template_prompt` so the LLM can pick the best starting template, then call `build_task` again. |
|  | none | Returns an array of objects, one per template repo, each listing up to 20 branch names for that repo. |
|  |  | Clones any template repo from the registry and returns its local path. |
| `localize` |  | Reads `config.yaml`, wraps it in the localization prompt, and returns the result. If a voice is not provided, it first calls the voice-listing tool to find suitable options. Also deletes the old files. |
|  |  | Returns a human-readable string of available text-to-speech voices, optionally filtered by language (e.g., "ja", "en"). |
5 · Exposed Prompts
| Prompt | Parameters | Description |
| --- | --- | --- |
| `transform_prompt` | `source_task`, `target_task` | Single User message containing the full Stage 0→5 instructions to convert `source_task` into `target_task`. |
| `choose_template_prompt` |  | Three User messages: task description, template list, and selection criteria. The LLM must reply with one repo name or a single literal keyword indicating that no template fits. |
|  |  | Two-message sequence: strict translation instruction + raw YAML. The LLM must return the fully-translated YAML body, adding the voice setting if suitable options were provided. |