Provides tools for interacting with Jira, including support for initialization and configuration of Jira CLI settings for authentication and project access.
1. Click on "Install Server".
2. Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
3. In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@MCP Template show me the available tools in this template".
4. That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
MCP Template
MY_MCP_DESCRIPTION
Template Setup
Remove the Template Setup section after you have replaced all the variables.
Find All and Replace these variables throughout the project:
| Variable | Description | Example |
| --- | --- | --- |
| **General placeholders** | | |
| `MY_MCP_NAME` | Human-readable name | |
| `MY_MCP_DESCRIPTION` | One-liner description | |
| | Full GitHub repo URL | |
| `MY_MCP_PACKAGE_NAME` | Package name for CLI/binaries | |
| | Key in mcpServers JSON config | |
| | Prefix for environment variables | |
| | Author name for LICENSE | |
See more under
Installation
Install MY_MCP_NAME
MY_MCP_NAME requires no additional dependencies to be installed.
TODO: Add any additional installation instructions for your MCP server. For example, many MCP servers rely on an underlying CLI tool that must be installed separately.
MCP Server: Option 1: Download binaries (Recommended)
Download the latest release for your operating system from the Releases page.
| Operating System | Binary |
| --- | --- |
| Linux | `MY_MCP_PACKAGE_NAME-linux` |
| Windows | `MY_MCP_PACKAGE_NAME-windows.exe` |
| macOS (Apple Silicon) | |
| macOS (Intel) | |
Linux
Add to your LLM client configuration:
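For example (the `my-mcp-server` key is a placeholder for your server's name under `mcpServers`):

```json
{
  "mcpServers": {
    "my-mcp-server": {
      "command": "/usr/local/bin/MY_MCP_PACKAGE_NAME-linux",
      "args": []
    }
  }
}
```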
NOTE: Make sure to replace /usr/local/bin/MY_MCP_PACKAGE_NAME-linux with
the path to the binary on your machine if you moved it to a different location.
macOS
Note: macOS may block the binary on first run. If you see a security warning, go to System Settings > Privacy & Security and click Allow Anyway, or run:
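The command below is the standard macOS quarantine-removal workaround; adjust the path if your binary lives elsewhere:

```bash
# Remove the quarantine attribute macOS applies to downloaded binaries
xattr -d com.apple.quarantine /usr/local/bin/MY_MCP_PACKAGE_NAME
```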
Add to your LLM client configuration:
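For example (again, `my-mcp-server` is a placeholder key):

```json
{
  "mcpServers": {
    "my-mcp-server": {
      "command": "/usr/local/bin/MY_MCP_PACKAGE_NAME",
      "args": []
    }
  }
}
```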
NOTE: Make sure to replace /usr/local/bin/MY_MCP_PACKAGE_NAME with the
path to the binary on your machine if you moved it to a different location.
Windows
1. Download `MY_MCP_PACKAGE_NAME-windows.exe` from the Releases page.
2. Move the executable to a convenient location (e.g., `C:\Program Files\MY_MCP_PACKAGE_NAME\`).
3. Add to your LLM client configuration:
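A sketch of a typical entry (the `my-mcp-server` key is a placeholder; note the escaped backslashes required in JSON):

```json
{
  "mcpServers": {
    "my-mcp-server": {
      "command": "C:\\Program Files\\MY_MCP_PACKAGE_NAME\\MY_MCP_PACKAGE_NAME-windows.exe",
      "args": []
    }
  }
}
```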
NOTE: Make sure to replace C:\\Program
Files\\MY_MCP_PACKAGE_NAME\\MY_MCP_PACKAGE_NAME-windows.exe with the path to
the binary on your machine if you moved it to a different location.
MCP Server: Option 2: Development setup with uv
Get repo:
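A minimal sketch (the repository URL and directory name below are placeholders):

```bash
# Clone the repository and install dependencies with uv
git clone https://github.com/OWNER/REPO.git
cd REPO
uv sync
```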
Add MCP server to your choice of LLM client:
NOTE: You will need to look up how to add MCP servers for your specific client.
Usually the JSON file for the LLM client will look like this:
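A sketch of a typical entry (the `my-mcp-server` key is a placeholder for your server's configured name):

```json
{
  "mcpServers": {
    "my-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "/ABSOLUTE/PATH/TO/REPO",
        "run",
        "python",
        "-m",
        "src.main"
      ]
    }
  }
}
```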
This tells your LLM client application that there is a server it can launch by running `uv --directory /ABSOLUTE/PATH/TO/REPO run python -m src.main`.
Install uv: https://docs.astral.sh/uv/getting-started/installation/
MCP Server: Option 3: Install globally with pipx
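The exact steps depend on how the package is published; a sketch, assuming pipx can install it and expose a `MY_MCP_PACKAGE_NAME` entry point:

```bash
# Install the server into an isolated environment managed by pipx
pipx install MY_MCP_PACKAGE_NAME

# Or install directly from the repository (URL is a placeholder)
pipx install git+https://github.com/OWNER/REPO.git
```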
How it works
1. You enter a question or prompt in an LLM client such as Claude Desktop, Cursor, Windsurf, or ChatGPT.
2. The client sends your question to the LLM (e.g., Sonnet, Grok, GPT).
3. The LLM analyzes the available tools and decides which one(s) to use.
   - The LLM has context about the available tools and what each tool is for, described in human language.
   - Without MCP, you could instead include the endpoints and a description of each one in the prompt for the LLM to "call on", then copy and paste the resulting commands into a terminal on your machine. MCP provides a more deterministic, standardized way to handle LLM-to-server interactions.
4. The client executes the chosen tool(s) through the MCP server (see the example request and response below).
   - The MCP server either runs locally on your machine or is hosted remotely at an endpoint.
5. The results are sent back to the LLM.
6. The LLM formulates a natural-language response, and one or both of the following happen:
   - The response is displayed to you with data from the MCP server.
   - Some action is performed using the MCP server.
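For illustration, a single tool invocation between client and server is a JSON-RPC exchange roughly like the following (the tool name and arguments are made up for this example). Request:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_issues",
    "arguments": { "query": "open bugs assigned to me" }
  }
}
```

And the server's response:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [
      { "type": "text", "text": "Found 3 matching issues: ..." }
    ],
    "isError": false
  }
}
```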
Development
Testing In Cursor/Windsurf
To test the MCP server in Cursor/Windsurf, add it to the client's MCP configuration using the same `mcpServers` entry shown in Option 2 (development setup with uv).
This will start the MCP server, and you can then use it from Cursor/Windsurf. You may need to restart the server from the MCP settings to pick up changes.
CI/CD
Setup for CI/CD to build and release the MCP server on multiple operating systems and architectures.
- The `.github/workflows/ci.yml` file runs the tests and linting checks.
- The `.github/workflows/release.yml` file builds and releases the MCP server for multiple operating systems and architectures.
- Tag the release with the format `vX.X.X`. The release will be built and published to the Releases page of your GitHub repository.
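For example, to cut a release (the version number is illustrative):

```bash
# Tag the current commit and push the tag to trigger the release workflow
git tag v1.0.0
git push origin v1.0.0
```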
Formatting
This project uses ruff for linting and
formatting. The .pre-commit-config.yaml file is used to run the linting and
formatting checks before each commit.
To set up pre-commit hooks:
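This assumes the pre-commit tool itself is available (e.g., installed with pipx or as a project dev dependency):

```bash
# Install the git hooks defined in .pre-commit-config.yaml
pre-commit install
```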
Once installed, ruff will automatically run when you commit. To run checks manually on all files:
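For example:

```bash
pre-commit run --all-files
```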
NOTE: A developer can skip installing the pre-commit hooks and formatting checks, but the CI/CD workflow will fail if the checks do not pass.
Tests
This project uses pytest for testing. The tests
directory is used to store the test files.
To run the tests:
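For example, via uv (assuming pytest is declared as a project dependency):

```bash
uv run pytest
```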
To run the tests with coverage:
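Assuming the pytest-cov plugin is installed and the package lives under `src/`:

```bash
uv run pytest --cov=src --cov-report=term-missing
```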
Logging
Do not use print statements for logging. Use the logging module instead.
Writing to stdout will corrupt the JSON-RPC messages and break your server.
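A minimal sketch of stderr-only logging for a stdio-based MCP server (the message and format here are illustrative):

```python
import logging
import sys

# Send all log output to stderr; stdout must stay reserved for JSON-RPC messages
logging.basicConfig(
    stream=sys.stderr,
    level=logging.INFO,
    format="%(asctime)s %(name)s %(levelname)s %(message)s",
)

logger = logging.getLogger(__name__)
logger.info("MCP server starting")  # written to stderr, never stdout
```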
Pre-commit
This project uses pre-commit to run the ruff linting and formatting checks and the pytest tests before each commit.
Setup is the same as in the Formatting section above: install the hooks with `pre-commit install`.
Once installed, ruff and pytest run automatically when you commit; to run all checks manually, use `pre-commit run --all-files`.
Docstrings / Tool decorator parameters
The MCP tool decorator parameters and docstrings are especially important: they are the human-readable text the LLM has in its context. This text is treated as part of the prompt fed to the LLM and determines when each tool is used.
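For illustration, here is what that looks like with the FastMCP helper from the official MCP Python SDK; the tool itself (a Jira search) is a made-up example, and this project's real server module may be structured differently:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("MY_MCP_NAME")


@mcp.tool(description="Search Jira issues matching a JQL query and return a short text summary.")
def search_issues(jql: str, limit: int = 10) -> str:
    """Search Jira issues with a JQL query.

    The LLM reads this docstring and the decorator's description when deciding
    whether to call the tool, so state clearly when it should be used and what
    each parameter means.
    """
    # Hypothetical implementation: call the Jira CLI/API and summarize the results.
    return f"Searched Jira with JQL: {jql!r} (limit {limit})"


if __name__ == "__main__":
    mcp.run()  # defaults to the stdio transport
```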
Architecture
MCP follows a client-server architecture where an MCP host (an AI application like Cursor or ChatGPT desktop) establishes connections to one or more MCP servers. The MCP host accomplishes this by creating one MCP client for each MCP server. Each MCP client maintains a dedicated connection with its corresponding MCP server.
https://modelcontextprotocol.io/docs/learn/architecture
Pitfalls / Troubleshooting
Edit the jira-cli config file
On macOS:
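The path below is jira-cli's usual default config location; adjust the path (and editor) if you configured jira-cli differently:

```bash
# Open jira-cli's config file in an editor
nano ~/.config/.jira/.config.yml
```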
404 error when using jira init
If you get a 404 error when using `jira init`, you may need to edit the jira-cli config file to point at the correct Jira instance. There are only three possible values for the auth type, so try each one: `basic`, `password`, or `bearer`.
Environment Variables
Make sure to set any required environment variables. Copy env.example to
.env and fill in the values:
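For example:

```bash
cp env.example .env
```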
Server Not Starting
If the MCP server is not starting, check:
- You have Python 3.12+ installed
- All dependencies are installed (`uv sync`)
- Environment variables are set correctly