netlinq-jenkins-mcp
Allows triggering Jenkins pipelines (NetLinQ EMS Release and Patch Single Repository), checking build status, listing recent builds, and tailing build logs.
A small Python service that wraps your private Jenkins controller and lets a team
trigger the NetLinQ EMS Release pipeline and Patch Single Repository Pipeline jobs
through natural language. One codebase, two run modes:
- MCP server (stdio): plug into Cursor on your laptop and ask "build 7.0 release package" or "rebuild blinq-ems-charts at tag 7.0.3".
- FastAPI web app + chat UI: one-command `docker compose up` on an internal server; the whole team logs in via a browser and gets the same tools.
Hosting note: GitHub-hosted runners cannot reach a private Jenkins. The code lives in a private GitHub repo; the runtime runs wherever it has a network path to Jenkins (a teammate's laptop with VPN, or an internal Linux VM).
Architecture
```mermaid
flowchart LR
    subgraph github [Private GitHub Repo]
        repo[netlinq-jenkins-mcp]
    end
    subgraph local [Local laptop - DevOps user]
        cursor[Cursor IDE]
        mcp["FastMCP stdio server<br/>mcp_server.py"]
        cursor -->|stdio| mcp
    end
    subgraph shared [Internal VM - team]
        web["FastAPI web app<br/>web.py + Vite UI"]
        chat["Chat UI - browser"]
        chat -->|HTTPS basic auth| web
    end
    subgraph core [Shared Python core]
        tools["tools.py<br/>5 tool functions"]
        llm["llm.py<br/>LiteLLM router"]
        jc["jenkins_client.py<br/>httpx + crumb"]
    end
    repo -.git clone.-> local
    repo -.git clone.-> shared
    mcp --> tools
    web --> llm
    web --> tools
    llm -->|"tool calls"| tools
    tools --> jc
    jc -->|REST + basic auth| jenkins[(Jenkins<br/>private network)]
```

tools.py is the single source of truth. Both the MCP server and the LiteLLM agent in the web app call into the same five functions, so behavior is identical between Cursor and the team chat UI.
The five tools:

| Tool | What it does |
| --- | --- |
| Release trigger | Queues the NetLinQ EMS Release pipeline |
| Patch trigger | Queues the Patch Single Repository pipeline |
| Build status | Latest or specific build's result, duration, parameters |
| Build list | History (newest first) |
| Log tail | Last N lines of console output |
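As a rough illustration of what the two trigger tools do under the hood: Jenkins queues a parameterized build when you POST to the job's `buildWithParameters` endpoint. The real client uses httpx and handles the CSRF crumb; `build_trigger_url` below is a hypothetical helper, not project code:

```python
from urllib.parse import quote, urlencode

def build_trigger_url(base_url: str, job_name: str, params: dict) -> str:
    """Construct the buildWithParameters URL for a parameterized Jenkins job."""
    job_path = "/job/" + quote(job_name)  # job names may contain spaces
    query = urlencode(params)
    return f"{base_url.rstrip('/')}{job_path}/buildWithParameters?{query}"

url = build_trigger_url(
    "https://jenkins.internal.example.com",
    "NetLinQ EMS Release pipeline",
    {"VERSION": "7.0"},
)
# -> https://jenkins.internal.example.com/job/NetLinQ%20EMS%20Release%20pipeline/buildWithParameters?VERSION=7.0
```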
Quickstart - MCP in Cursor
Full walkthrough: docs/CURSOR_MCP.md. Short version:
1. Generate a Jenkins API token at `<JENKINS_URL>/me/configure` -> Add new Token.
2. Install uv: `pipx install uv`
3. Edit `~/.cursor/mcp.json` (Windows: `%USERPROFILE%\.cursor\mcp.json`):

```json
{
  "mcpServers": {
    "netlinq-jenkins": {
      "command": "uvx",
      "args": [
        "--from",
        "git+ssh://git@github.com/<your-org>/netlinq-jenkins-mcp.git@main",
        "netlinq-jenkins-mcp"
      ],
      "env": {
        "JENKINS_URL": "https://jenkins.internal.example.com",
        "JENKINS_USER": "your-user",
        "JENKINS_TOKEN": "your-api-token"
      }
    }
  }
}
```

4. Restart Cursor. Look for the green dot next to netlinq-jenkins in Settings -> MCP.
5. In the chat, try: "build 7.0 release package". The agent will confirm before actually triggering Jenkins.
Quickstart - team chat UI (Docker)
```bash
git clone git@github.com:<your-org>/netlinq-jenkins-mcp.git
cd netlinq-jenkins-mcp
cp .env.example .env
# edit .env: JENKINS_*, LLM_*, WEB_USERS

# Create at least one web user. The hash MUST be bcrypt-hashed.
python -c "from passlib.hash import bcrypt; print('alice:' + bcrypt.hash('secret123'))"
# paste the line into WEB_USERS=

docker compose up -d --build
# browse http://<host>:8000 - log in with alice / secret123
```

What the team sees:
- Chat input at the bottom, conversation transcript in the middle.
- Live "recent builds" panels for both pipelines on the right, polled every 5s.
- Tool-call cards expand inline so people can see exactly what the bot is doing.
- A "Reset" button on the header clears the agent's memory.
Local dev (no Docker)
```bash
# Python side
python -m venv .venv
.\.venv\Scripts\Activate.ps1      # PowerShell
# or: source .venv/bin/activate   # bash
pip install -e ".[dev]"

# Frontend side (only needed for the web mode)
cd ui
npm install
npm run build   # writes ui/dist/, which web.py auto-serves
cd ..

# Run the web app
netlinq-jenkins-web
# or, with auto-reload:
uvicorn netlinq_jenkins.web:create_app --factory --reload --port 8000

# Or run as MCP (stdio - the way Cursor will spawn it)
netlinq-jenkins-mcp

# Run tests
pytest
```

Configuration reference
All settings come from environment variables (or a .env file).
See .env.example for the canonical list.
Variable names and several defaults were lost in this export; cells left blank here are unrecoverable.

| Variable | Default | Purpose |
| --- | --- | --- |
| `JENKINS_URL` | required | Base URL of the Jenkins controller |
| `JENKINS_USER` | required | Service-account username |
| `JENKINS_TOKEN` | required | API token (preferred) or password |
| | empty | Path to a CA bundle for self-signed TLS |
| | | Release pipeline job name; override if your job is named differently |
| | `VERSION` | The job's version-parameter name |
| | | Patch pipeline name |
| | `REPO` | Patch pipeline repo-parameter name |
| | `TAG` | Patch pipeline tag-parameter name |
| | | LLM model; informational, LiteLLM picks based on it |
| | - | Provider key (web mode only) |
| | - | For Azure / Ollama / self-hosted endpoints |
| | | FastAPI bind host |
| | | FastAPI bind port |
| `WEB_USERS` | empty | `user:bcrypt-hash` entries for HTTP Basic auth (see quickstart) |
| `WEB_API_SHARED_SECRET` | - | Optional; see Security notes |
| `AUDIT_LOG_PATH` | | JSONL file every tool call is appended to |
Discovering Jenkins parameter names
If VERSION / REPO / TAG aren't your real parameter names, ask Jenkins:
```bash
curl -s -u "$JENKINS_USER:$JENKINS_TOKEN" \
  "$JENKINS_URL/job/NetLinQ%20EMS%20Release%20pipeline/api/json?tree=property[parameterDefinitions[name,type,defaultParameterValue[value]]]" \
  | jq
```

Then override the relevant *_PARAM env var in .env or in your Cursor mcp.json.
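If you prefer Python to curl + jq, the same payload can be walked with the standard library. `parameter_names` is an illustrative helper assuming the usual Jenkins `api/json` shape (a `property` list containing `parameterDefinitions`):

```python
import json

def parameter_names(job_json: dict) -> list[str]:
    """Extract parameter names from a Jenkins job's api/json payload."""
    names = []
    for prop in job_json.get("property", []):
        for pd in prop.get("parameterDefinitions", []):
            names.append(pd["name"])
    return names

# Abbreviated sample of what Jenkins returns for a parameterized job:
sample = json.loads(
    '{"property": [{"parameterDefinitions": '
    '[{"name": "VERSION"}, {"name": "DRY_RUN"}]}]}'
)
parameter_names(sample)  # -> ['VERSION', 'DRY_RUN']
```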
LLM provider tips
The web mode uses LiteLLM so you can swap providers purely by env var. Common combos:
The model strings and key variable names were lost in this export; blank cells are unrecoverable.

| Provider | Model | API key | API base |
| --- | --- | --- | --- |
| OpenAI | | | - |
| Anthropic | | | - |
| Azure OpenAI | | Azure key | |
| Ollama (local) | | - | |
| OpenAI-compatible | | key | |
The MCP-in-Cursor mode does not need any of this - Cursor's own model drives the conversation and just calls our tools.
Security notes
- `.env` is git-ignored; secrets never leave the host.
- Web mode requires HTTP Basic auth (bcrypt-hashed in `WEB_USERS`).
- Optional `WEB_API_SHARED_SECRET` adds a header-based second factor on `/api/*`, meant for "behind a reverse proxy" deployments.
- No inbound internet traffic is required; the app only reaches out to Jenkins.
- API tokens are preferred over passwords: tokens skip the CSRF crumb dance and are easier to revoke.
- Inputs are validated with strict regexes (`version`, `repo`, `tag`) before any HTTP call goes out, so a chatty LLM cannot smuggle shell metacharacters.
- Every tool invocation is appended to the audit log (see below).
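A minimal sketch of that validation idea; the patterns here are illustrative, not the ones in `tools.py`:

```python
import re

# Illustrative allow-list patterns - the real ones in tools.py may differ.
_VERSION_RE = re.compile(r"^\d+\.\d+(\.\d+)?$")               # e.g. 7.0 or 7.0.3
_NAME_RE = re.compile(r"^[A-Za-z0-9][A-Za-z0-9._-]{0,99}$")   # repo / tag values

def validate_version(version: str) -> str:
    """Reject anything that is not a plain dotted version string."""
    if not _VERSION_RE.fullmatch(version):
        raise ValueError(f"invalid version: {version!r}")
    return version

def validate_name(value: str) -> str:
    """Reject repo/tag values containing shell or URL metacharacters."""
    if not _NAME_RE.fullmatch(value):
        raise ValueError(f"invalid value: {value!r}")
    return value

validate_version("7.0.3")          # ok
validate_name("blinq-ems-charts")  # ok
# validate_version("7.0; rm -rf /")  # raises ValueError
```

Allow-listing with `fullmatch` is stricter than trying to blacklist dangerous characters: anything outside the expected shape is rejected before a request is built.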
Audit log
Every successful trigger writes a JSONL line to ${AUDIT_LOG_PATH}:
```json
{"ts": "2026-05-06T20:30:11+00:00", "event": "trigger",
 "pipeline": "NetLinQ EMS Release pipeline",
 "parameters": {"VERSION": "7.0"},
 "queue_url": "https://jenkins.internal.example.com/queue/item/812/"}
```

In Docker mode the file is bind-mounted at ./logs/audit.jsonl on the host.
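Writing such a record takes only the standard library. `append_audit` below is a hypothetical sketch of the idea, not the project's actual implementation:

```python
import io
import json
from datetime import datetime, timezone

def append_audit(fp, pipeline: str, parameters: dict, queue_url: str) -> None:
    """Append one JSONL audit record for a successful trigger."""
    record = {
        "ts": datetime.now(timezone.utc).isoformat(timespec="seconds"),
        "event": "trigger",
        "pipeline": pipeline,
        "parameters": parameters,
        "queue_url": queue_url,
    }
    fp.write(json.dumps(record) + "\n")  # one JSON object per line

# In practice fp would be the file at ${AUDIT_LOG_PATH}, opened in append mode.
buf = io.StringIO()
append_audit(buf, "NetLinQ EMS Release pipeline", {"VERSION": "7.0"},
             "https://jenkins.internal.example.com/queue/item/812/")
```

JSONL keeps appends atomic enough for a single-process service and stays trivially greppable.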
Project layout
```
netlinq-jenkins-mcp/
├── src/netlinq_jenkins/
│   ├── config.py            # pydantic-settings
│   ├── jenkins_client.py    # async httpx wrapper, crumb handling
│   ├── tools.py             # 5 tool functions, used by both modes
│   ├── llm.py               # LiteLLM tool-calling agent (web mode only)
│   ├── mcp_server.py        # FastMCP stdio entrypoint (Cursor)
│   └── web.py               # FastAPI app + serves the bundled UI
├── ui/                      # Vite + React + Tailwind chat UI
│   ├── src/App.tsx          # main chat layout
│   └── src/components/      # ToolCard, BuildsPanel
├── tests/                   # pytest + pytest-httpx
├── docs/CURSOR_MCP.md       # detailed Cursor integration guide
├── examples/cursor-mcp.json
├── Dockerfile               # multi-stage: builds UI, then Python wheel
├── docker-compose.yml
├── .env.example
└── pyproject.toml
```

Roadmap / next steps
- OIDC / SSO instead of HTTP Basic for the web UI.
- Slack-bot adapter that forwards `/build 7.0` slash commands into the same tools.
- Optional read-only mode (`READ_ONLY=true`) that disables the trigger tools.
- WebSocket log tail in the UI instead of the polling sidebar.
- Per-user audit log instead of one global file.