Azure HPC/AI MCP Server

by edwardsp

A minimal Model Context Protocol (MCP) server tailored for Azure HPC/AI clusters. It provides tools that query Kubernetes for GPU node information using kubectl. The server is implemented with fastmcp and uses synchronous subprocess calls (no asyncio).

Tools

  • list_nodes: Lists nodes in the GPU pool with name, labels, and Ready/NotReady status.

  • get_node_topologies: Returns InfiniBand-related topology labels per node: agentpool, pkey, torset.

Both tools shell out to kubectl and return JSON-serializable Python structures (lists of dicts).
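As a rough sketch of how such a tool can work (hypothetical helper names; the real server's label selector and field handling may differ), the name, labels, and Ready status can be extracted from `kubectl get nodes -o json` output like this:

```python
import json
import subprocess

def parse_gpu_nodes(nodes_json):
    """Extract name, labels, and Ready/NotReady status from
    `kubectl get nodes -o json` output (hypothetical helper)."""
    result = []
    for item in nodes_json.get("items", []):
        conditions = item.get("status", {}).get("conditions", [])
        ready = any(
            c["type"] == "Ready" and c["status"] == "True" for c in conditions
        )
        result.append({
            "name": item["metadata"]["name"],
            "labels": item["metadata"].get("labels", {}),
            "status": "Ready" if ready else "NotReady",
        })
    return result

def list_nodes_sketch(selector="agentpool=gpu"):
    """Shell out to kubectl synchronously (no asyncio), as the server does.
    The label selector here is an assumption, not the server's actual value."""
    out = subprocess.run(
        ["kubectl", "get", "nodes", "-l", selector, "-o", "json"],
        capture_output=True, text=True, check=True,
    )
    return parse_gpu_nodes(json.loads(out.stdout))
```

Because the return value is a list of dicts built from plain Python types, it is JSON-serializable as-is.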

Run the server

Prerequisites:

  • Python 3.10+

  • kubectl configured to access your cluster

Installation

It’s recommended to use a virtual environment.

Create and activate a venv (Linux/macOS):

python3 -m venv .venv
source .venv/bin/activate
python -m pip install -U pip

Install dependencies with pip:

pip install -r requirements.txt

Notes:

  • fastmcp is required to run the server and is installed via requirements.txt. Tests don’t need it (they stub it).

  • If fastmcp isn’t on PyPI for your environment, install it from its source per its documentation.

Run:

python server.py

The server runs over stdio for MCP hosts. You can connect to it with an MCP-compatible client or call the tools locally with the helper script below.
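As an illustration, an MCP host that launches stdio servers from a JSON configuration might register it like this (the paths and the exact configuration schema are placeholders; consult your MCP client's documentation):

```json
{
  "mcpServers": {
    "azure-hpc-ai": {
      "command": "/path/to/.venv/bin/python",
      "args": ["/path/to/server.py"]
    }
  }
}
```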

invoke_local helper

The invoke_local.py script lets you execute server tools in-process without an MCP host. It discovers exported tools from server.py, calls them synchronously, and prints pretty JSON.

Examples:

# List nodes
python invoke_local.py list_nodes

# Get IB topology labels
python invoke_local.py get_node_topologies

# Passing parameters (if a tool defines any):
python invoke_local.py some_tool --params '{"key":"value"}'

Implementation notes:

  • No asyncio is used; tool functions call subprocess.run directly and return plain Python data.

  • The script unwraps plain function tools as well as FastMCP FunctionTool-like wrappers, and invokes them with keyword arguments parsed from --params when provided.
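The unwrapping step above can be sketched roughly as follows (this assumes the wrapper exposes the underlying callable via an `fn` attribute; the real script's attribute handling may differ):

```python
import json

def unwrap_tool(tool):
    """Return the plain callable behind a tool export (sketch).
    FunctionTool-like wrappers are assumed to expose the wrapped
    function via an `fn` attribute; plain functions pass through."""
    if callable(tool) and not hasattr(tool, "fn"):
        return tool
    return getattr(tool, "fn", tool)

def invoke(tool, params_json=None):
    """Call a tool synchronously with kwargs parsed from a
    --params JSON string, and return its result."""
    kwargs = json.loads(params_json) if params_json else {}
    return unwrap_tool(tool)(**kwargs)
```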

Tests

The tests are written with pytest and exercise success and error paths without requiring a cluster.

Key points:

  • subprocess-based: Tests monkeypatch subprocess.run to simulate kubectl output and errors; neither the code nor the tests use asyncio.

  • fastmcp-free: Tests inject a lightweight dummy FastMCP module so importing server.py does not require the real dependency.

  • Coverage: Both tools are validated for JSON parsing, Ready condition handling, missing labels, and kubectl failures.
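The dummy-module injection can be sketched like so (assuming server.py uses `from fastmcp import FastMCP` and the `@mcp.tool` decorator; the project's actual test fixture may differ):

```python
import sys
import types

def install_dummy_fastmcp():
    """Register a stub `fastmcp` module in sys.modules so that
    importing server.py does not require the real dependency."""
    dummy = types.ModuleType("fastmcp")

    class FastMCP:
        def __init__(self, name, **kwargs):
            self.name = name

        def tool(self, fn=None, **kwargs):
            # Pass-through decorator: tools stay plain functions,
            # usable both as @mcp.tool and as @mcp.tool(...).
            if fn is not None:
                return fn
            return lambda f: f

        def run(self, *args, **kwargs):
            pass  # no-op: tests never start the stdio loop

    dummy.FastMCP = FastMCP
    sys.modules["fastmcp"] = dummy
    return dummy
```

Installing the stub before the import means `import server` resolves `fastmcp` to the dummy, and the decorated tools remain directly callable in tests.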

Run tests:

python -m pip install -U pytest
pytest -q

Troubleshooting

  • kubectl not found: Ensure kubectl is installed and on PATH for real runs. Tests do not require it.

  • No nodes returned: Confirm your label selectors match your cluster (tools currently expect GPU/IB labels used in Azure HPC/AI pools).

  • fastmcp import error: Install fastmcp for runtime; tests provide a dummy stub so you can run pytest without it.

