Windows MCP Server

This project provides a Windows MCP (Model Context Protocol) server that exposes useful system information and control tools for Windows environments to your AI applications.

What’s new

  • More robust drive and uptime implementations (no brittle PowerShell parsing).

  • Structured results for top processes (list of objects with pid, name, cpu_percent, memoryMB).

  • Structured results for memory and network information.

  • Safer PowerShell usage with JSON parsing for GPU info.
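
As a rough illustration of the JSON-based approach to GPU info, the query might look like the sketch below. This is a hedged example, not the project's actual code; the function name and output format are assumptions.

import json
import subprocess

def gpu_info() -> str:
    """Query GPU name(s) and driver versions via PowerShell, parsing JSON instead of free-form text."""
    command = (
        "Get-CimInstance Win32_VideoController | "
        "Select-Object Name, DriverVersion | ConvertTo-Json"
    )
    output = subprocess.run(
        ["powershell", "-NoProfile", "-Command", command],
        capture_output=True, text=True, check=True,
    ).stdout
    gpus = json.loads(output)
    if isinstance(gpus, dict):  # ConvertTo-Json returns a single object when there is only one GPU
        gpus = [gpus]
    return "\n".join(f"{g['Name']} (driver {g['DriverVersion']})" for g in gpus)

Parsing ConvertTo-Json output avoids the brittle column-splitting that plain-text PowerShell output would require.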

Features

  • System info (OS, release, version, architecture, hostname)

  • Uptime and last boot time

  • Drives listing and per-drive space usage

  • Memory, CPU, GPU, and Network information

  • Top processes by memory and CPU (accurate sampling)
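
To illustrate how the uptime and drive features can be gathered without parsing PowerShell output, here is a minimal sketch built on psutil and the standard library. Function names and return shapes are illustrative assumptions, not necessarily the project's actual implementation.

import shutil
from datetime import datetime

import psutil

def last_boot_time() -> str:
    """Last boot time taken from psutil instead of parsed PowerShell output."""
    return datetime.fromtimestamp(psutil.boot_time()).isoformat()

def uptime_seconds() -> float:
    """Seconds elapsed since the last boot."""
    return datetime.now().timestamp() - psutil.boot_time()

def list_drives() -> list[str]:
    """Drive letters that are currently mounted, e.g. ["C", "D"]."""
    return [p.device.rstrip(":\\") for p in psutil.disk_partitions()]

def drive_status(drive: str) -> dict:
    """Used and free space in GB for a single drive letter."""
    usage = shutil.disk_usage(f"{drive}:\\")
    return {
        "name": drive,
        "used_spaceGB": round(usage.used / 1024**3, 2),
        "free_spaceGB": round(usage.free / 1024**3, 2),
    }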

Available Tools

  • Windows-system-info: Get OS, release, version, architecture, and hostname. Parameters: none. Returns: object (name, system, release, version, architecture, hostname).

  • Windows-last-boot-time: Get the last boot time of the system. Parameters: none. Returns: string (timestamp).

  • Windows-uptime: Get system uptime since last boot. Parameters: none. Returns: string (e.g., "Uptime: … seconds").

  • Windows-drives: Get the list of all available drives. Parameters: none. Returns: string[] (e.g., ["C", "D"]).

  • Windows-drive-status: Get used and free space for a specific drive. Parameters: drive: string. Returns: DriveInfo { name, used_spaceGB: number, free_spaceGB: number }.

  • Windows-drives-status-simple: Get status for a comma-separated list of drive letters. Parameters: drives_string: string. Returns: DriveInfo[].

  • Windows-memory-info: Get RAM usage information. Parameters: none. Returns: object (total_memory, available_memory, used_memory as GB strings).

  • Windows-network-info: Get IPv4 addresses per network interface. Parameters: none. Returns: object mapping interface -> IPv4 (or { error }).

  • Windows-cpu-info: Get CPU model, logical core count, and frequency. Parameters: none. Returns: string.

  • Windows-gpu-info: Get GPU name(s) and driver versions. Parameters: none. Returns: string (one line per GPU).

  • Windows-top-processes-by-memory: Get the top N processes by memory usage. Parameters: amount: int = 5. Returns: ProcessInfo[] { pid, name, memoryMB, cpu_percent? }.

  • Windows-top-processes-by-cpu: Get the top N processes by CPU usage (sampled for accuracy). Parameters: amount: int = 5. Returns: ProcessInfo[] { pid, name, cpu_percent, memoryMB }.

Note: Previously documented tools Windows-name-version, Windows-drives-status, and Windows-all-drives-status are not currently implemented to avoid duplication. If needed, they can be added easily.
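
For reference, a tool like Windows-top-processes-by-cpu could be registered with the MCP Python SDK's FastMCP helper roughly as sketched below. The decorator usage follows the SDK's documented pattern, but the body (psutil-based sampling) is an illustrative assumption, not a copy of this project's code.

import time

import psutil
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("windows-mcp")

@mcp.tool(name="Windows-top-processes-by-cpu")
def top_processes_by_cpu(amount: int = 5) -> list[dict]:
    """Return the top processes by CPU usage, sampled over a short interval."""
    procs = list(psutil.process_iter(["pid", "name"]))
    for p in procs:
        try:
            p.cpu_percent(None)  # prime the counter; the first call always reports 0.0
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass
    time.sleep(1.0)  # sampling window so the percentages are meaningful
    results = []
    for p in procs:
        try:
            results.append({
                "pid": p.pid,
                "name": p.info["name"],
                "cpu_percent": p.cpu_percent(None),
                "memoryMB": round(p.memory_info().rss / 1024**2, 1),
            })
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue
    results.sort(key=lambda r: r["cpu_percent"], reverse=True)
    return results[:amount]

if __name__ == "__main__":
    mcp.run()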

Requirements

  • Python 3.13+

  • uv (for fast startup and dependency management)

# On Windows
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"

Installation

  1. Clone this repository:

git clone https://github.com/carlosedp/windows-mcp-server.git
cd windows-mcp-server

Running the Development Server

uv run mcp dev main.py

uv installs the dependencies and starts the server in development mode, attaching the MCP Inspector.

Then in the MCP Inspector browser window:

  • Click "Connect" to connect the MCP client.

  • Go to the "Tools" tab to see available tools.

  • Click "List Tools" to see the available tools.

  • Select a tool and click "Run tool" to execute it.

Examples

Windows-system-info

{ "name": "MY-PC", "system": "Windows", "release": "10", "version": "10.0.19045", "architecture": "64bit", "hostname": "MY-PC" }

Windows-drive-status (input: "C")

{ "name": "C", "used_spaceGB": 120.53, "free_spaceGB": 380.12 }

Windows-memory-info

{ "total_memory": "32.00 GB", "available_memory": "18.25 GB", "used_memory": "13.75 GB" }

Windows-network-info

{ "Ethernet": "192.168.1.50", "Wi-Fi": "10.0.0.15" }

Windows-top-processes-by-cpu (amount: 3)

[ { "pid": 1234, "name": "chrome.exe", "cpu_percent": 24.7, "memoryMB": 512.3 }, { "pid": 4321, "name": "code.exe", "cpu_percent": 12.1, "memoryMB": 650.8 }, { "pid": 9876, "name": "System", "cpu_percent": 8.4, "memoryMB": 45.0 } ]

MCP Client Configuration Example

LM Studio with Llama 3.2 3B retrieving system information through the MCP server:

[screenshot]

Some more examples of tools you can run:

[screenshot]

To connect an MCP client (Claude Desktop, VS Code, LM Studio, etc.) to this server, add the following to your client configuration:

{ "mcpServers": { "windows-mcp": { "command": "uv", "args": [ "--directory", "C:\\Users\\Carlos Eduardo\\repos\\windows-mcp-server", "run", "mcp", "run", "main.py" ] } } }

Adjust the path above to match where the cloned repository is located.
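
Besides configuring a GUI client, you can also exercise the server from a short script using the MCP Python SDK's stdio client. The snippet below is a sketch under that assumption; the directory path is a placeholder you must adjust to your clone, and tool names must match the list above.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server the same way the client configuration above does.
    server = StdioServerParameters(
        command="uv",
        args=["--directory", "C:\\path\\to\\windows-mcp-server",
              "run", "mcp", "run", "main.py"],
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Tools:", [tool.name for tool in tools.tools])
            result = await session.call_tool("Windows-drives", {})
            print("Drives:", result.content)

asyncio.run(main())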

Packaging and Distribution

To publish as a Python package:

  1. Edit pyproject.toml with your metadata.

  2. Build and upload to PyPI:

python -m build
python -m twine upload dist/*

License

MIT
