Agent with MCP Example

- Implements the MCP server as FastAPI endpoints, exposed through fastapi-mcp
- Provides a chat interface for interacting with the agent that uses the MCP server's system monitoring capabilities
- Integrates with OpenAI models to power the agent that answers system resource usage queries using the MCP server's tools
- Uses the Pydantic.ai agent framework to create an agent that can interact with the MCP server's system resource tools
- Leverages the psutil Python library to collect the system CPU and memory statistics served through the MCP server
This project provides a simple example of an Agent and a local MCP server.
The MCP Server provides a collection of tools for obtaining system CPU and memory statistics. It is built on the psutil library. The tools are implemented as FastAPI endpoints and then exposed via MCP using fastapi-mcp.
The Agent is part of a simple Gradio chat application. The agent uses the Pydantic.ai agent framework. The agent is provided the MCP Server's URL and a system prompt indicating that it should answer questions about system resource usage. The Gradio Chat component maintains a conversation history so that you can ask follow-up questions.
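The agent side can be sketched as below. Treat the class, attribute, and model names as assumptions: pydantic-ai's MCP client API has changed across versions, and the URL and system prompt are illustrative rather than copied from chat.py.

```python
import asyncio

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerSSE  # transport class name varies by pydantic-ai version

# URL and prompt are illustrative assumptions, not taken from chat.py.
server = MCPServerSSE(url="http://127.0.0.1:8000/mcp")
agent = Agent(
    "openai:gpt-4o",
    system_prompt="Answer questions about this machine's CPU and memory usage "
                  "using the MCP tools available to you.",
    mcp_servers=[server],
)

async def main() -> None:
    # Some pydantic-ai versions manage MCP connections with this context
    # manager; newer ones pass servers as toolsets instead.
    async with agent.run_mcp_servers():
        result = await agent.run("How much memory is currently in use?")
        print(result.output)

if __name__ == "__main__":
    asyncio.run(main())
```

Running this sketch requires OPENAI_API_KEY in the environment and the MCP server listening on the given URL.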
Setup
Prerequisites
First make sure you have the following tools installed on your machine:
- uv, a package and environment manager for Python
- direnv, a tool for managing environment variables in your projects
- mcptools (optional), a command-line utility for interacting with MCP servers. This program is only needed if you want to test/debug the MCP server without the chat application. It is very helpful for debugging your tools and making sure that the expected metadata is being published by the MCP server. Note that the program is named `mcpt` if you install via Homebrew on Mac and `mcptools` otherwise.
- These examples use OpenAI models for the Agent, so you will need an active OpenAI account and API key. Alternatively, you can use one of the other models supported by Pydantic.ai; in that case, you will have to set the model and key appropriately.
Setup steps
Once you have the prerequisites installed, do the following steps:
- Copy envrc.template to .envrc and edit the value of OPENAI_API_KEY to your OpenAI API key.
- Run `direnv allow` to put the changed environment variables into your environment.
- Run `uv sync` to create/update your virtual environment.
- You can start the MCP Server with `uv run psutil_mcp.py`. By default it will serve on port 8000.
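For reference, the resulting .envrc is just a direnv file that exports the key (the value shown is a placeholder):

```shell
# .envrc -- loaded by direnv when you cd into the project directory.
# Replace the placeholder with your real key; never commit this file.
export OPENAI_API_KEY=sk-your-key-here
```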
Testing
If you have installed mcptools, you can connect to your MCP server and test it as follows:
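For example, the commands below list and invoke the server's tools. The subcommand names come from the mcptools documentation and the tool name is an assumption, so check `mcpt --help` (or `mcptools --help`) for your installed version:

```shell
# List the tools the server advertises (use `mcptools` instead of `mcpt`
# if you did not install via Homebrew):
mcpt tools http://localhost:8000/mcp

# Call a tool by name; the tool name here is an assumption:
mcpt call cpu_percent http://localhost:8000/mcp
```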
Running
To run the full application:
- If you have not already started your MCP Server, you can run it as `uv run psutil_mcp.py`
- In another terminal window, start the chat server with `uv run chat.py`
- Point your browser to http://127.0.0.1:7860
Extras
The psutil_mcp.py and chat.py programs have some command line options to enable debugging, change the model, change the ports, etc. Run them with the `--help` option to see the available options.
There is a configuration for VSCode to use the MCP server at .vscode/mcp.json.
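A minimal .vscode/mcp.json for an HTTP/SSE server looks roughly like the fragment below. The file in the repo is authoritative; the server name and transport type here are assumptions:

```json
{
  "servers": {
    "psutil": {
      "type": "sse",
      "url": "http://127.0.0.1:8000/mcp"
    }
  }
}
```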