
Extending AI with Custom Tools: MCP Demo

Model Context Protocol (MCP) - Bridge between AI and your tools

What is MCP?

  • Open protocol for connecting AI assistants to external data/tools

  • Enables AI to execute custom functions in your environment

  • Works with GitHub Copilot Chat
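Under the hood, MCP messages are JSON-RPC 2.0. As a rough illustration (the exact payloads here are hypothetical, but the `tools/call` method and `content` result shape follow the protocol), a client asking the demo server to run a tool exchanges messages like these:

```python
import json

# Hypothetical client request: invoke the "get_process_count" tool
# (tool name taken from the demo below; id and arguments are illustrative).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_process_count",
        "arguments": {},
    },
}

# Hypothetical server reply: results carry a list of typed content blocks.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [{"type": "text", "text": "312 processes running"}],
    },
}

wire = json.dumps(request)  # what actually travels over stdio or HTTP
print(json.loads(wire)["params"]["name"])
```

The AI assistant never sees your code directly; it only sees the tool names and schemas the server advertises, then issues calls like the one above.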

Live Demo: Process Monitor

  • Custom MCP server exposing system process information

  • AI can now answer questions about running processes

  • No internet required - runs entirely locally
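A stdlib-only sketch of what such a tool might do on a Linux host (the demo server's actual implementation is not shown here; it could equally use a library such as psutil): counting processes reduces to counting the numeric entries in `/proc`.

```python
import os

def get_process_count() -> int:
    """Count running processes by listing numeric /proc entries.

    Linux-only sketch: each running process appears in /proc as a
    directory named after its PID.
    """
    return sum(1 for entry in os.listdir("/proc") if entry.isdigit())

print(get_process_count())
```

Because everything reads from the local filesystem, the tool works fully offline, which is what lets the AI answer process questions without any network access.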

Example Capabilities:

  1. "How many processes are currently running?"

    • Uses get_process_count tool

  2. "Show me the top 5 CPU consuming processes"

    • Uses get_top_cpu_processes tool

  3. "What are the most memory-intensive processes?"

    • Uses get_top_memory_processes tool

  4. "Which processes have multiple instances running?"

    • Uses get_process_instances tool

  5. "Find all node processes"

    • Uses find_process_by_name tool with "node"

  6. "What's consuming the most resources on my system?"

    • AI will intelligently combine multiple tools
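To make the tool behaviors above concrete, here is a self-contained sketch of three of them operating over an in-memory snapshot. The tool names come from the list above; the `Proc` record, its fields, and the sample data are illustrative assumptions (a real server would sample the live system each call).

```python
from dataclasses import dataclass

@dataclass
class Proc:
    pid: int
    name: str
    cpu: float  # CPU usage, percent
    mem: float  # resident memory, MB

# Illustrative snapshot; a real server would query the OS on each call.
SNAPSHOT = [
    Proc(101, "node", 12.5, 310.0),
    Proc(102, "node", 3.1, 150.0),
    Proc(200, "chrome", 25.0, 900.0),
    Proc(300, "postgres", 8.0, 420.0),
]

def get_top_cpu_processes(n: int = 5) -> list[Proc]:
    """Return the n processes with the highest CPU usage."""
    return sorted(SNAPSHOT, key=lambda p: p.cpu, reverse=True)[:n]

def find_process_by_name(name: str) -> list[Proc]:
    """Return all processes whose name contains the given substring."""
    return [p for p in SNAPSHOT if name.lower() in p.name.lower()]

def get_process_instances() -> dict[str, int]:
    """Return names that have more than one running instance."""
    counts: dict[str, int] = {}
    for p in SNAPSHOT:
        counts[p.name] = counts.get(p.name, 0) + 1
    return {name: c for name, c in counts.items() if c > 1}

print([p.name for p in get_top_cpu_processes(2)])  # ['chrome', 'node']
print(get_process_instances())                     # {'node': 2}
```

For a compound question like capability 6, the assistant can call `get_top_cpu_processes` and a memory counterpart separately, then merge the answers itself; no single "combine" tool is needed on the server side.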

Server Score

  • Security: not tested

  • License: not found (grade F)

  • Quality: not tested


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/kiralyzoltan98/MCProcessMonitor'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.