## Extending AI with Custom Tools: MCP Demo
**Model Context Protocol (MCP)** - a bridge between AI assistants and your tools
### What is MCP?
- Open protocol for connecting AI assistants to external data/tools
- Enables AI to execute custom functions in your environment
- Works with GitHub Copilot Chat
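As one illustration of the wiring, a local MCP server can be registered with the editor via a config file; the sketch below assumes VS Code's `.vscode/mcp.json` format, and the server name and launch command are placeholders, not the demo's actual values.

```json
{
  "servers": {
    "process-monitor": {
      "command": "node",
      "args": ["dist/index.js"]
    }
  }
}
```

Once registered, Copilot Chat can discover the server's tools and call them on the user's behalf.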
### Live Demo: Process Monitor
- Custom MCP server exposing system process information
- AI can now answer questions about running processes
- No internet required - runs entirely locally
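At its core, an MCP server maps tool names to handler functions and returns JSON-serializable results. Below is a minimal, hedged sketch of that dispatch pattern (not the demo's actual implementation); the fixed process count and tool registry are illustrative stand-ins for live OS queries.

```python
import json

def get_process_count():
    # Illustrative fixed value; a real server would count live OS processes.
    return {"count": 3}

# Tool registry: the server advertises these names to the AI client,
# which then issues tool calls by name.
TOOLS = {"get_process_count": get_process_count}

def handle_tool_call(name, arguments=None):
    """Route a tool call to the matching handler, JSON-encoding the result."""
    if name not in TOOLS:
        return json.dumps({"error": f"unknown tool: {name}"})
    return json.dumps(TOOLS[name](**(arguments or {})))
```

The AI never touches the OS directly: it sends a tool name plus arguments, and the server executes the handler locally and returns the result.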
### Example Capabilities
1. "How many processes are currently running?"
- Uses `get_process_count` tool
2. "Show me the top 5 CPU consuming processes"
- Uses `get_top_cpu_processes` tool
3. "What are the most memory-intensive processes?"
- Uses `get_top_memory_processes` tool
4. "Which processes have multiple instances running?"
- Uses `get_process_instances` tool
5. "Find all node processes"
- Uses `find_process_by_name` tool with "node"
6. "What's consuming the most resources on my system?"
- AI will intelligently combine multiple tools
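The tools above boil down to simple queries over a process table. The sketch below shows plausible logic for three of them using a hard-coded sample table; a real server would query the OS (e.g. via `psutil`), and the field names here are assumptions for illustration.

```python
from collections import Counter

# Illustrative sample data standing in for a live process snapshot.
PROCESSES = [
    {"pid": 101, "name": "node",   "cpu": 12.5, "mem_mb": 340.0},
    {"pid": 102, "name": "node",   "cpu": 3.1,  "mem_mb": 120.0},
    {"pid": 103, "name": "chrome", "cpu": 25.0, "mem_mb": 900.0},
    {"pid": 104, "name": "bash",   "cpu": 0.2,  "mem_mb": 8.0},
]

def get_top_cpu_processes(limit=5):
    """Processes sorted by CPU usage, highest first."""
    return sorted(PROCESSES, key=lambda p: p["cpu"], reverse=True)[:limit]

def get_process_instances():
    """Process names that appear more than once, with instance counts."""
    counts = Counter(p["name"] for p in PROCESSES)
    return {name: n for name, n in counts.items() if n > 1}

def find_process_by_name(name):
    """Case-insensitive substring match on the process name."""
    return [p for p in PROCESSES if name.lower() in p["name"].lower()]
```

For the combined question ("What's consuming the most resources?"), the AI itself can chain several of these tool calls and merge the answers; no single tool needs to do everything.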
### MCP Directory API
We provide all the information about MCP servers via our MCP API:
```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/kiralyzoltan98/MCProcessMonitor'
```
If you have feedback or need assistance with the MCP directory API, please join our Discord server.