DivLens MCP
Provides real-time system intelligence tools for AI agents, enabling queries about CPU, RAM, storage, network, hardware diagnostics, and the developer stack.
What is DivLens MCP?
DivLens MCP is a high-performance Model Context Protocol (MCP) server written in Rust.
It bridges the gap between AI assistants and your machine — giving Claude, Cursor, Windsurf, and any other MCP-compatible agent live, structured access to hardware sensors, storage metrics, network diagnostics, process trees, developer runtimes, system logs, and more.
No cloud. No API keys. No configuration required. Just build and run.
"Why is my Mac slow?" → Claude calls get_live_metrics() → Instant answer.
"Is my SSD healthy?" → Claude calls get_hardware_diagnostics() → SMART data returned.
"What's eating disk?" → Claude calls get_advanced_storage_stats() → Largest files listed.✦ 17 Diagnostic Tools
| Category | Tool | What it returns |
| --- | --- | --- |
| ⚡ Performance | `get_live_metrics` | CPU %, RAM, swap, blocked processes, uptime |
| ⚡ Performance |  | Top processes by CPU / RAM with PID |
| 💾 Storage |  | Free/used/total per mount point |
| 💾 Storage |  | Full file-type inventory with sizes |
| 💾 Storage |  | File counts and sizes by extension |
| 💾 Storage |  | All files matching a specific extension |
| 💾 Storage | `get_advanced_storage_stats` | Top 50 largest files + stale data analysis |
| 💾 Storage |  | IOPS, read/write latency, SMART status |
| 🖥️ Hardware | `get_hardware_diagnostics` | CPU/GPU specs, battery %, temps, SMART |
| 🌐 Network |  | Throughput, active connections, signal |
| 🌐 Network |  | IP, DNS, interface config per adapter |
| 🔬 Identity |  | OS, hostname, uptime, machine fingerprint |
| 🛠️ Dev Stack |  | Node, Python, Rust, Go, Java runtimes + packages |
| 🛠️ Dev Stack |  | Kernel modules and device drivers |
| 📂 Utility |  | Recursive directory listing with sizes |
| 🧠 Memory | `recall_memory` | Semantic search over past AI diagnoses |
| 📋 Logs |  | Recent OS/kernel errors clustered by pattern |
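The authoritative list of all 17 tools and their input schemas is available from the server itself via the standard MCP tools/list method. A minimal sketch, assuming divlens-core is on your PATH after install; the handshake messages follow the MCP spec:

```sh
# Enumerate every tool the server exposes over stdio.
# The initialize / initialized messages satisfy the MCP handshake first.
printf '%s\n' \
  '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","clientInfo":{"name":"probe","version":"0.1"}}}' \
  '{"jsonrpc":"2.0","method":"notifications/initialized"}' \
  '{"jsonrpc":"2.0","id":2,"method":"tools/list"}' \
  | divlens-core --mcp
```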
🚀 Install — One Command, Any Platform
No Rust required. No compilation. No manual config editing. The installer downloads a pre-built binary and automatically configures your AI clients.
macOS & Linux
```sh
curl -fsSL https://raw.githubusercontent.com/Lohithry/divlens-mcp/main/install.sh | bash
```

Windows (PowerShell — no admin required)
```powershell
irm https://raw.githubusercontent.com/Lohithry/divlens-mcp/main/install.ps1 | iex
```

The installer will:
✅ Detect your OS and chip (Apple Silicon / Intel / Linux / Windows)
✅ Download the correct pre-built binary from GitHub Releases
✅ Verify the SHA-256 checksum
✅ Install to your PATH with no admin rights needed
✅ Auto-configure Claude Desktop, Cursor, Windsurf, and Antigravity
✅ Test the server works before finishing
Then just restart your AI client and ask "What's using my CPU right now?"
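If you want to double-check what the installer did first, here is a quick sanity check. A sketch only: macOS paths are shown, and the grep target mirrors the Claude Desktop config documented below.

```sh
# Is the binary reachable on PATH?
command -v divlens-core

# Did the installer write the Claude Desktop entry? (macOS config path)
grep -n '"divlens"' "$HOME/Library/Application Support/Claude/claude_desktop_config.json"
```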
Build from Source (Advanced)
Requires Rust 1.82+.
```sh
git clone https://github.com/Lohithry/divlens-mcp.git
cd divlens-mcp/apps/core
cargo build --release
./target/release/divlens-core --mcp
```
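The client configs below point at /usr/local/bin/divlens-core. If you built from source, one way to place the binary there; a sketch, assuming you want to reuse that path rather than edit each config:

```sh
# Copy the release build into /usr/local/bin so the sample configs work unchanged
sudo install -m 755 target/release/divlens-core /usr/local/bin/divlens-core
```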
Connect to Your AI

Claude Desktop
Config file: `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows)
```json
{
  "mcpServers": {
    "divlens": {
      "command": "/usr/local/bin/divlens-core",
      "args": ["--mcp"]
    }
  }
}
```

Quit and relaunch Claude Desktop. A 🔌 plug icon confirms the connection.
Cursor
Config file: `~/.cursor/mcp.json`
```json
{
  "mcpServers": {
    "divlens": {
      "command": "/usr/local/bin/divlens-core",
      "args": ["--mcp"]
    }
  }
}
```

Cmd+Shift+P → Reload Window
Windsurf
Config file: `~/.codeium/windsurf/mcp_config.json`
```json
{
  "mcpServers": {
    "divlens": {
      "command": "/usr/local/bin/divlens-core",
      "args": ["--mcp"]
    }
  }
}
```

For complete setup details, see DEPLOYMENT.md.
How It Works
```
┌─────────────────────────────────────────┐
│   AI Client (Claude / Cursor / etc.)    │
│        LLM reasoning lives here         │
└──────────────────┬──────────────────────┘
                   │ JSON-RPC 2.0 (stdio)
                   ▼
┌─────────────────────────────────────────┐
│           divlens-core (Rust)           │
│                                         │
│  ┌───────────────┐   ┌───────────────┐  │
│  │   MCP Layer   │   │   17 Tools    │  │
│  │  (JSON-RPC)   │   │  (Rust + OS)  │  │
│  └───────────────┘   └───────────────┘  │
│  ┌───────────────┐   ┌───────────────┐  │
│  │ SQLite Cache  │   │  Native APIs  │  │
│  │ (sysinfo/OS)  │   │  (IOKit/WMI)  │  │
│  └───────────────┘   └───────────────┘  │
└─────────────────────────────────────────┘
```

Zero cloud. Zero API keys. 100% local.

Transport: Every MCP message is a newline-delimited JSON-RPC 2.0 object over stdio.
AI logic: DivLens never runs LLM inference — it only collects and returns raw system data.
Privacy: All data stays on your machine. Nothing is sent anywhere.
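Concretely, one round trip on the wire is a single request line in and a single response line out. A sketch, with the response shape following the standard MCP tools/call result format; the metrics text itself is illustrative:

```sh
# One request line in, one response line out over stdio
echo '{"jsonrpc":"2.0","id":7,"method":"tools/call","params":{"name":"get_live_metrics","arguments":{}}}' \
  | divlens-core --mcp
# Response (one line): {"jsonrpc":"2.0","id":7,"result":{"content":[{"type":"text","text":"..."}]}}
```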
Project Structure
```
divlens-mcp/
└── apps/
    └── core/                  # Rust MCP engine
        ├── src/
        │   ├── tools/         # 17 tool implementations
        │   ├── mcp/           # JSON-RPC 2.0 protocol handler
        │   ├── mcp_server.rs  # stdio transport loop
        │   ├── collectors/    # Native OS data collectors
        │   │   ├── volatile/      # CPU, RAM, network (live)
        │   │   ├── persistent/    # Storage, hardware (cached)
        │   │   └── ondemand/      # Drivers, logs, packages
        │   ├── modules/       # Core business logic
        │   ├── db/            # SQLite caching layer
        │   ├── models/        # Shared data types
        │   └── utils/         # Shell env rehydration
        ├── Cargo.toml
        └── env.example
```

Optional: Semantic Memory
Enable the `vector-memory` feature to give `recall_memory` true semantic search using a local ONNX embedding model (no cloud, no API key):
```sh
cargo build --release --features vector-memory
```

When enabled, DivLens creates a local LanceDB vector store and uses fastembed to embed and recall past diagnoses semantically.
When disabled (the default), `recall_memory` returns an empty list — no functionality is broken.
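With the feature enabled, recall_memory can be exercised over stdio like any other tool. A hypothetical invocation: the "query" argument name is an assumption, not confirmed by these docs, so check the schema returned by tools/list for the real parameters.

```sh
# Hypothetical: the argument name "query" is assumed, not documented here
echo '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"recall_memory","arguments":{"query":"slow disk"}}}' \
  | divlens-core --mcp
```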
Verify the Server
Test the MCP wire protocol without a client:
```sh
# Initialize handshake
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","clientInfo":{"name":"test","version":"0.1"}}}' \
  | divlens-core --mcp

# Call a tool directly
echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"get_live_metrics","arguments":{}}}' \
  | divlens-core --mcp
```
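Responses arrive as single JSON lines, so they pipe cleanly into other tools. For example, pretty-printing with jq; a sketch, assuming jq is installed and the server writes only JSON to stdout:

```sh
# Pretty-print a tool response with jq
echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"get_live_metrics","arguments":{}}}' \
  | divlens-core --mcp | jq .
```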
License

Licensed under the Apache License, Version 2.0.
See LICENSE for the full text.
Copyright © 2024 DivLens Contributors.