LogAnalyzer MCP Server
Server Configuration
The server is configured through the following environment variables.
| Name | Required | Description | Default |
|---|---|---|---|
| LOG_LEVEL | No | The logging level | info |
| MAX_FILE_SIZE | No | Maximum file size for log analysis | 10MB |
| GEMINI_API_KEY | Yes | Your Google Gemini API key | |
| WATCH_POLL_INTERVAL | No | Interval in milliseconds for polling watched log files | 1000 |
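A minimal sketch of how a launcher might validate this configuration before starting the server. The variable names and defaults come from the table above; the validation logic itself is an illustrative assumption, not the server's actual code.

```python
import os

def load_config(env=os.environ):
    # GEMINI_API_KEY is the only required variable per the table above.
    api_key = env.get("GEMINI_API_KEY")
    if not api_key:
        raise RuntimeError("GEMINI_API_KEY is required")
    return {
        # Defaults mirror the configuration table.
        "log_level": env.get("LOG_LEVEL", "info"),
        "max_file_size": env.get("MAX_FILE_SIZE", "10MB"),
        "gemini_api_key": api_key,
        "watch_poll_interval_ms": int(env.get("WATCH_POLL_INTERVAL", "1000")),
    }

config = load_config({"GEMINI_API_KEY": "example-key"})
```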
Capabilities
Server capabilities have not been inspected yet.
Tools
Functions exposed to the LLM so it can take actions
| Name | Description |
|---|---|
| rapid_debug | 🚀 DEBUG SERVER LOGS IN UNDER 30 SECONDS - Instant analysis with actionable fixes and debug commands |
| quick_scan | ⚡ Ultra-fast log scan for real-time monitoring (< 1 second) |
| analyze_log | Analyze error logs and provide root cause analysis with AI-powered insights |
| watch_log_file | Start monitoring a log file for real-time error detection |
| stop_watching | Stop monitoring a specific log file |
| list_watched_files | List all currently monitored log files |
| get_recent_errors | Get recent error analysis from monitored files |
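MCP clients invoke these tools via a JSON-RPC `tools/call` request. A sketch of such a request for `analyze_log` is below; the argument name `file_path` is an assumption for illustration, so consult the server's published tool schema for the real parameters.

```python
import json

# Hypothetical MCP "tools/call" JSON-RPC request for the analyze_log tool.
# The "file_path" argument is assumed, not taken from the server's schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "analyze_log",
        "arguments": {"file_path": "/var/log/app/error.log"},
    },
}

# Serialized form, as it would be sent over the MCP transport.
payload = json.dumps(request)
```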
Prompts
Interactive templates invoked by user choice
| Name | Description |
|---|---|
| No prompts | |
Resources
Contextual data attached and managed by the client
| Name | Description |
|---|---|
| Rapid Debug Results | Latest rapid debugging analysis with quick fixes |
| Recent Error Analysis | Latest analyzed errors from all monitored files |
| Watched Files Status | Status of all currently monitored log files |
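Resources are fetched with a JSON-RPC `resources/read` request. The sketch below assumes a placeholder URI; in practice a client would first call `resources/list` to discover the actual URIs for the resources above.

```python
# Sketch of an MCP "resources/read" request. The URI is a placeholder
# assumption, not the server's real resource URI.
read_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "resources/read",
    "params": {"uri": "loganalyzer://recent-errors"},
}
```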
MCP directory API
We provide all the information about MCP servers via our MCP API.
```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/ChiragPatankar/loganalyzer-mcp'
```
If you have feedback or need assistance with the MCP directory API, please join our Discord server.