LogAnalyzer MCP Server

# Changelog

All notable changes to the LogAnalyzer MCP Server will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [1.0.0] - 2024-01-23

### 🎉 Initial Release

The first stable release of LogAnalyzer MCP Server - AI-powered log analysis for the Model Context Protocol ecosystem.

### ✨ Features

#### Core Analysis
- **AI-Powered Log Analysis**: Intelligent error detection and root cause analysis using Google Gemini
- **Multi-Format Support**: Automatic detection and parsing of JSON and plain text logs
- **Smart Error Extraction**: Advanced pattern matching for errors, exceptions, and stack traces
- **Context Preservation**: Maintains relevant context around errors for better analysis

#### Real-time Monitoring
- **File Watching**: Monitor log files for new errors in real time
- **Multiple File Support**: Watch multiple log files simultaneously
- **Configurable Polling**: Adjustable monitoring intervals for different use cases
- **Background Processing**: Efficient incremental processing of log file changes

#### MCP Integration
- **5 MCP Tools**: Complete set of tools for log analysis workflows
  - `analyze_log`: AI-powered log analysis
  - `watch_log_file`: Start real-time monitoring
  - `stop_watching`: Stop file monitoring
  - `list_watched_files`: View active monitoring
  - `get_recent_errors`: Retrieve recent error analysis
- **MCP Resources**: Access to recent errors and monitoring status
- **Standard Compliance**: Full MCP protocol implementation

#### Developer Experience
- **TypeScript**: Full TypeScript implementation with type safety
- **CLI Interface**: Standalone command-line client for testing and direct usage
- **Configuration Management**: Environment-based configuration with validation
- **Comprehensive Testing**: Test suite with sample logs and validation tools

#### Deployment & Integration
- **Docker Support**: Production-ready containerization with docker-compose
- **Cursor AI Integration**: Complete setup guide for the Cursor AI editor
- **Claude Desktop Support**: Configuration examples for Claude Desktop
- **Documentation**: Extensive guides, examples, and troubleshooting

### 🛠️ Technical Details

#### Architecture
- **Modular Design**: Separate components for analysis, monitoring, and MCP handling
- **Performance Optimized**: Content truncation, streaming, and efficient file operations
- **Error Handling**: Robust error handling with graceful degradation
- **Security**: Environment variable management and input validation

#### Dependencies
- **@modelcontextprotocol/sdk**: MCP protocol implementation
- **@google/generative-ai**: Gemini AI integration
- **chokidar**: Cross-platform file watching
- **dotenv**: Environment configuration
- **TypeScript**: Type-safe development

### 📚 Documentation

#### Guides
- **README.md**: Installation and quick start guide
- **docs/cursor-integration.md**: Comprehensive Cursor AI setup
- **examples/sample-usage.md**: Real-world usage scenarios
- **CHANGELOG.md**: Version history and updates

#### Examples
- **7 Usage Scenarios**: Web apps, microservices, security, performance, etc.
- **Sample Logs**: Test cases for different log formats and error types
- **Integration Patterns**: CI/CD, monitoring, and analysis workflows

### 🚀 Getting Started

```bash
# Quick installation
git clone <repository-url>
cd loganalyzer-mcp
npm install
echo "GEMINI_API_KEY=your_key" > .env
npm run build
npm run validate

# Test with sample logs
npm run analyze -- test
```

### 🎯 Use Cases
- **Development**: Real-time error monitoring during coding
- **Debugging**: AI-assisted root cause analysis
- **Production Monitoring**: Continuous log surveillance
- **Performance Analysis**: Bottleneck identification and optimization
- **Security Auditing**: Threat detection and analysis
- **DevOps**: CI/CD integration and deployment monitoring

### 📊 Performance
- **Analysis Speed**: < 10 seconds for typical log analysis
- **File Watching**: Efficient polling with configurable intervals
- **Memory Usage**: Optimized for large log files with smart truncation
- **Concurrent Support**: Multiple file monitoring without performance degradation

### 🔧 Configuration

#### Environment Variables
- `GEMINI_API_KEY`: Required for AI analysis
- `LOG_LEVEL`: Logging verbosity (debug, info, warn, error)
- `MAX_FILE_SIZE`: Maximum log file size for processing
- `WATCH_INTERVAL`: Default file polling interval
- `MAX_CONTEXT_TOKENS`: AI context window limit

### 🤝 Community
- **Open Source**: MIT license for community contributions
- **GitHub**: Issues, discussions, and feature requests
- **MCP Ecosystem**: Compatible with all MCP-enabled tools
- **Documentation**: Comprehensive guides for all skill levels

### 🔮 Roadmap

#### Planned Features (v1.1.0)
- [ ] Multiple AI provider support (OpenAI, Anthropic)
- [ ] Web interface for log analysis
- [ ] Log aggregation from multiple sources
- [ ] Custom rule engine for pattern detection
- [ ] Slack/Discord notifications

#### Future Enhancements
- [ ] Persistent storage with PostgreSQL
- [ ] Redis caching for improved performance
- [ ] Grafana integration for visualization
- [ ] Machine learning for anomaly detection
- [ ] Multi-language log format support

### 🙏 Acknowledgments
- Model Context Protocol community for the excellent standard
- Google for providing an accessible Gemini AI API
- Cursor AI team for pioneering AI-editor integration
- Open source contributors and early adopters

---

For detailed usage instructions, see [README.md](README.md). For integration guides, see [docs/cursor-integration.md](docs/cursor-integration.md). For examples, see [examples/sample-usage.md](examples/sample-usage.md).
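
To make the 🔧 Configuration section above concrete, here is a minimal `.env` sketch. The variable names are taken from that section; the values are illustrative placeholders rather than documented defaults, and the units assumed for `MAX_FILE_SIZE` and `WATCH_INTERVAL` are guesses.

```bash
# Minimal .env sketch; values are illustrative placeholders, not documented defaults
# Required for AI analysis
GEMINI_API_KEY=your_gemini_api_key_here
# Logging verbosity: debug, info, warn, error
LOG_LEVEL=info
# Maximum log file size for processing (unit assumed to be bytes)
MAX_FILE_SIZE=10485760
# Default file polling interval (unit assumed to be milliseconds)
WATCH_INTERVAL=1000
# AI context window limit
MAX_CONTEXT_TOKENS=8000
```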

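For the Claude Desktop and Cursor AI support noted under Deployment & Integration, an MCP client entry for this server would typically look like the sketch below. The server name, launch command, and path are assumptions for illustration (JSON does not allow comments, so they are flagged here instead); see docs/cursor-integration.md for the authoritative setup.

```json
{
  "mcpServers": {
    "loganalyzer": {
      "command": "node",
      "args": ["/absolute/path/to/loganalyzer-mcp/dist/index.js"],
      "env": {
        "GEMINI_API_KEY": "your_gemini_api_key_here"
      }
    }
  }
}
```

Cursor's MCP settings file follows the same `mcpServers` shape.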