get_log_stats

Compute overall statistics across JSON log files: the total entry count, per-level counts, the set of unique modules and functions, and the time span covered by the logs.

Instructions

Get overall statistics for log files

Input Schema

files (optional, array of strings): Log files to analyze. Default: all files.
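
For illustration, a minimal arguments object for this tool might look like the sketch below. The file names are hypothetical; because the schema marks no properties as required, the "files" key can be omitted entirely to analyze every log file the server knows about.

    # Hypothetical arguments for get_log_stats; file names are examples only.
    arguments = {
        "files": ["app.log.jsonl", "worker.log.jsonl"]  # omit "files" to analyze all files
    }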

Implementation Reference

  • The core handler in the JsonLogAnalyzer class that implements the get_log_stats tool: it aggregates the total entry count, per-level counts, unique modules and functions, and the overall time span across the specified log files (an illustrative result is sketched after this list).
    def get_log_stats(self, files: Optional[List[str]] = None) -> Dict[str, Any]:
        """Get overall statistics for log files"""
        if files is None:
            files = list(self.log_files_cache.keys())

        total_entries = 0
        levels = {}
        modules = set()
        functions = set()
        earliest_time = None
        latest_time = None

        for filename in files:
            try:
                entries = self.read_log_file(filename)
                total_entries += len(entries)

                for entry in entries:
                    # Count levels
                    level = entry.get("level", "UNKNOWN")
                    levels[level] = levels.get(level, 0) + 1

                    # Collect modules and functions
                    modules.add(entry.get("module", "UNKNOWN"))
                    functions.add(entry.get("function", "UNKNOWN"))

                    # Track time range
                    timestamp = entry.get("parsed_timestamp")
                    if timestamp:
                        if earliest_time is None or timestamp < earliest_time:
                            earliest_time = timestamp
                        if latest_time is None or timestamp > latest_time:
                            latest_time = timestamp
            except (FileNotFoundError, RuntimeError):
                continue

        return {
            "total_files": len(files),
            "total_entries": total_entries,
            "levels": levels,
            "unique_modules": sorted(list(modules)),
            "unique_functions": len(functions),
            "time_range": {
                "earliest": earliest_time.isoformat() if earliest_time else None,
                "latest": latest_time.isoformat() if latest_time else None,
                "span_hours": round((latest_time - earliest_time).total_seconds() / 3600, 2) if earliest_time and latest_time else None
            }
        }
  • Input schema definition for the get_log_stats tool, accepting an optional array of file names.
    types.Tool(
        name="get_log_stats",
        description="Get overall statistics for log files",
        inputSchema={
            "type": "object",
            "properties": {
                "files": {
                    "type": "array",
                    "items": {"type": "string"},
                    "description": "Log files to analyze (default: all files)"
                }
            }
        }
    ),
  • Registration and dispatch logic in the MCP app.call_tool() handler that invokes the get_log_stats implementation and returns the formatted JSON response (a client-side consumption sketch follows this list).
    elif name == "get_log_stats":
        results = log_analyzer.get_log_stats(arguments.get("files"))
        return [
            types.TextContent(
                type="text",
                text=json.dumps(results, indent=2, default=str)
            )
        ]
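
To make the handler's return value concrete, here is a rough sketch of the statistics dictionary it produces; every count, name, and timestamp below is invented purely for illustration.

    # Illustrative shape of the get_log_stats result; all values are made up.
    {
        "total_files": 2,
        "total_entries": 1534,
        "levels": {"INFO": 1200, "WARNING": 250, "ERROR": 84},
        "unique_modules": ["auth", "db", "http"],  # sorted list of module names
        "unique_functions": 42,                    # count of distinct functions
        "time_range": {
            "earliest": "2024-01-01T00:00:00",
            "latest": "2024-01-01T06:30:00",
            "span_hours": 6.5
        }
    }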
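
On the client side, the JSON text content returned by the dispatch handler can be parsed straight back into a dictionary. The sketch below assumes the official MCP Python SDK's ClientSession (named session and already initialized against this server); the session setup and file name are assumptions, not part of the server code above.

    # Hedged sketch: calling get_log_stats via an MCP client session and
    # decoding the JSON text content returned by the dispatch handler above.
    import json

    async def fetch_log_stats(session):
        result = await session.call_tool(
            "get_log_stats",
            arguments={"files": ["app.log.jsonl"]},  # hypothetical file name
        )
        # The server replies with a single TextContent block containing JSON.
        return json.loads(result.content[0].text)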

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/mfreeman451/json-logs-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.