search_log_time_based
by djm81

Filter and analyze log files within a specified time window, with options to add context before and after matches, for efficient log debugging and monitoring.

Instructions

Search logs within a time window, optionally filtering, with context.

Input Schema

Name | Required | Description | Default
context_after | No | Lines of context to include after each match. | 2
context_before | No | Lines of context to include before each match. | 2
days | No | Search logs from the last N days. | 0
hours | No | Search logs from the last N hours. | 0
log_content_patterns_override | No | Comma-separated log content patterns to match instead of the configured defaults. | "" (empty)
log_dirs_override | No | Comma-separated log directories to search instead of the configured defaults. | "" (empty)
minutes | No | Search logs from the last N minutes. | 0
scope | No | Scope of logs to search. | default
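
As an illustration of how a client might call this tool, the sketch below uses the official MCP Python SDK over stdio; the server launch command and the argument values are assumptions for illustration only, not taken from this page.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main() -> None:
        # Assumed launch command for the log_analyzer_mcp server; adjust to your installation.
        params = StdioServerParameters(command="python", args=["-m", "log_analyzer_mcp"])
        async with stdio_client(params) as (read_stream, write_stream):
            async with ClientSession(read_stream, write_stream) as session:
                await session.initialize()
                # Ask for the last 2 hours of logs with 3 lines of context around each match.
                result = await session.call_tool(
                    "search_log_time_based",
                    arguments={"hours": 2, "context_before": 3, "context_after": 3},
                )
                print(result.content)

    asyncio.run(main())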

Implementation Reference

  • The main handler for the 'search_log_time_based' tool, registered via the @mcp.tool() decorator. It builds filter criteria from the time-window parameters (days, hours, minutes) and the other search options, then calls analysis_engine.search_logs to retrieve matching log records with context (a sketch of the underlying time-window logic follows this list).
    @mcp.tool()
    async def search_log_time_based(
        minutes: int = 0,
        hours: int = 0,
        days: int = 0,
        scope: str = "default",
        context_before: int = 2,
        context_after: int = 2,
        log_dirs_override: str = "",
        log_content_patterns_override: str = "",
    ) -> list[dict[str, Any]]:
        """Search logs within a time window, optionally filtering, with context."""
        logger.info(
            "MCP search_log_time_based called with time=%sd/%sh/%sm, scope='%s', "
            "context=%sB/%sA, log_dirs_override='%s', "
            "log_content_patterns_override='%s'",
            days,
            hours,
            minutes,
            scope,
            context_before,
            context_after,
            log_dirs_override,
            log_content_patterns_override,
        )
        if minutes == 0 and hours == 0 and days == 0:
            logger.warning("search_log_time_based called without a time window (all minutes/hours/days are 0).")
        log_dirs_list = log_dirs_override.split(",") if log_dirs_override else None
        log_content_patterns_list = log_content_patterns_override.split(",") if log_content_patterns_override else None
        filter_criteria = build_filter_criteria(
            minutes=minutes,
            hours=hours,
            days=days,
            scope=scope,
            context_before=context_before,
            context_after=context_after,
            log_dirs_override=log_dirs_list,
            log_content_patterns_override=log_content_patterns_list,
        )
        try:
            results = await asyncio.to_thread(analysis_engine.search_logs, filter_criteria)
            logger.info("search_log_time_based returning %s records.", len(results))
            return results
        except Exception as e:  # pylint: disable=broad-exception-caught
            logger.error("Error in search_log_time_based: %s", e, exc_info=True)
            custom_message = f"Failed to search time-based logs: {e!s}"
            raise McpError(ErrorData(code=-32603, message=custom_message)) from e
  • Pydantic input schema for the search_log_time_based tool, extending BaseSearchInput with the time-window fields minutes, hours, and days (a hedged sketch of the inherited BaseSearchInput fields appears after this list).
    class SearchLogTimeBasedInput(BaseSearchInput):
        """Input for search_log_time_based."""

        minutes: int = Field(default=0, description="Search logs from the last N minutes.", ge=0)
        hours: int = Field(default=0, description="Search logs from the last N hours.", ge=0)
        days: int = Field(default=0, description="Search logs from the last N days.", ge=0)
  • The @mcp.tool() decorator registers the search_log_time_based function as an MCP tool (see the server setup sketch after this list).
    @mcp.tool()
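
The AnalysisEngine itself is not shown on this page; internally, a look-back window like this is typically resolved to a cutoff timestamp before records are filtered. A minimal sketch of that idea, using a hypothetical helper and assuming each record carries a parsed datetime:

    from datetime import datetime, timedelta

    def within_window(record_time: datetime, minutes: int = 0, hours: int = 0, days: int = 0) -> bool:
        """Hypothetical helper: True if record_time falls inside the requested look-back window."""
        cutoff = datetime.now() - timedelta(days=days, hours=hours, minutes=minutes)
        return record_time >= cutoff

With hours=2, for example, only records stamped within the last two hours would pass; the real analysis_engine.search_logs may differ in details such as time-zone handling.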
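
BaseSearchInput, which SearchLogTimeBasedInput extends, is also not shown here. Judging from the handler signature above, it plausibly declares the shared search fields; the following is an assumed sketch, not the actual class:

    from pydantic import BaseModel, Field

    class BaseSearchInput(BaseModel):
        """Assumed shape of the shared search options (sketch only)."""

        scope: str = Field(default="default", description="Scope of logs to search.")
        context_before: int = Field(default=2, description="Lines of context before each match.", ge=0)
        context_after: int = Field(default=2, description="Lines of context after each match.", ge=0)
        log_dirs_override: str = Field(
            default="", description="Comma-separated log directories to search instead of the configured ones."
        )
        log_content_patterns_override: str = Field(
            default="", description="Comma-separated content patterns to match instead of the configured ones."
        )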
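
For context on where @mcp.tool() comes from: a FastMCP server creates an mcp instance, decorates each handler, and then runs the server, exposing every decorated function as an MCP tool. A minimal, generic sketch (the server name and the trimmed-down handler are assumptions, not the project's actual code):

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("log_analyzer_mcp")  # assumed server name

    @mcp.tool()
    async def search_log_time_based(minutes: int = 0, hours: int = 0, days: int = 0) -> list[dict]:
        """Trimmed-down stand-in for the real handler shown above."""
        return []

    if __name__ == "__main__":
        mcp.run()  # serves over stdio by default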
