log_analyzer_mcp by djm81

search_log_time_based

Search log files within specific time windows to analyze events, with options to filter results and include contextual lines before and after matches.

Instructions

Search logs within a time window, optionally filtering, with context.

Input Schema

Name                           Required  Default    Description
minutes                        No        0          Search logs from the last N minutes.
hours                          No        0          Search logs from the last N hours.
days                           No        0          Search logs from the last N days.
scope                          No        default    Logging scope to search within (from .env scopes or default).
context_before                 No        2          Number of lines before a match.
context_after                  No        2          Number of lines after a match.
log_dirs_override              No        (empty)    Comma-separated list of log directories, files, or glob patterns (overrides .env for file locations).
log_content_patterns_override  No        (empty)    Comma-separated list of REGEX patterns for log messages (overrides .env content filters).
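
For illustration, the snippet below shows a hypothetical argument payload that searches the last two hours of logs with three lines of context around each match. Only the parameter names and defaults come from the schema above; how the payload is sent depends on your MCP client.

    # Hypothetical argument payload for search_log_time_based; parameter names
    # and defaults are taken from the input schema above.
    arguments = {
        "hours": 2,                       # look back over the last 2 hours
        "context_before": 3,              # 3 lines of context before each match
        "context_after": 3,               # 3 lines of context after each match
        "log_content_patterns_override": "ERROR.*timeout,WARNING.*retry",  # comma-separated regexes
    }

With the official MCP Python SDK this payload would typically be passed as the arguments of ClientSession.call_tool("search_log_time_based", arguments); any other MCP-capable client can issue the equivalent call.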

Implementation Reference

  • The handler function for the 'search_log_time_based' tool, registered via the @mcp.tool() decorator. It builds filter criteria from the time-based parameters (days, hours, minutes) and the other search parameters, then invokes analysis_engine.search_logs to retrieve matching log records with context. (A hedged sketch of what the build_filter_criteria helper might look like appears after this reference list.)
    @mcp.tool()
    async def search_log_time_based(
        minutes: int = 0,
        hours: int = 0,
        days: int = 0,
        scope: str = "default",
        context_before: int = 2,
        context_after: int = 2,
        log_dirs_override: str = "",
        log_content_patterns_override: str = "",
    ) -> list[dict[str, Any]]:
        """Search logs within a time window, optionally filtering, with context."""
        logger.info(
            "MCP search_log_time_based called with time=%sd/%sh/%sm, scope='%s', "
            "context=%sB/%sA, log_dirs_override='%s', "
            "log_content_patterns_override='%s'",
            days,
            hours,
            minutes,
            scope,
            context_before,
            context_after,
            log_dirs_override,
            log_content_patterns_override,
        )
        if minutes == 0 and hours == 0 and days == 0:
            logger.warning("search_log_time_based called without a time window (all minutes/hours/days are 0).")
        log_dirs_list = log_dirs_override.split(",") if log_dirs_override else None
        log_content_patterns_list = log_content_patterns_override.split(",") if log_content_patterns_override else None
        filter_criteria = build_filter_criteria(
            minutes=minutes,
            hours=hours,
            days=days,
            scope=scope,
            context_before=context_before,
            context_after=context_after,
            log_dirs_override=log_dirs_list,
            log_content_patterns_override=log_content_patterns_list,
        )
        try:
            results = await asyncio.to_thread(analysis_engine.search_logs, filter_criteria)
            logger.info("search_log_time_based returning %s records.", len(results))
            return results
        except Exception as e:  # pylint: disable=broad-exception-caught
            logger.error("Error in search_log_time_based: %s", e, exc_info=True)
            custom_message = f"Failed to search time-based logs: {e!s}"
            raise McpError(ErrorData(code=-32603, message=custom_message)) from e
  • Pydantic input schema for the search_log_time_based tool, extending BaseSearchInput with the time-based fields minutes, hours, and days. (A sketch of the optional validator mentioned in the comments appears after this reference list.)
    class SearchLogTimeBasedInput(BaseSearchInput):
        """Input for search_log_time_based."""

        minutes: int = Field(default=0, description="Search logs from the last N minutes.", ge=0)
        hours: int = Field(default=0, description="Search logs from the last N hours.", ge=0)
        days: int = Field(default=0, description="Search logs from the last N days.", ge=0)

        # Custom validation to ensure at least one time field is set if others are default (0)
        # Pydantic v2: @model_validator(mode='after')
        # Pydantic v1: @root_validator(pre=False)
        # For simplicity here, relying on tool logic to handle it, or can add validator if needed.
  • Base Pydantic schema shared by search tools including search_log_time_based, defining common parameters: scope, context_before, context_after, log_dirs_override, log_content_patterns_override.
    class BaseSearchInput(BaseModel):
        """Base model for common search parameters."""

        scope: str = Field(default="default", description="Logging scope to search within (from .env scopes or default).")
        context_before: int = Field(default=2, description="Number of lines before a match.", ge=0)
        context_after: int = Field(default=2, description="Number of lines after a match.", ge=0)
        log_dirs_override: str = Field(
            default="",
            description="Comma-separated list of log directories, files, or glob patterns (overrides .env for file locations).",
        )
        log_content_patterns_override: str = Field(
            default="",
            description="Comma-separated list of REGEX patterns for log messages (overrides .env content filters).",
        )
  • The @mcp.tool() decorator registers the search_log_time_based function as an MCP tool.
    @mcp.tool()
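
build_filter_criteria and analysis_engine.search_logs are referenced by the handler but not reproduced on this page. The sketch below is an assumption about how the time parameters could be reduced to an absolute cutoff; the function's real body and the dictionary keys are not shown here, so everything below is illustrative rather than the project's actual implementation.

    # Assumed sketch of a time-window-to-criteria helper; the real
    # build_filter_criteria in log_analyzer_mcp may differ in shape and field names.
    from datetime import datetime, timedelta
    from typing import Any

    def build_filter_criteria_sketch(
        minutes: int = 0,
        hours: int = 0,
        days: int = 0,
        scope: str = "default",
        context_before: int = 2,
        context_after: int = 2,
        log_dirs_override: list[str] | None = None,
        log_content_patterns_override: list[str] | None = None,
    ) -> dict[str, Any]:
        criteria: dict[str, Any] = {
            "scope": scope,
            "context_before": context_before,
            "context_after": context_after,
            "log_dirs_override": log_dirs_override,
            "log_content_patterns_override": log_content_patterns_override,
        }
        window = timedelta(days=days, hours=hours, minutes=minutes)
        # Only add a time cutoff when a non-zero window was requested; the handler
        # above logs a warning in the all-zero case and the search runs unbounded.
        if window > timedelta(0):
            criteria["since"] = datetime.now() - window
        return criteria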
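
The comments in SearchLogTimeBasedInput leave the "at least one time field is non-zero" check to the handler. If that rule were enforced at the schema level instead, a Pydantic v2 model validator along these lines would do it; this is a sketch, not part of the published schema.

    # Sketch of the optional Pydantic v2 validator hinted at in the schema comments.
    from pydantic import BaseModel, Field, model_validator

    class SearchLogTimeBasedInputStrict(BaseModel):
        minutes: int = Field(default=0, ge=0)
        hours: int = Field(default=0, ge=0)
        days: int = Field(default=0, ge=0)

        @model_validator(mode="after")
        def require_time_window(self) -> "SearchLogTimeBasedInputStrict":
            if self.minutes == 0 and self.hours == 0 and self.days == 0:
                raise ValueError("Set at least one of minutes, hours, or days to a non-zero value.")
            return self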
