analyze_logs_streaming

Stream and analyze Optimizely DXP logs in real time to monitor health, detect errors, track performance metrics, and verify deployments without downloading files.

Instructions

šŸ“Š Stream and analyze logs directly without downloading. Fast: typically 15-30 seconds, versus 40-60 seconds for download-then-analyze. Returns structured health data: error counts, performance metrics (p95/p99 response times), and AI agent detection. Use this for deployment verification, health checks, or real-time diagnostics. Set slot=true when analyzing deployment slots. All parameters are optional: environment defaults to Production, minutesBack to 60, and logType (web/application/all) to web. Returns a health score, errors, performance metrics, and recommendations.
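
As a concrete example, a quick post-deployment health check might look like the following. This is a hypothetical MCP client call; the client variable and callTool() are illustrative, and only the arguments object is defined by the input schema below:

    // Hypothetical MCP client call; `client` and callTool() are illustrative.
    // Only the arguments object follows this tool's input schema.
    const result = await client.callTool('analyze_logs_streaming', {
      environment: 'Production',  // defaults to Production if omitted
      logType: 'web',             // HTTP logs; 'application' or 'all' also accepted
      minutesBack: 30,            // analyze the last 30 minutes
      structuredContent: true     // guaranteed structured JSON output (the default)
    });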

Input Schema

  • environment (optional): Environment to analyze. Default: Production
  • logType (optional): Log type: "application" for console logs, "web" for HTTP logs, "all" for both (DXP-114). Default: web
  • minutesBack (optional): Analyze logs from the last N minutes. Default: 60 (1 hour)
  • startDateTime (optional): ISO 8601 start datetime (alternative to minutesBack)
  • endDateTime (optional): ISO 8601 end datetime (alternative to minutesBack)
  • slot (optional): Analyze deployment slot logs instead of production logs. Default: false (production logs only, excluding /SLOTS/ paths). Set to true for slot logs during warmup (DXP-116)
  • structuredContent (optional): Return guaranteed structured JSON (recommended for automation). All fields are always present, using null/0/[] for missing data. Default: true
  • debug (optional): Include debug information in the response (container selection, blob dates, pagination details). Useful for troubleshooting. Default: false (DXP-118)
  • timeoutSeconds (optional): Maximum time in seconds to wait for analysis. Default: 300s (5 min) for ranges up to 3 days, 600s (10 min) for larger ranges; increase for very large time ranges (7+ days) (DXP-188)
  • projectName, projectId, apiKey, apiSecret (optional): Project selection and credentials used to resolve the target DXP project
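
When structuredContent is enabled, every field is guaranteed to be present. The exact payload is not documented on this page, but based on the fields the description promises (health score, error counts, p95/p99 response times, AI agent detection, recommendations), the result can be pictured roughly as this hypothetical shape:

    // Hypothetical sketch of the structured result, inferred from the fields
    // the tool description promises; actual property names may differ.
    interface LogAnalysisSummary {
      healthScore: number;                                          // overall health score
      errors: { total: number; byStatus: Record<string, number> };  // error counts
      performance: { p95Ms: number | null; p99Ms: number | null };  // response times
      aiAgents: string[];                                           // detected AI agents, [] if none
      recommendations: string[];                                    // [] if none
    }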

Implementation Reference

  • Core handler function: resolves project credentials, streams recent logs from Azure storage containers, parses entries, performs multi-faceted analysis (errors by status, performance percentiles, AI agent detection, health score), generates recommendations, and returns structured JSON plus a markdown report.
    static async handleAnalyzeLogsStreaming(args: AnalyzeLogsArgs): Promise<any> {
      try {
        OutputLogger.info(`⚔ handleAnalyzeLogsStreaming called with args: ${JSON.stringify(args, null, 2)}`);

        // Default environment to Production
        if (!args.environment) {
          args.environment = 'Production';
        }

        // Default logType to web (HTTP logs)
        if (!args.logType) {
          args.logType = 'web';
        }

        // DXP-179: Convert daysBack to minutesBack if provided
        if (args.daysBack && !args.minutesBack) {
          args.minutesBack = args.daysBack * 24 * 60; // Convert days to minutes
          OutputLogger.info(`šŸ“… Converted daysBack=${args.daysBack} to minutesBack=${args.minutesBack}`);
        }

        // Default minutesBack to 60
        if (!args.minutesBack && !args.startDateTime && !args.endDateTime) {
          args.minutesBack = 60;
        }

        OutputLogger.info(`šŸ“‹ Defaults applied - environment: ${args.environment}, logType: ${args.logType}, minutesBack: ${args.minutesBack}`);

        // Default structuredContent to true
        const structuredContent = args.structuredContent !== false;

        // DXP-114: Handle logType: 'all' for dual log type analysis
        if (args.logType === 'all') {
          return this.handleDualLogTypeAnalysis(args, structuredContent);
        }

        OutputLogger.info(`šŸ” Analyzing ${args.logType} logs from ${args.environment} (last ${args.minutesBack || 'custom'} minutes)`);

        // Resolve project configuration
        OutputLogger.info(`šŸ”‘ Resolving project configuration for project: ${args.projectName || 'default'}...`);
        const resolution = ProjectResolutionFix.resolveProjectSafely(args, ProjectTools as any);
        OutputLogger.info(`āœ… Project resolution complete: success=${resolution.success}`);

        if (!resolution.success) {
          if (resolution.requiresSelection) {
            return ProjectResolutionFix.showProjectSelection(resolution.availableProjects as any);
          }
          return ResponseBuilder.error(resolution.message || 'Failed to resolve project');
        }

        const projectName = resolution.project ? resolution.project.name : 'Unknown';
        const credentials = resolution.credentials || resolution.project;

        // Analyze single log type
        const result = await this.analyzeSingleLogType({
          logType: args.logType,
          environment: args.environment,
          credentials: credentials as any,
          timeFilter: {
            minutesBack: args.minutesBack,
            startDateTime: args.startDateTime,
            endDateTime: args.endDateTime
          },
          slot: args.slot,                     // DXP-116: Pass slot parameter to filter main/slot storage
          debug: args.debug,                   // DXP-118: Pass debug parameter
          timeoutSeconds: args.timeoutSeconds  // DXP-188: Pass timeout parameter
        });

        if (result.parsedLogs.length === 0) {
          // DXP-179: Pass debugInfo so users can troubleshoot why 0 logs returned
          return this.buildEmptyResponse(args.logType!, structuredContent, result.debugInfo);
        }

        // Build response
        return this.buildResponse({
          parsedLogs: result.parsedLogs,
          errorAnalysis: result.errorAnalysis,
          perfAnalysis: result.perfAnalysis,
          aiAnalysis: result.aiAnalysis,
          healthStatus: result.healthStatus,
          recommendations: result.recommendations,
          logType: args.logType!,
          environment: args.environment,
          projectName,
          structuredContent,
          debugInfo: result.debugInfo // DXP-118: Pass debug info
        });
      } catch (error: any) {
        OutputLogger.error(`Log analysis error: ${error}`);
        return ResponseBuilder.internalError('Failed to analyze logs', error.message);
      }
    }
  • TypeScript interface defining the input parameters for the analyze_logs_streaming tool, including time filters, log types, project selection, slot filtering, and debug options.
    interface AnalyzeLogsArgs {
      environment?: string;
      logType?: 'web' | 'application' | 'all';
      minutesBack?: number;
      daysBack?: number;            // DXP-179: Support daysBack parameter (converted to minutesBack)
      startDateTime?: string;
      endDateTime?: string;
      structuredContent?: boolean;
      projectName?: string;
      slot?: boolean;               // DXP-116: Filter main/slot storage
      debug?: boolean;              // DXP-118: Debug mode
      timeoutSeconds?: number;      // DXP-188: Configurable timeout
    }
  • Tool availability matrix entry registering 'analyze_logs_streaming' as available across hosting types (DXP PaaS/SaaS, self-hosted) in the Storage & Downloads category.
    'analyze_logs_streaming': {
      hostingTypes: ['dxp-paas', 'dxp-saas', 'self-hosted'],
      category: 'Storage & Downloads',
      description: 'Stream and analyze logs in-memory (2x faster, guaranteed structured output)'
    },
  • Key helper method implementing single-log-type analysis: container discovery, blob filtering, streaming, parsing, and analysis; called by the handler for web/application logs. (A hedged sketch of the percentile computation follows this list.)
    static async analyzeSingleLogType(params: SingleLogTypeParams): Promise<LogAnalysisResult> {
      const { logType, environment, credentials, timeFilter, slot, debug = false, timeoutSeconds } = params;

      OutputLogger.info(`šŸš€ Starting log analysis: ${logType} logs from ${environment}`);

      // DXP-188: Smart timeout based on time range
      // Default: 10 minutes for large ranges (>3 days), 5 minutes for smaller ranges
      let defaultTimeoutSeconds = 5 * 60; // 5 minutes default
      if (timeFilter.minutesBack && timeFilter.minutesBack > (3 * 24 * 60)) {
        defaultTimeoutSeconds = 10 * 60; // 10 minutes for > 3 days
      }
      const TIMEOUT_MS = (timeoutSeconds || defaultTimeoutSeconds) * 1000;
      OutputLogger.info(`ā±ļø Timeout set to ${TIMEOUT_MS / 1000} seconds`);

      const timeoutPromise = new Promise<never>((_, reject) => {
        setTimeout(() => reject(new Error(
          `Log analysis timed out after ${TIMEOUT_MS / 1000} seconds. Try reducing the time range or increase timeoutSeconds parameter.`
        )), TIMEOUT_MS);
      });

      return Promise.race([
        this._analyzeSingleLogTypeImpl({ logType, environment, credentials, timeFilter, slot, debug }),
        timeoutPromise
      ]);
    }

    /**
     * Implementation of analyzeSingleLogType (wrapped with timeout)
     * @private
     */
    static async _analyzeSingleLogTypeImpl(params: SingleLogTypeParams): Promise<LogAnalysisResult> {
      const { logType, environment, credentials, timeFilter, slot, debug = false } = params;

      // DXP-118: Collect debug info only if requested
      let debugInfo: DebugInfo | null = null;
      if (debug) {
        debugInfo = {
          containerName: null,
          availableContainers: null,
          sasUrlHost: null,
          sasUrlPath: null,
          firstBlobDates: [],
          lastBlobDates: [],
          totalBlobsBeforeFilter: 0,
          totalBlobsAfterFilter: 0
        };
      }

      // DXP-179: Dynamically discover container (match download_logs behavior)
      OutputLogger.info(`šŸ” Discovering storage containers for ${environment}...`);

      // List all available containers
      const containersResult = await StorageTools.handleListStorageContainers({
        apiKey: credentials.apiKey,
        apiSecret: credentials.apiSecret,
        projectId: credentials.projectId,
        environment
      });

      // Extract container names
      const containers = this.extractContainerList(containersResult);
      OutputLogger.info(`šŸ“¦ Found ${containers.length} available containers`);

      if (containers.length === 0) {
        throw new Error('No storage containers found for this environment');
      }

      // Match container by logType (same logic as download_logs)
      let containerName: string | undefined;
      const logTypeLower = logType.toLowerCase();

      if (logTypeLower === 'application') {
        // Try exact matches first
        containerName = containers.find(c => {
          const lowerC = c.toLowerCase();
          return lowerC === 'insights-logs-appserviceconsolelogs' || lowerC === 'azure-application-logs';
        }) || containers.find(c => {
          // Fallback to partial matches
          const lowerC = c.toLowerCase();
          return lowerC.includes('consolelog') || lowerC.includes('console') || lowerC.includes('application');
        });
      } else {
        // web/http
        // Try exact matches first
        containerName = containers.find(c => {
          const lowerC = c.toLowerCase();
          return lowerC === 'insights-logs-appservicehttplogs' || lowerC === 'azure-web-logs';
        }) || containers.find(c => {
          // Fallback to partial matches
          const lowerC = c.toLowerCase();
          return lowerC.includes('httplog') || lowerC.includes('http') || lowerC.includes('web');
        });
      }

      if (!containerName) {
        throw new Error(
          `No container found for logType="${logType}".\n` +
          `Available containers: ${containers.join(', ')}\n` +
          `Try specifying a different logType or check your environment configuration.`
        );
      }

      if (debugInfo) debugInfo.containerName = containerName;
      OutputLogger.info(`āœ… Matched container: ${containerName} (for logType: ${logType})`);

      // DXP-179 ENHANCED DEBUG: Log container discovery details
      OutputLogger.info(`šŸ” [DXP-179] Container discovery:`);
      OutputLogger.info(`  - Requested logType: ${logType}`);
      OutputLogger.info(`  - Matched container: ${containerName}`);
      OutputLogger.info(`  - Total available containers: ${containers.length}`);
      OutputLogger.info(`  - Available: ${containers.join(', ')}`);

      // DXP-116: Log slot filter status
      if (slot === true) {
        OutputLogger.info(`šŸŽÆ Requesting SLOT storage (deployment slot logs)`);
      } else if (slot === false) {
        OutputLogger.info(`šŸ“ Requesting MAIN storage (production logs, excluding slots)`);
      }

      // DXP-118: DEBUG - List ALL available containers first (only if debug=true)
      if (debug) {
        try {
          OutputLogger.info(`šŸ” [DXP-118 DEBUG] Listing ALL storage containers for ${environment}...`);
          const allContainers = await StorageTools.handleListStorageContainers({
            apiKey: credentials.apiKey,
            apiSecret: credentials.apiSecret,
            projectId: credentials.projectId,
            environment
          });
          debugInfo!.availableContainers = allContainers;
          OutputLogger.info(`šŸ” [DXP-118 DEBUG] Available containers: ${JSON.stringify(allContainers, null, 2)}`);
        } catch (debugError: any) {
          debugInfo!.availableContainers = `Error: ${debugError.message}`;
          OutputLogger.warn(`āš ļø [DXP-118 DEBUG] Failed to list containers: ${debugError.message}`);
        }
      }

      // Generate SAS URL for container
      OutputLogger.info(`šŸ” Generating SAS URL for container...`);
      const sasArgs = {
        apiKey: credentials.apiKey,
        apiSecret: credentials.apiSecret,
        projectId: credentials.projectId,
        environment,
        containerName,
        permissions: 'Read',
        expiryHours: 1,
        slot: slot // DXP-116: Pass slot parameter to storage tools
      };
      const sasResult = await StorageTools.generateStorageSasLink(sasArgs) as any;
      OutputLogger.info(`āœ… SAS URL generated successfully`);

      if (!sasResult || !sasResult.data || !sasResult.data.sasUrl) {
        throw new Error('Failed to generate SAS URL for log container');
      }

      const containerSasUrl = sasResult.data.sasUrl;

      // DXP-118: DEBUG - Decode SAS URL details (only if debug=true)
      if (debug && debugInfo) {
        try {
          const parsedSasUrl = new URL(containerSasUrl);
          debugInfo.sasUrlHost = parsedSasUrl.hostname;
          debugInfo.sasUrlPath = parsedSasUrl.pathname;
          OutputLogger.info(`šŸ” [DXP-118 DEBUG] Requested container: ${containerName}`);
          OutputLogger.info(`šŸ” [DXP-118 DEBUG] Got SAS URL host: ${parsedSasUrl.hostname}`);
          OutputLogger.info(`šŸ” [DXP-118 DEBUG] Got SAS URL path: ${parsedSasUrl.pathname}`);
        } catch (debugError: any) {
          OutputLogger.warn(`āš ļø [DXP-118 DEBUG] Failed to parse SAS URL: ${debugError.message}`);
        }
      }

      // List blobs in container
      OutputLogger.info('šŸ“‹ Listing log blobs...');
      OutputLogger.info(`šŸ” [DXP-179] About to list blobs from container: ${containerName}`);
      OutputLogger.info(`šŸ” [DXP-179] SAS URL hostname: ${new URL(containerSasUrl).hostname}`);
      let blobUrls = await AzureBlobStreamer.listBlobs(containerSasUrl);
      if (debugInfo) debugInfo.totalBlobsBeforeFilter = blobUrls.length;
      OutputLogger.info(`āœ… Found ${blobUrls.length} blobs BEFORE filtering`);

      // DXP-179 ENHANCED DEBUG: Show sample blob URLs
      if (blobUrls.length > 0) {
        OutputLogger.info(`šŸ” [DXP-179] Sample blob URLs (first 3):`);
        blobUrls.slice(0, 3).forEach((url: string, i: number) => {
          // Extract just the blob path (after container name)
          const pathMatch = url.match(/\/([^?]+)\?/);
          const blobPath = pathMatch ? pathMatch[1] : 'unknown';
          OutputLogger.info(`  ${i + 1}. ${blobPath}`);
        });
      }

      // DXP-179: Warn if no blobs found
      if (blobUrls.length === 0) {
        OutputLogger.warn(`āš ļø NO BLOBS FOUND in container: ${containerName}`);
        OutputLogger.warn(`  Possible causes:`);
        OutputLogger.warn(`  - Container is empty (no logs generated yet)`);
        OutputLogger.warn(`  - Wrong time range (logs might be older/newer)`);
        OutputLogger.warn(`  - Logs not being written to this container`);
        OutputLogger.warn(`\nšŸ’” Try: Use download_logs with logType="${logType}" to verify container has logs`);
      }

      // DXP-118: DEBUG - Sample blob timestamps (only if debug=true)
      if (debug && debugInfo && blobUrls.length > 0) {
        OutputLogger.info(`šŸ” [DXP-118 DEBUG] Sampling blob timestamps...`);

        // First 5 blobs
        const sampleBlobs = blobUrls.slice(0, 5);
        OutputLogger.info(`šŸ” [DXP-118 DEBUG] First 5 blob URLs:`);
        sampleBlobs.forEach((url: string, i: number) => {
          const match = url.match(/y=(\d{4})\/m=(\d{2})\/d=(\d{2})/);
          if (match) {
            const dateStr = `${match[1]}-${match[2]}-${match[3]}`;
            debugInfo!.firstBlobDates.push(dateStr);
            OutputLogger.info(`  ${i + 1}. Date: ${dateStr}`);
          } else {
            OutputLogger.info(`  ${i + 1}. No date pattern found in: ${url.substring(0, 150)}...`);
          }
        });

        // Last 5 blobs
        const lastBlobs = blobUrls.slice(-5);
        OutputLogger.info(`šŸ” [DXP-118 DEBUG] Last 5 blob URLs:`);
        lastBlobs.forEach((url: string, i: number) => {
          const match = url.match(/y=(\d{4})\/m=(\d{2})\/d=(\d{2})/);
          if (match) {
            const dateStr = `${match[1]}-${match[2]}-${match[3]}`;
            debugInfo!.lastBlobDates.push(dateStr);
            OutputLogger.info(`  ${i + 1}. Date: ${dateStr}`);
          } else {
            OutputLogger.info(`  ${i + 1}. No date pattern found in: ${url.substring(0, 150)}...`);
          }
        });
      } else if (debug && blobUrls.length === 0) {
        OutputLogger.warn(`āš ļø [DXP-118 DEBUG] No blobs found in container!`);
      }

      // DXP-116: Filter by slot parameter (main site vs deployment slot)
      if (slot !== undefined) {
        const beforeSlotFilter = blobUrls.length;
        blobUrls = blobUrls.filter((url: string) => {
          const nameUpper = url.toUpperCase();
          if (slot === true) {
            // slot=true: Only include deployment slot logs (/SLOTS/SLOT/)
            return nameUpper.includes('/SLOTS/SLOT/');
          } else if (slot === false) {
            // slot=false (default): Exclude ALL slot logs (any /SLOTS/ path)
            return !nameUpper.includes('/SLOTS/');
          }
          return true;
        });
        OutputLogger.info(`After slot filter (slot=${slot}): ${blobUrls.length} blobs (removed ${beforeSlotFilter - blobUrls.length})`);
      }

      // Filter blobs by date
      const beforeDateFilter = blobUrls.length;
      const filteredBlobs = AzureBlobStreamer.filterBlobsByDate(blobUrls, { ...timeFilter, debug }); // DXP-189: Pass debug flag
      if (debugInfo) debugInfo.totalBlobsAfterFilter = filteredBlobs.length;

      // DXP-179: Debug logging for date filter stage
      const removedByDateFilter = beforeDateFilter - filteredBlobs.length;
      OutputLogger.info(`After date filter: ${filteredBlobs.length} blobs (removed ${removedByDateFilter})`);
      if (filteredBlobs.length === 0 && beforeDateFilter > 0) {
        OutputLogger.warn(`āš ļø All ${beforeDateFilter} blobs filtered out by date range`);
        OutputLogger.warn(`  Time filter: ${JSON.stringify(timeFilter)}`);
        OutputLogger.warn(`šŸ’” Try: Expand the time range or check if logs exist for this period`);
      }

      // Stream and parse logs
      const parsedLogs: ParsedLog[] = [];
      let totalBytes = 0;
      let totalLines = 0;

      for (const blobUrl of filteredBlobs) {
        try {
          // DXP-179: Pass debug flag so parsing errors are logged
          const stats = await AzureBlobStreamer.streamBlob(blobUrl, async (line: string) => {
            const parsed = parseLogEntry(line, debug); // DXP-179: Pass debug to parser
            if (parsed) {
              parsedLogs.push(parsed);
            }
          }, { debug });
          totalBytes += stats.bytesDownloaded;
          totalLines += stats.linesProcessed;
        } catch (error: any) {
          OutputLogger.debug(`Skipping blob ${blobUrl}: ${error.message}`);
        }
      }

      OutputLogger.info(`āœ… Parsed ${parsedLogs.length} log entries from ${totalLines} lines (${Math.round(totalBytes / 1024)} KB)`);

      // Analyze logs
      const errorAnalysis = analyzeErrors(parsedLogs);
      const perfAnalysis = analyzePerformance(parsedLogs);
      const aiAnalysis = detectAIAgents(parsedLogs);
      const healthStatus = calculateHealthScore(errorAnalysis, parsedLogs.length);
      const recommendations = generateRecommendations(errorAnalysis, perfAnalysis, aiAnalysis);

      return {
        parsedLogs,
        errorAnalysis,
        perfAnalysis,
        aiAnalysis,
        healthStatus,
        recommendations,
        debugInfo // DXP-118: Include debug info for investigation
      };
    }
  • Timeout wrapper around the analysis implementation; the timeout is configurable and defaults to 5 or 10 minutes based on the time range.
    return Promise.race([
      this._analyzeSingleLogTypeImpl({ logType, environment, credentials, timeFilter, slot, debug }),
      timeoutPromise
    ]);
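
The analysis helpers (analyzeErrors, analyzePerformance, detectAIAgents, calculateHealthScore, generateRecommendations) are referenced in the implementation but not shown on this page. As an illustration only, p95/p99 response-time percentiles could be computed along these lines, assuming each parsed entry exposes a numeric timeTakenMs field (an assumption about ParsedLog, not confirmed here):

    // Illustrative sketch only: nearest-rank p95/p99 over parsed HTTP entries.
    // The timeTakenMs field is an assumed ParsedLog property, not the actual parser output.
    function percentile(sortedMs: number[], p: number): number | null {
      if (sortedMs.length === 0) return null;
      // Nearest-rank percentile on an ascending-sorted array
      const idx = Math.min(sortedMs.length - 1, Math.ceil((p / 100) * sortedMs.length) - 1);
      return sortedMs[idx];
    }

    function analyzePerformanceSketch(logs: Array<{ timeTakenMs?: number }>) {
      const times = logs
        .map(l => l.timeTakenMs)
        .filter((t): t is number => typeof t === 'number')
        .sort((a, b) => a - b);
      return { p95: percentile(times, 95), p99: percentile(times, 99) };
    }

One design note on the timeout wrapper above: Promise.race only rejects the returned promise when the timeout fires; as written, the underlying streaming work is not cancelled and may continue running in the background until it finishes.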
