
get-console-logs

Retrieve console logs from the Vite development server to monitor application output, debug issues, and track real-time updates during development.

Instructions

Retrieves console logs from the development server

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| checkpoint | No | If specified, returns only logs recorded at this checkpoint | (none) |
| limit | No | Number of logs to return, starting from the most recent log | 100 |
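
For example, an MCP client can invoke the tool through a standard tools/call request. The sketch below uses the MCP TypeScript SDK client; the launch command, client name, and limit value are illustrative placeholders rather than values documented by this server.

```typescript
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Placeholder launch command: point this at however you start the server.
const transport = new StdioClientTransport({ command: 'node', args: ['path/to/server.js'] });
const client = new Client({ name: 'example-client', version: '1.0.0' }, { capabilities: {} });
await client.connect(transport);

// Fetch the 50 most recent logs from the default stream.
// Pass `checkpoint: '<id>'` instead to read logs recorded under a specific checkpoint.
const result = await client.callTool({
  name: 'get-console-logs',
  arguments: { limit: 50 }
});
console.log(result.content);
```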

Implementation Reference

  • Handler for the 'get-console-logs' MCP tool, registered on the MCP server inside registerBrowserTools. It retrieves recent console logs via LogManager.readLogs (optionally filtered by checkpoint), parses each line as JSON, and returns the result together with write-position metadata; the shape of the returned payload is sketched after this list.
```typescript
server.tool(
  'get-console-logs',
  'Retrieves console logs from the development server',
  {
    checkpoint: z.string().optional().describe('If specified, returns only logs recorded at this checkpoint'),
    limit: z.number().optional().describe('Number of logs to return, starting from the most recent log')
  },
  async ({ checkpoint, limit = 100 }) => {
    try {
      // Read logs (always provide limit value)
      const result = await logManager.readLogs(limit, checkpoint);

      // Parse logs
      const parsedLogs = result.logs.map((log: string) => {
        try {
          return JSON.parse(log);
        } catch (error) {
          return { type: 'unknown', text: log, timestamp: new Date().toISOString() };
        }
      });

      return {
        content: [
          {
            type: 'text',
            text: JSON.stringify({
              logs: parsedLogs,
              writePosition: result.writePosition,
              totalLogs: result.totalLogs
            }, null, 2)
          }
        ]
      };
    } catch (error) {
      Logger.error(`Failed to read console logs: ${error}`);
      return {
        content: [
          { type: 'text', text: `Failed to read console logs: ${error}` }
        ],
        isError: true
      };
    }
  }
);
```
  • Input schema validation for the 'get-console-logs' tool using Zod, defining optional checkpoint and limit parameters.
```typescript
{
  checkpoint: z.string().optional().describe('If specified, returns only logs recorded at this checkpoint'),
  limit: z.number().optional().describe('Number of logs to return, starting from the most recent log')
}
```
  • Core helper method LogManager.readLogs that implements the log file reading logic: it finds rotated log files, calculates read positions, streams the most recent lines, and returns them with metadata. Its position arithmetic is illustrated in the sketch after this list.
```typescript
public async readLogs(limit: number, checkpointId?: string): Promise<{ logs: string[], writePosition: number, totalLogs: number }> {
  try {
    // 1. Calculate necessary information
    const logDir = path.dirname(this.getLogFilePath(0, checkpointId));
    const filePattern = checkpointId
      ? new RegExp(`^chk-${checkpointId}-(\\d+)\\.log$`)
      : /^default-log-(\d+)\.log$/;

    // 2. Find log files in directory
    const files = fs.existsSync(logDir) ? fs.readdirSync(logDir) : [];
    const logFiles = files
      .filter(file => filePattern.test(file))
      .map(file => {
        const match = file.match(filePattern);
        return {
          file,
          path: path.join(logDir, file),
          number: match ? parseInt(match[1], 10) : -1
        };
      })
      .filter(item => item.number >= 0)
      .sort((a, b) => a.number - b.number); // Sort in order (oldest first)

    if (logFiles.length === 0) {
      return { logs: [], writePosition: 0, totalLogs: 0 };
    }

    // 3. Calculate total number of logs (completed files + current file log count)
    const lastFileIndex = logFiles.length - 1;
    const completedFilesLogs = lastFileIndex * this.MAX_LOGS_PER_FILE;

    // Get log count of the last file
    let currentFileLogCount = 0;
    if (checkpointId) {
      const checkpointData = this.checkpointStreams.get(checkpointId);
      currentFileLogCount = checkpointData ? checkpointData.currentLogCount : 0;
    } else {
      currentFileLogCount = this.currentLogCount;
    }

    const totalLogs = completedFilesLogs + currentFileLogCount;

    // 4. Return empty result if no logs needed
    if (totalLogs === 0) {
      return { logs: [], writePosition: currentFileLogCount, totalLogs: 0 };
    }

    // 5. Calculate start position and number of logs to read
    const startPosition = Math.max(0, totalLogs - limit);
    const startFileIndex = Math.floor(startPosition / this.MAX_LOGS_PER_FILE);
    const startLogInFile = startPosition % this.MAX_LOGS_PER_FILE;

    // 6. Read log files (using stream)
    const logs: string[] = [];
    let logsNeeded = Math.min(limit, totalLogs);

    for (let i = startFileIndex; i < logFiles.length && logsNeeded > 0; i++) {
      const filePath = logFiles[i].path;
      if (!fs.existsSync(filePath)) continue;

      // Read line by line using readline interface
      const rl = readline.createInterface({
        input: fs.createReadStream(filePath, { encoding: 'utf-8' }),
        crlfDelay: Infinity
      });

      let skippedLines = 0;
      // Skip lines if this is the first file and has a start position
      const shouldSkipLines = (i === startFileIndex && startLogInFile > 0);
      const linesToSkip = shouldSkipLines ? startLogInFile : 0;

      for await (const line of rl) {
        if (!line.trim()) continue;

        // Skip necessary lines
        if (shouldSkipLines && skippedLines < linesToSkip) {
          skippedLines++;
          continue;
        }

        logs.push(line);
        logsNeeded--;

        if (logsNeeded <= 0) {
          rl.close();
          break;
        }
      }
    }

    // 7. Return result
    return { logs, writePosition: currentFileLogCount, totalLogs };
  } catch (error) {
    Logger.error(`Failed to read logs: ${error}`);
    return { logs: [], writePosition: 0, totalLogs: 0 };
  }
}
```
  • Helper function to append browser console logs to the LogManager, formatting them as JSON with type, text, timestamp.
```typescript
async function appendLogToFile(type: string, text: string) {
  try {
    const logEntry = JSON.stringify({
      type,
      text,
      timestamp: new Date().toISOString(),
      url: 'unknown',
      checkpointId: null
    }) + '\n';

    // Record log
    await logManager.appendLog(logEntry);
  } catch (error) {
    Logger.error(`Failed to write console log to file: ${error}`);
  }
}
```
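
Tying the handler and appendLogToFile together: the tool returns a single text content item whose text is a pretty-printed JSON document, and each entry in its logs array is one NDJSON line written by appendLogToFile (lines that fail JSON.parse fall back to a minimal unknown entry). The interfaces below are a rough sketch inferred from the code above, not types exported by the project, and the sample values are illustrative.

```typescript
// One parsed console log entry, matching the fields appendLogToFile serializes.
interface ConsoleLogEntry {
  type: string;                  // e.g. 'log', 'warn', 'error', or 'unknown' for unparseable lines
  text: string;
  timestamp: string;             // ISO 8601
  url?: string;                  // 'unknown' when the originating page is not known
  checkpointId?: string | null;
}

// Shape of the JSON document returned in the tool's text content.
interface GetConsoleLogsResult {
  logs: ConsoleLogEntry[];
  writePosition: number;         // number of logs in the file currently being written
  totalLogs: number;             // logs across all rotated files for this stream
}

// Example of one raw NDJSON line on disk (illustrative values):
const exampleLine =
  '{"type":"error","text":"Uncaught TypeError: x is undefined","timestamp":"2024-01-01T00:00:00.000Z","url":"unknown","checkpointId":null}';
const parsed: ConsoleLogEntry = JSON.parse(exampleLine);
```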

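To make the pagination in readLogs concrete, here is a standalone rerun of its position arithmetic. The rotation size and counts below are made-up illustration values, not the project's actual MAX_LOGS_PER_FILE or log volume.

```typescript
// Illustrative only: recomputes readLogs' start position for a hypothetical default log stream.
const MAX_LOGS_PER_FILE = 1000;   // assumed rotation size, not the project's real constant
const completedFiles = 3;         // default-log-0.log .. default-log-2.log are full
const currentFileLogCount = 250;  // lines already written to default-log-3.log

const totalLogs = completedFiles * MAX_LOGS_PER_FILE + currentFileLogCount; // 3250
const limit = 500;

const startPosition = Math.max(0, totalLogs - limit);                  // 2750
const startFileIndex = Math.floor(startPosition / MAX_LOGS_PER_FILE);  // 2, i.e. default-log-2.log
const startLogInFile = startPosition % MAX_LOGS_PER_FILE;              // skip the first 750 lines there

console.log({ totalLogs, startPosition, startFileIndex, startLogInFile });
// Reading then continues into default-log-3.log until 500 lines have been collected,
// and the tool reports writePosition = 250 (the current file's log count).
```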