memory_stream_status
Check the status of a streaming query in the MCP server's memory streaming layer. Given a stream ID, the tool reports the stream's progress, metadata, and timing, supporting the server's persistent memory and knowledge retention for LLMs across sessions.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| streamId | Yes | The stream ID of the streaming query to check | None |
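As a concrete illustration, the arguments object for this tool carries only the stream ID. The value below is hypothetical; a real ID is produced by whichever tool starts the streaming query.

```js
// Hypothetical arguments for memory_stream_status.
// The streamId value is illustrative; a real ID is obtained when a streaming
// query is started elsewhere in the server.
const args = {
  streamId: 'stream-1715190000000',
};
```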
Implementation Reference
- The handler function that executes the `memory_stream_status` tool. It calls `StreamingManager.getStreamStatus` with the supplied `streamId` and wraps the result in the standard tool response format; if the lookup fails, the error message is returned as an error payload.

  ```js
  async handleStreamStatus(args) {
    try {
      const status = await this.streamingManager.getStreamStatus(args.streamId);
      return {
        content: [
          {
            type: 'text',
            text: JSON.stringify({
              success: true,
              ...status,
            }),
          },
        ],
      };
    } catch (error) {
      return {
        content: [
          {
            type: 'text',
            text: JSON.stringify({
              success: false,
              error: error.message,
            }),
          },
        ],
        isError: true,
      };
    }
  }
  ```
- src/tools/modules/MemoryStreamingTools.js:77-95 (registration). Registers the `memory_stream_status` tool with the MCP server, providing the tool name, description, input schema, and a reference to the handler function.

  ```js
  registerStreamStatusTool(server) {
    server.registerTool(
      'memory_stream_status',
      'Get the status of a streaming query',
      {
        type: 'object',
        properties: {
          streamId: {
            type: 'string',
            description: 'The stream ID',
          },
        },
        required: ['streamId'],
      },
      async (args) => {
        return await this.handleStreamStatus(args);
      }
    );
  }
  ```
- Defines the input schema for the tool, specifying that a `streamId` string is required.

  ```js
  {
    type: 'object',
    properties: {
      streamId: {
        type: 'string',
        description: 'The stream ID',
      },
    },
    required: ['streamId'],
  }
  ```
- Supporting method on `StreamingManager` that builds the detailed status information for a stream; the tool handler calls it to produce its response (see the worked example after this list).

  ```js
  async getStreamStatus(streamId) {
    const stream = this.activeStreams.get(streamId);
    if (!stream) {
      throw new Error(`Stream ${streamId} not found`);
    }

    return {
      id: streamId,
      status: stream.status,
      progress: {
        current: stream.currentIndex,
        total: stream.totalFacts,
        percentage: Math.round((stream.currentIndex / stream.totalFacts) * 100),
      },
      metadata: stream.metadata,
      startTime: stream.startTime,
      endTime: stream.endTime || null,
      duration: stream.endTime
        ? stream.endTime - stream.startTime
        : Date.now() - stream.startTime,
    };
  }
  ```
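To make the return shape concrete, the sketch below shows an illustrative stream record shaped after the fields `getStreamStatus` reads, together with the status object it would produce and how `handleStreamStatus` wraps it. The record values and the stream ID are made up for illustration; only the field names come from the code above.

```js
// Illustrative stream record, shaped after the fields getStreamStatus reads
// (status, currentIndex, totalFacts, metadata, startTime, endTime).
// All values are made up for this example.
const exampleStream = {
  status: 'running',
  currentIndex: 250,
  totalFacts: 1000,
  metadata: { source: 'example' },
  startTime: 1715190000000, // millisecond timestamp, comparable with Date.now()
  endTime: null,            // stream still active, so duration is measured against Date.now()
};

// For this record, getStreamStatus('stream-1715190000000') would return:
// {
//   id: 'stream-1715190000000',
//   status: 'running',
//   progress: { current: 250, total: 1000, percentage: 25 },
//   metadata: { source: 'example' },
//   startTime: 1715190000000,
//   endTime: null,
//   duration: Date.now() - 1715190000000,
// }
//
// handleStreamStatus then serializes this object, plus success: true, into the
// text content of the tool response. If the ID is unknown, the thrown
// "Stream ... not found" error becomes a success: false payload with isError: true.
```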