meMCP - Memory-Enhanced Model Context Protocol

memory_stream_query

Query persistent memory to retrieve stored knowledge and context across LLM sessions, enabling continuous learning through the Memory-Enhanced Model Context Protocol.

Input Schema

Name        Required  Description                          Default
query       No        Search query text                    (none)
type        No        Filter by fact type                  (none)
domain      No        Filter by domain                     (none)
chunkSize   No        Number of facts per chunk (1-100)    10
maxResults  No        Maximum total results (1-10000)      1000
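
All parameters are optional and can be combined freely. Below is a minimal sketch of the JSON-RPC tools/call payload a client might send to start a streaming query; the query and domain values are invented for illustration.

    // Illustrative MCP tools/call request; every argument is optional.
    const request = {
      jsonrpc: '2.0',
      id: 1,
      method: 'tools/call',
      params: {
        name: 'memory_stream_query',
        arguments: {
          query: 'error handling patterns', // hypothetical search text
          domain: 'javascript',             // hypothetical domain filter
          chunkSize: 25,                    // facts per chunk (1-100, default 10)
          maxResults: 500,                  // total cap (1-10000, default 1000)
        },
      },
    };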

Implementation Reference

  • The handler function that executes the 'memory_stream_query' tool logic. It invokes StreamingManager.createBatchStream to handle large queries and returns the stream ID in a JSON text payload (a client-side parsing sketch follows this list).
    async handleStreamQuery(args) {
      try {
        const streamId = await this.streamingManager.createBatchStream(args, this.factStore, {
          chunkSize: args.chunkSize || 10,
          maxResults: args.maxResults || 1000,
        });

        return {
          content: [
            {
              type: 'text',
              text: JSON.stringify({
                success: true,
                streamId,
                message: 'Streaming query started',
                chunkSize: args.chunkSize || 10,
                maxResults: args.maxResults || 1000,
              }),
            },
          ],
        };
      } catch (error) {
        return {
          content: [
            {
              type: 'text',
              text: JSON.stringify({
                success: false,
                error: error.message,
              }),
            },
          ],
          isError: true,
        };
      }
    }
  • Registers the 'memory_stream_query' MCP tool with the server, including description, input schema, and reference to the handler function.
    registerStreamQueryTool(server) {
      server.registerTool(
        'memory_stream_query',
        'Start a streaming query for large result sets',
        {
          type: 'object',
          properties: {
            query: { type: 'string', description: 'Search query text' },
            type: { type: 'string', description: 'Filter by fact type' },
            domain: { type: 'string', description: 'Filter by domain' },
            chunkSize: {
              type: 'integer',
              description: 'Number of facts per chunk (default: 10)',
              minimum: 1,
              maximum: 100,
            },
            maxResults: {
              type: 'integer',
              description: 'Maximum total results (default: 1000)',
              minimum: 1,
              maximum: 10000,
            },
          },
        },
        async (args) => {
          return await this.handleStreamQuery(args);
        }
      );
    }
  • Input schema (JSON Schema) for the 'memory_stream_query' tool defining parameters like query, filters, chunkSize, and maxResults.
    {
      type: 'object',
      properties: {
        query: { type: 'string', description: 'Search query text' },
        type: { type: 'string', description: 'Filter by fact type' },
        domain: { type: 'string', description: 'Filter by domain' },
        chunkSize: {
          type: 'integer',
          description: 'Number of facts per chunk (default: 10)',
          minimum: 1,
          maximum: 100,
        },
        maxResults: {
          type: 'integer',
          description: 'Maximum total results (default: 1000)',
          minimum: 1,
          maximum: 10000,
        },
      },
    }
  • Supporting utility in StreamingManager that queries the factStore in batches for large result sets and then initializes the stream; it is called by the tool handler (a stub of the expected factStore interface follows this list).
    async createBatchStream(queryParams, factStore, options = {}) {
      const batchSize = options.batchSize || 100;
      const maxResults = options.maxResults || 1000;

      // Execute query in batches
      let allFacts = [];
      let offset = 0;
      let hasMore = true;

      while (hasMore && allFacts.length < maxResults) {
        const batchParams = {
          ...queryParams,
          limit: Math.min(batchSize, maxResults - allFacts.length),
          offset,
        };

        const result = await factStore.queryFacts(batchParams);
        const facts = result.facts || [];

        if (facts.length === 0) {
          hasMore = false;
        } else {
          allFacts = allFacts.concat(facts);
          offset += facts.length;
          if (facts.length < batchSize) {
            hasMore = false;
          }
        }
      }

      return await this.createStream(allFacts, {
        ...options,
        query: queryParams.query || '',
        type: queryParams.type || 'all',
        domain: queryParams.domain || 'all',
      });
    }
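
Because handleStreamQuery wraps its result in a single text content item, a caller has to JSON-parse that text to recover the stream ID. A minimal sketch, assuming result is the raw value returned from a tools/call to memory_stream_query; how the stream is consumed afterwards is outside the scope of this snippet.

    // Extract the stream ID from the handler response shown above.
    function extractStreamId(result) {
      const payload = JSON.parse(result.content[0].text);
      if (!payload.success) {
        throw new Error(`memory_stream_query failed: ${payload.error}`);
      }
      return payload.streamId;
    }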
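
createBatchStream only assumes that factStore.queryFacts accepts limit/offset paging alongside the query filters and resolves to an object with a facts array. The stub below is a hypothetical in-memory stand-in that satisfies that contract, which can be handy for exercising the batching loop in tests; none of it is part of meMCP itself.

    // Hypothetical in-memory stand-in for the real FactStore; it implements only
    // the queryFacts contract that createBatchStream depends on.
    class InMemoryFactStore {
      constructor(facts) {
        this.facts = facts; // e.g. [{ content: '...', type: 'note', domain: 'js' }]
      }

      async queryFacts({ query, type, domain, limit = 100, offset = 0 }) {
        let matches = this.facts;
        if (type) matches = matches.filter((f) => f.type === type);
        if (domain) matches = matches.filter((f) => f.domain === domain);
        if (query) matches = matches.filter((f) => f.content.includes(query));
        // Return one page; an empty `facts` array signals the end of results.
        return { facts: matches.slice(offset, offset + limit) };
      }
    }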

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/mixelpixx/meMCP'
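
The same endpoint can be called from JavaScript with the built-in fetch API; a small sketch, assuming the listing is publicly readable without authentication.

    // Fetch this server's directory entry and print it as formatted JSON.
    // Run inside an ES module or async function (Node 18+ or a browser).
    const res = await fetch('https://glama.ai/api/mcp/v1/servers/mixelpixx/meMCP');
    if (!res.ok) throw new Error(`Directory API returned HTTP ${res.status}`);
    console.log(JSON.stringify(await res.json(), null, 2));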

If you have feedback or need assistance with the MCP directory API, please join our Discord server.