Glama
127,390 tools. Last updated 2026-05-05 15:32

"An in-memory data store for caching and message brokering" matching MCP tools:

  • Filters HAR data from Sauce Labs test jobs by category, domain, resource type, or status code with in-memory caching for instant subsequent queries.
  • Save agent messages to persistent memory for future recall with channel-based context separation.
    MIT
  • Store memory observations with tags, importance scores, and expiration settings to organize and retrieve information in the Novyx MCP server.
    MIT
  • Update text content in an agent's archival memory store. Use this tool to modify existing memory passages by providing the agent ID, memory ID, and new text content.
  • Store information like facts, preferences, or events in MemoVault's long-term memory system for persistence across sessions.
    MIT
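The tools above share a common pattern: an in-memory store keyed by an identifier, with optional expiration so stale entries drop out on their own. A minimal sketch of that pattern in Python (the class and method names here are illustrative assumptions, not the API of any listed MCP server):

```python
import time

class TTLCache:
    """Minimal in-memory key-value cache with per-entry expiration.

    Hypothetical sketch of the caching pattern described above;
    not the interface of any listed server.
    """

    def __init__(self):
        self._store = {}  # key -> (value, expires_at or None)

    def set(self, key, value, ttl=None):
        # ttl is seconds until expiry; None means the entry never expires.
        expires_at = time.monotonic() + ttl if ttl is not None else None
        self._store[key] = (value, expires_at)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if expires_at is not None and time.monotonic() >= expires_at:
            del self._store[key]  # lazily evict expired entries on read
            return default
        return value

cache = TTLCache()
cache.set("session:42", {"decision": "use SQLite"}, ttl=60)
print(cache.get("session:42"))
```

Lazy eviction on read keeps the sketch simple; a production store would also sweep expired entries in the background.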

Matching MCP Servers

Matching MCP Connectors

  • App Store and Google Play downloads and charts over time; Android lookups use the bundle ID. Free API key at trendsmcp.ai

  • AI memory with 56 tools. Knowledge Graph, semantic search, OAuth 2.1 + Magic Link. Free tier.

  • Store AI session summaries in persistent memory to retain key decisions, patterns, and solutions learned during coding sessions.
    MIT
  • Store a memory entry for a session with automatic embeddings and graph extraction.
    MIT
  • Store a fact in persistent memory by providing a key and value. Retrieve it later using the key to access information across sessions.
    MIT
  • Perform targeted web research, generate intelligent suggestions, and store findings in memory for efficient task management. Combines web research with local knowledge caching to enhance research workflow.
    MIT
  • Store information in persistent memory for multi-agent collaboration, enabling key-value data storage with optional expiration and type categorization.
    MIT
  • Store and organize information with structured titles and content using a SQLite-based memory storage system. Ideal for managing and retrieving data efficiently.
    MIT
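The last entry describes SQLite-backed storage of structured title/content records. A hedged sketch of what such a store might look like, using Python's standard sqlite3 module (the table name, column names, and helper functions are assumptions for illustration, not the listed server's actual schema):

```python
import sqlite3

# In-memory database for the sketch; a real server would use a file path.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS memories ("
    "  id INTEGER PRIMARY KEY,"
    "  title TEXT NOT NULL,"
    "  content TEXT NOT NULL)"
)

def store_memory(title, content):
    """Insert a titled memory record and return its row id."""
    cur = conn.execute(
        "INSERT INTO memories (title, content) VALUES (?, ?)", (title, content)
    )
    conn.commit()
    return cur.lastrowid

def find_memories(term):
    """Substring search over titles; real servers may use FTS or embeddings."""
    rows = conn.execute(
        "SELECT title, content FROM memories WHERE title LIKE ?", (f"%{term}%",)
    )
    return rows.fetchall()

store_memory("build setup", "Use uv for dependency management.")
print(find_memories("build"))
```

Parameterized queries (the `?` placeholders) keep even a toy store safe from SQL injection when titles and content come from an agent.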