# memory_stats
Troubleshoot missing memory search results or assess the local data footprint by retrieving entry count, size, distinct tags, embedding index status, and last-write timestamp from the local memory database.
## Instructions
Return statistics about the local memory database (entry count, size, tags, embedding state).
Returns counts (rows, distinct tags, total bytes), embedding index status (built / building / stale), and last-write timestamp.
USE WHEN: troubleshooting why memory_semantic_search returns no results, or sizing the user's local data footprint.
BEHAVIOR: pure read of metadata. No side effects.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| _No arguments_ | | | |
## Output Schema
| Name | Required | Description | Default |
|---|---|---|---|
| result | Yes | JSON string of memory statistics (entry counts, database sizes, data directory, embedding-model state) | |
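The `result` value is a JSON string produced with `json.dumps`. A sketch of consuming it on the client side; the payload below is illustrative, with field names taken from the handler and `MemoryStore.stats()` code in the Implementation Reference:

```python
import json

# Illustrative result string; the fields mirror what the handler returns.
result = json.dumps({
    "hot_entries": 12,
    "warm_entries": 340,
    "cold_windows": 5,
    "warm_db_bytes": 204800,
    "cold_db_bytes": 51200,
    "data_dir": "/home/user/.contextpulse",
    "embedding_model_loaded": True,
    "warm_db_kb": 200.0,
    "cold_db_kb": 50.0,
})

stats = json.loads(result)
if not stats["embedding_model_loaded"]:
    # The likely culprit when memory_semantic_search returns nothing.
    print("semantic search will fail: embedding model not loaded")
print(f"{stats['warm_entries']} warm entries in {stats['warm_db_kb']} KB")
```

Checking `embedding_model_loaded` first is the quickest way to rule out the most common cause of empty semantic-search results.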
## Implementation Reference
- Actual handler for the `memory_stats` tool. Gets a `MemoryStore` instance, collects tier stats, adds embedding-model availability, adds human-friendly KB sizes, and returns JSON.

```python
@mcp_app.tool()
@_require_starter
def memory_stats() -> str:
    """Return storage statistics for the memory system.

    Reports entry counts per tier, database sizes, data directory path,
    and whether the semantic embedding model is loaded.
    """
    store = _get_store()
    stats = store.stats()

    # Add embedding engine availability
    try:
        from contextpulse_memory.embeddings import get_engine

        engine = get_engine()
        stats["embedding_model_loaded"] = engine.is_available()
    except Exception:
        stats["embedding_model_loaded"] = False

    # Human-friendly size fields
    stats["warm_db_kb"] = round(stats["warm_db_bytes"] / 1024, 1)
    stats["cold_db_kb"] = round(stats["cold_db_bytes"] / 1024, 1)

    return json.dumps(stats, default=str)
```

- Tool registration via the `@mcp_app.tool()` decorator. Uses the `@_require_starter` license gate (free tier).

```python
@mcp_app.tool()
@_require_starter
def memory_stats() -> str:
```

- `MemoryStore.stats()` helper called by the handler. Returns per-tier entry counts and database file sizes.

```python
def stats(self) -> dict[str, Any]:
    """Return storage statistics across all three tiers."""
    warm_db = self._data_dir / "memory.db"
    cold_db = self._data_dir / "memory_cold.db"
    return {
        "hot_entries": len(self.hot),
        "warm_entries": self.warm.count(),
        "cold_windows": self.cold.count(),
        "warm_db_bytes": warm_db.stat().st_size if warm_db.exists() else 0,
        "cold_db_bytes": cold_db.stat().st_size if cold_db.exists() else 0,
        "data_dir": str(self._data_dir),
    }
```

- glama/server.py:790-802 (registration): Glama.ai registry stub registration of `memory_stats`. Returns a message telling users to install the local daemon.

```python
@mcp_app.tool()
def memory_stats() -> str:
    """Return statistics about the local memory database (entry count, size, tags, embedding state).

    Returns counts (rows, distinct tags, total bytes), embedding index status
    (built / building / stale), and last-write timestamp.

    USE WHEN: troubleshooting why memory_semantic_search returns no results,
    or sizing the user's local data footprint.

    BEHAVIOR: pure read of metadata. No side effects.
    """
    return _LOCAL_ONLY_MSG
```
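The size fields come from plain `Path.stat()` calls, with a missing database reported as 0 bytes. A minimal standalone sketch of that computation, using a temporary directory in place of a real `MemoryStore` data dir (the `db_sizes` helper is hypothetical, written here to mirror the logic above):

```python
import tempfile
from pathlib import Path


def db_sizes(data_dir: Path) -> dict[str, float]:
    """Mirror the byte/KB size logic from MemoryStore.stats() and the handler."""
    warm_db = data_dir / "memory.db"
    cold_db = data_dir / "memory_cold.db"
    sizes = {
        "warm_db_bytes": warm_db.stat().st_size if warm_db.exists() else 0,
        "cold_db_bytes": cold_db.stat().st_size if cold_db.exists() else 0,
    }
    sizes["warm_db_kb"] = round(sizes["warm_db_bytes"] / 1024, 1)
    sizes["cold_db_kb"] = round(sizes["cold_db_bytes"] / 1024, 1)
    return sizes


with tempfile.TemporaryDirectory() as tmp:
    d = Path(tmp)
    (d / "memory.db").write_bytes(b"x" * 2048)  # 2 KB warm DB
    # cold DB intentionally absent -> reported as 0
    print(db_sizes(d))  # → {'warm_db_bytes': 2048, 'cold_db_bytes': 0, 'warm_db_kb': 2.0, 'cold_db_kb': 0.0}
```

Guarding each `stat()` call with `exists()` is what keeps the tool a pure read with no side effects: it never creates the database files it measures.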