
BuildAutomata Memory MCP Server

by brucepro
cache.py (665 B)
""" LRU Cache implementation for BuildAutomata Memory System Copyright 2025 Jurden Bruce """ from collections import OrderedDict class LRUCache(OrderedDict): """Simple LRU cache with max size""" def __init__(self, maxsize=1000): self.maxsize = maxsize super().__init__() def __setitem__(self, key, value): if key in self: self.move_to_end(key) super().__setitem__(key, value) if len(self) > self.maxsize: oldest = next(iter(self)) del self[oldest] def __getitem__(self, key): value = super().__getitem__(key) self.move_to_end(key) return value
