cache.py
import logging
from typing import Any, Optional

from sensai.util.pickle import dump_pickle, load_pickle

log = logging.getLogger(__name__)


def load_cache(path: str, version: Any) -> Optional[Any]:
    """Load the object cached at `path`, returning None unless the stored version tag matches `version`."""
    data = load_pickle(path)
    if not isinstance(data, dict) or "__cache_version" not in data:
        log.info("Cache has no version tag (expected version %s). Ignoring cache at %s", version, path)
        return None
    saved_version = data["__cache_version"]
    if saved_version != version:
        log.info("Cache is outdated (expected version %s, got %s). Ignoring cache at %s", version, saved_version, path)
        return None
    return data["obj"]


def save_cache(path: str, version: Any, obj: Any) -> None:
    """Pickle `obj` to `path` together with a version tag for later validation by load_cache."""
    data = {"__cache_version": version, "obj": obj}
    dump_pickle(data, path)
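The versioned-cache pattern above can be exercised end to end. The sketch below is a self-contained reimplementation using only the standard library's `pickle` module, so it runs without sensai installed; the stdlib `pickle.dump`/`pickle.load` calls stand in for sensai's `dump_pickle`/`load_pickle`, and the file path is a hypothetical temporary location chosen for illustration.

```python
import os
import pickle
import tempfile
from typing import Any, Optional


def load_cache(path: str, version: Any) -> Optional[Any]:
    # Return the cached object only if the stored version tag matches.
    with open(path, "rb") as f:
        data = pickle.load(f)
    if not isinstance(data, dict) or "__cache_version" not in data:
        return None
    if data["__cache_version"] != version:
        return None
    return data["obj"]


def save_cache(path: str, version: Any, obj: Any) -> None:
    # Wrap the object in a dict carrying the version tag, then pickle it.
    with open(path, "wb") as f:
        pickle.dump({"__cache_version": version, "obj": obj}, f)


# Usage: bumping the version key invalidates any previously saved cache.
path = os.path.join(tempfile.mkdtemp(), "cache.pkl")
save_cache(path, version=1, obj=[1, 2, 3])
assert load_cache(path, version=1) == [1, 2, 3]  # matching version: cache hit
assert load_cache(path, version=2) is None       # version bump: cache ignored
```

Storing the version inside the pickled payload (rather than, say, in the filename) keeps invalidation a single equality check at load time.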

