
sage_turn

Manages conversation memory by recalling relevant context and storing current observations to build persistent episodic experience across sessions.

Instructions

Per-conversation-turn memory cycle. Call this EVERY turn. It does two things atomically: (1) Recalls consensus-committed memories relevant to the current topic (so you have context), and (2) Stores an observation about what just happened in this turn (so future-you has context). This builds episodic experience turn-by-turn, like human memory — not a context window dump. Domains are dynamic: create whatever domain fits the conversation (e.g. 'quantum-physics', 'go-debugging', 'user-project-x'). You decide what's relevant to recall based on the conversation context.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| domain | No | Knowledge domain — create dynamically based on the topic (e.g. 'rust-async', 'user-preferences', 'sage-architecture'). Don't reuse 'general' when a specific domain fits better. | |
| observation | No | What happened this turn — the user's request and key points of your response. Keep it concise but capture the essential insight. | |
| topic | Yes | What the current conversation is about — used for contextual recall. | |
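Under this schema, a single turn's call might look like the sketch below. The values are illustrative only, not taken from the source; per the schema, only `topic` is required, while `domain` and `observation` are optional.

```json
{
  "topic": "debugging a deadlock in a Go worker pool",
  "domain": "go-debugging",
  "observation": "User's workers blocked on an unbuffered channel; suggested a buffered results channel plus a sync.WaitGroup to drain it."
}
```

Supplying `observation` stores the turn for future recall; the `topic` drives which committed memories are recalled back into context.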


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/l33tdawg/s-age'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.