Glama

configure_memory_optimization

Adjust memory optimization settings on Mode Manager MCP to enable auto-optimization based on file size, entry count, or time thresholds for improved performance.

Instructions

Configure memory optimization settings for auto-optimization behavior.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `auto_optimize` | No | Enable or disable automatic optimization. | `None` |
| `entry_threshold` | No | Number of new entries that triggers optimization. | `None` |
| `memory_file` | No | Path to a specific memory file. If not provided, the user's main memory file is configured. | `None` |
| `size_threshold` | No | File size threshold in bytes for triggering optimization. | `None` |
| `time_threshold_days` | No | Number of days between optimizations. | `None` |

Implementation Reference

  • The main handler function for the 'configure_memory_optimization' tool. It updates the frontmatter of the specified (or default) memory file with new optimization settings like auto_optimize, thresholds, etc., creates a backup, and returns a confirmation message.
```python
def configure_memory_optimization(
    memory_file: Annotated[Optional[str], "Path to memory file to configure"] = None,
    auto_optimize: Annotated[Optional[bool], "Enable/disable auto-optimization"] = None,
    size_threshold: Annotated[Optional[int], "Size threshold in bytes"] = None,
    entry_threshold: Annotated[Optional[int], "Entry count threshold"] = None,
    time_threshold_days: Annotated[Optional[int], "Time threshold in days"] = None,
) -> str:
    """Configure memory optimization settings."""
    if read_only:
        return "Error: Server is running in read-only mode"
    try:
        # Determine which file to configure
        if memory_file:
            file_path = Path(memory_file)
            if not file_path.exists():
                return f"Error: Memory file not found: {memory_file}"
        else:
            # Use default user memory file
            user_memory_path = instruction_manager.get_memory_file_path()
            if not user_memory_path.exists():
                return "Error: No user memory file found to configure"
            file_path = user_memory_path

        # Read current frontmatter
        from ..simple_file_ops import parse_frontmatter_file, write_frontmatter_file

        frontmatter, content = parse_frontmatter_file(file_path)

        # Update settings
        updated_settings = []
        if auto_optimize is not None:
            frontmatter["autoOptimize"] = auto_optimize
            updated_settings.append(f"auto_optimize: {auto_optimize}")
        if size_threshold is not None:
            frontmatter["sizeThreshold"] = size_threshold
            updated_settings.append(f"size_threshold: {size_threshold:,} bytes")
        if entry_threshold is not None:
            frontmatter["entryThreshold"] = entry_threshold
            updated_settings.append(f"entry_threshold: {entry_threshold}")
        if time_threshold_days is not None:
            frontmatter["timeThreshold"] = time_threshold_days
            updated_settings.append(f"time_threshold: {time_threshold_days} days")

        if not updated_settings:
            return (
                "No settings provided to update. "
                "Available options: auto_optimize, size_threshold, entry_threshold, time_threshold_days"
            )

        # Write updated frontmatter, keeping a backup of the previous file
        success = write_frontmatter_file(file_path, frontmatter, content, create_backup=True)
        if success:
            message = "✅ Memory optimization settings updated:\n"
            for setting in updated_settings:
                message += f"• {setting}\n"
            message += "\n💾 Backup created for safety"
            return message
        else:
            return "❌ Failed to update memory optimization settings"
    except Exception as e:
        return f"Error configuring memory optimization: {str(e)}"
```
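The `parse_frontmatter_file` and `write_frontmatter_file` helpers are imported from the server's `simple_file_ops` module and are not shown on this page. A minimal sketch of the behavior the handler relies on might look like the following, assuming `---`-delimited frontmatter with simple `key: value` pairs and a `.bak` copy for the backup (the real module may differ, e.g. by using a YAML parser):

```python
import shutil
from pathlib import Path


def parse_frontmatter_file(path: Path) -> tuple[dict, str]:
    """Split a file into a frontmatter dict and the remaining body text (sketch)."""
    text = path.read_text(encoding="utf-8")
    if not text.startswith("---\n"):
        return {}, text
    header, _, body = text[4:].partition("\n---\n")
    frontmatter = {}
    for line in header.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            frontmatter[key.strip()] = value.strip()
    return frontmatter, body


def write_frontmatter_file(
    path: Path, frontmatter: dict, content: str, create_backup: bool = False
) -> bool:
    """Rewrite the file with updated frontmatter; optionally keep a .bak copy first."""
    if create_backup and path.exists():
        shutil.copy2(path, path.with_suffix(path.suffix + ".bak"))
    header = "\n".join(f"{key}: {value}" for key, value in frontmatter.items())
    path.write_text(f"---\n{header}\n---\n{content}", encoding="utf-8")
    return True
```

With helpers like these, the handler's read-modify-write cycle preserves the body of the memory file and only touches the frontmatter keys it was asked to change.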
  • Registers the 'configure_memory_optimization' tool using the @app.tool decorator, including description, tags, parameter annotations (serving as schema), and metadata.
```python
@app.tool(
    name="configure_memory_optimization",
    description="Configure memory optimization settings for auto-optimization behavior.",
    tags={"public", "memory"},
    annotations={
        "idempotentHint": False,
        "readOnlyHint": False,
        "title": "Configure Memory Optimization",
        "parameters": {
            "memory_file": "Optional path to specific memory file. If not provided, will configure the user's main memory file.",
            "auto_optimize": "Enable or disable automatic optimization. True/False.",
            "size_threshold": "File size threshold in bytes for triggering optimization.",
            "entry_threshold": "Number of new entries to trigger optimization.",
            "time_threshold_days": "Number of days between optimizations.",
        },
        "returns": "Returns confirmation of updated settings.",
    },
    meta={
        "category": "memory",
    },
)
```
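Since the handler writes camelCase keys (`autoOptimize`, `sizeThreshold`, `entryThreshold`, `timeThreshold`) into the file's frontmatter, a memory file configured with all four settings would presumably begin like this (illustrative values, assuming Markdown with a YAML-style frontmatter block):

```yaml
---
autoOptimize: true
sizeThreshold: 51200
entryThreshold: 100
timeThreshold: 30
---
```

The Markdown body of the memory file follows the closing `---` and is left untouched by this tool.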

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/NiclasOlofsson/mode-manager-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.