store_data

Store data in a cache with optional expiration time to optimize token usage during language model interactions.

Instructions

Store data in the cache with optional TTL

Input Schema

Name    Required  Description                            Default
key     Yes       Unique identifier for the cached data  -
value   Yes       Data to cache                          -
ttl     No        Time-to-live in seconds (optional)     -
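
As a concrete example of arguments that satisfy this schema, a tools/call request for store_data could look like the sketch below. The key, value, and ttl shown are illustrative values, not defaults defined by the server.

    // Illustrative JSON-RPC tools/call request for store_data.
    // The key, value, and ttl below are example values only.
    const storeDataRequest = {
      jsonrpc: '2.0',
      id: 1,
      method: 'tools/call',
      params: {
        name: 'store_data',
        arguments: {
          key: 'analysis:session-42',                // unique identifier for the cached data
          value: { summary: 'intermediate result' }, // any JSON-serializable data
          ttl: 300,                                  // expire after 300 seconds (optional)
        },
      },
    };

On success, the handler replies with a single text content item confirming the stored key, as shown in the Implementation Reference below.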

Implementation Reference

  • Executes the store_data tool by parsing arguments and delegating to CacheManager.set()
    case 'store_data': {
      const { key, value, ttl } = request.params.arguments as {
        key: string;
        value: any;
        ttl?: number;
      };
      this.cacheManager.set(key, value, ttl);
      return {
        content: [
          {
            type: 'text',
            text: `Successfully stored data with key: ${key}`,
          },
        ],
      };
    }
  • src/index.ts:99-120 (registration)
    Registers the store_data tool including its description and input schema in the ListTools response
    {
      name: 'store_data',
      description: 'Store data in the cache with optional TTL',
      inputSchema: {
        type: 'object',
        properties: {
          key: {
            type: 'string',
            description: 'Unique identifier for the cached data',
          },
          value: {
            type: 'any',
            description: 'Data to cache',
          },
          ttl: {
            type: 'number',
            description: 'Time-to-live in seconds (optional)',
          },
        },
        required: ['key', 'value'],
      },
    },
  • Input schema definition for the store_data tool
    inputSchema: {
      type: 'object',
      properties: {
        key: {
          type: 'string',
          description: 'Unique identifier for the cached data',
        },
        value: {
          type: 'any',
          description: 'Data to cache',
        },
        ttl: {
          type: 'number',
          description: 'Time-to-live in seconds (optional)',
        },
      },
      required: ['key', 'value'],
    },
  • Core implementation of data storage in the cache, including size calculation, memory limit enforcement, and TTL handling
    set(key: string, value: any, ttl?: number): void {
      const startTime = performance.now();

      // Calculate approximate size in bytes
      const size = this.calculateSize(value);

      // Check if adding this entry would exceed memory limit
      if (this.stats.memoryUsage + size > this.config.maxMemory) {
        this.enforceMemoryLimit(size);
      }

      const entry: CacheEntry = {
        value,
        created: Date.now(),
        lastAccessed: Date.now(),
        ttl: ttl ?? this.config.defaultTTL,
        size
      };

      this.cache.set(key, entry);
      this.stats.totalEntries = this.cache.size;
      this.stats.memoryUsage += size;

      const endTime = performance.now();
      this.updateAccessTime(endTime - startTime);
    }
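
The set() implementation above depends on a CacheEntry type and private helpers (calculateSize, enforceMemoryLimit, updateAccessTime) that are not included in the reference. The sketch below shows one plausible shape for those pieces; the field names, JSON-based size estimate, least-recently-accessed eviction, and default limits are assumptions for illustration, not code from the repository.

    // Hypothetical supporting definitions for CacheManager.set(); the actual
    // repository code may differ. Field names and policies are assumptions.
    interface CacheEntry {
      value: any;
      created: number;       // ms timestamp when the entry was stored
      lastAccessed: number;  // ms timestamp of the most recent access
      ttl: number;           // time-to-live in seconds
      size: number;          // approximate size in bytes
    }

    class CacheManagerSupportSketch {
      private cache = new Map<string, CacheEntry>();
      private stats = { totalEntries: 0, memoryUsage: 0, avgAccessTimeMs: 0 };
      private config = { maxMemory: 100 * 1024 * 1024, defaultTTL: 3600 };

      // Approximate the entry size by measuring its JSON serialization.
      private calculateSize(value: any): number {
        const json = JSON.stringify(value);
        return json ? Buffer.byteLength(json, 'utf8') : 0;
      }

      // Evict least-recently-accessed entries until the incoming entry fits.
      private enforceMemoryLimit(incomingSize: number): void {
        const byLastAccess = [...this.cache.entries()].sort(
          (a, b) => a[1].lastAccessed - b[1].lastAccessed
        );
        for (const [key, entry] of byLastAccess) {
          if (this.stats.memoryUsage + incomingSize <= this.config.maxMemory) break;
          this.cache.delete(key);
          this.stats.memoryUsage -= entry.size;
        }
        this.stats.totalEntries = this.cache.size;
      }

      // Keep a simple running average of operation latency in milliseconds.
      private updateAccessTime(elapsedMs: number): void {
        this.stats.avgAccessTimeMs = (this.stats.avgAccessTimeMs + elapsedMs) / 2;
      }
    }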
