Glama

get_cache_statistics

Retrieve current analysis cache statistics to monitor system performance and optimize resource usage for diagnostic workflows.

Instructions

Get statistics about the current analysis cache

WORKFLOW: System diagnostics and function discovery
TIP: Start with health_check, then use list_functions to explore capabilities
SAVES: Claude context for strategic decisions

Input Schema

This tool takes no arguments.
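Because the tool takes no arguments, an MCP client invokes it with an empty `arguments` object. A minimal sketch of the JSON-RPC request body, following the MCP `tools/call` convention:

```typescript
// Sketch of the JSON-RPC request an MCP client sends to invoke
// get_cache_statistics; note the empty arguments object.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "get_cache_statistics",
    arguments: {}, // no parameters are defined for this tool
  },
};

console.log(JSON.stringify(request));
```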

Implementation Reference

  • CacheStatisticsPlugin class: implements the 'get_cache_statistics' tool handler. The execute method retrieves cache statistics using CacheManager and returns a formatted response.
    export class CacheStatisticsPlugin extends BasePlugin implements IPromptPlugin {
      name = 'get_cache_statistics';
      category = 'system' as const;
      description = 'Get statistics about the current analysis cache';
      parameters = {};

      async execute(params: any, llmClient: any) {
        return await withSecurity(this, params, llmClient, async (secureParams) => {
          const stats = CacheManager.getStatistics();

          // Use ResponseFactory for consistent, spec-compliant output
          ResponseFactory.setStartTime();
          return ResponseFactory.createSystemResponse({
            status: 'active',
            details: {
              totalEntries: stats.totalEntries,
              memoryUsage: stats.memoryUsage,
              files: stats.files,
              oldestEntry: stats.files.length > 0 ? new Date().toISOString() : 'none',
              newestEntry: stats.files.length > 0 ? new Date().toISOString() : 'none',
              hitRate: 0, // Would need actual hit tracking
              statistics: {
                byType: { 'analysis': stats.totalEntries },
                bySize: { 'small': stats.totalEntries }
              }
            }
          });
        });
      }

      // MODERN: 3-Stage prompt architecture (system utility - no prompting needed)
      getPromptStages(params: any): PromptStages {
        return {
          systemAndContext: 'System cache statistics utility',
          dataPayload: 'Cache statistics request',
          outputInstructions: 'Return cache statistics and metrics'
        };
      }

      // LEGACY: Backwards compatibility method
      getPrompt(params: any): string {
        const stages = this.getPromptStages(params);
        return `${stages.systemAndContext}\n\n${stages.dataPayload}\n\n${stages.outputInstructions}`;
      }
    }
  • CacheManager class: provides static methods for cache operations, including getStatistics() which returns totalEntries, memoryUsage, and cached files used by the tool handler.
    export class CacheManager {
      private static cache: Map<string, any> = new Map();

      static clear(filePath?: string): void {
        if (filePath) {
          this.cache.delete(filePath);
        } else {
          this.cache.clear();
        }
      }

      static getStatistics(): any {
        return {
          totalEntries: this.cache.size,
          memoryUsage: this.estimateMemoryUsage(),
          files: Array.from(this.cache.keys())
        };
      }

      static getCacheSize(): number {
        return this.cache.size;
      }

      static estimateMemoryUsage(): string {
        const size = JSON.stringify(Array.from(this.cache.entries())).length;
        return `${(size / 1024).toFixed(2)} KB`;
      }
    }
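To see what `getStatistics()` produces, here is a self-contained sketch that mirrors the same logic with a plain `Map` (the file paths are illustrative):

```typescript
// Illustrative re-creation of CacheManager's statistics logic,
// populated with two made-up entries.
const cache = new Map<string, any>();
cache.set("src/index.ts", { analysis: "ok" });
cache.set("src/util.ts", { analysis: "ok" });

function estimateMemoryUsage(): string {
  // Rough estimate: byte length of the serialized cache entries
  const size = JSON.stringify(Array.from(cache.entries())).length;
  return `${(size / 1024).toFixed(2)} KB`;
}

const stats = {
  totalEntries: cache.size,
  memoryUsage: estimateMemoryUsage(),
  files: Array.from(cache.keys()),
};

console.log(stats);
// totalEntries is 2 and files lists both cached paths
```

Note that the memory figure is an estimate based on JSON serialization length, not actual heap usage, which is a reasonable trade-off for a lightweight diagnostic.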
  • GetCacheStatisticsResponse interface: defines the expected response schema for the get_cache_statistics tool.
    export interface GetCacheStatisticsResponse extends BaseResponse {
      data: {
        totalEntries: number;
        memoryUsage: string;
        files: string[];
        oldestEntry: string;
        newestEntry: string;
        hitRate: number;
        statistics: {
          byType: { [type: string]: number };
          bySize: { [range: string]: number };
        };
      };
    }
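A payload conforming to this interface might look like the following sketch (all values are made up, not taken from a live server):

```typescript
// Illustrative GetCacheStatisticsResponse data payload.
const example = {
  data: {
    totalEntries: 3,
    memoryUsage: "1.42 KB",
    files: ["src/a.ts", "src/b.ts", "src/c.ts"],
    oldestEntry: "2024-01-01T00:00:00.000Z",
    newestEntry: "2024-01-01T00:05:00.000Z",
    hitRate: 0, // the handler currently hard-codes 0 (no hit tracking)
    statistics: {
      byType: { analysis: 3 },
      bySize: { small: 3 },
    },
  },
};

console.log(example.data.files.length);
```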
  • FunctionResponseMap entry: maps 'get_cache_statistics' tool name to its response schema.
    'get_cache_statistics': GetCacheStatisticsResponse;
