toggl_warm_cache

Pre-fetch and cache workspace, project, and client data to improve Toggl Track integration performance by loading frequently accessed information in advance.

Instructions

Pre-fetch and cache workspace, project, and client data for better performance

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| workspace_id | No | Specific workspace to warm the cache for | — |
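An example invocation, as an MCP client would send it in a `tools/call` request (the workspace ID shown is illustrative):

```json
{
  "name": "toggl_warm_cache",
  "arguments": {
    "workspace_id": 1234567
  }
}
```

Omitting `arguments.workspace_id` is also valid, since the property is optional in the schema above.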

Implementation Reference

  • Tool schema definition including name, description, and input schema for 'toggl_warm_cache'.
    {
      name: 'toggl_warm_cache',
      description: 'Pre-fetch and cache workspace, project, and client data for better performance',
      inputSchema: {
        type: 'object',
        properties: {
          workspace_id: {
            type: 'number',
            description: 'Specific workspace to warm cache for'
          }
        }
      },
    },
  • The handler for the 'toggl_warm_cache' tool: it resolves the workspace ID from the arguments (falling back to the default workspace), warms the cache via CacheManager, sets the cacheWarmed flag, and returns a success response that includes cache statistics.
    case 'toggl_warm_cache': {
      // Use ?? rather than || so a falsy-but-valid value is not silently replaced
      const workspaceId = (args?.workspace_id as number | undefined) ?? defaultWorkspaceId;
      await cache.warmCache(workspaceId);
      cacheWarmed = true;
      
      const stats = cache.getStats();
      
      return {
        content: [{
          type: 'text',
          text: JSON.stringify({ 
            success: true,
            message: 'Cache warmed successfully',
            stats 
          }, null, 2)
        }]
      };
    }
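On success the handler returns a single text content block like the one below. The exact fields inside `stats` come from `CacheManager.getStats()`, which is not shown here, so the keys in this example are hypothetical:

```json
{
  "success": true,
  "message": "Cache warmed successfully",
  "stats": {
    "hits": 0,
    "misses": 4,
    "entries": 4
  }
}
```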
  • Core implementation of cache warming: fetches all workspaces, then pre-caches projects, clients, and tags for the specified workspace (or, if none is given, for the first three workspaces to stay within rate limits).
    async warmCache(workspaceId?: number): Promise<void> {
      // Log to stderr to avoid interfering with MCP stdio protocol
      console.error('Warming cache...');
      
      try {
        // Fetch all workspaces
        const workspaces = await this.getWorkspaces();
        
        // If workspace specified, fetch its entities
        if (workspaceId) {
          await Promise.all([
            this.getProjects(workspaceId),
            this.getClients(workspaceId),
            this.getTags(workspaceId)
          ]);
        } else {
          // Fetch entities for all workspaces (be careful with rate limits)
          for (const ws of workspaces.slice(0, 3)) { // Limit to first 3 workspaces
            await Promise.all([
              this.getProjects(ws.id),
              this.getClients(ws.id),
              this.getTags(ws.id)
            ]);
          }
        }
        
        console.error('Cache warmed successfully');
      } catch (error) {
        console.error('Failed to warm cache:', error);
      }
    }
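The `getProjects`, `getClients`, and `getTags` calls above only warm anything because each one writes its result into the cache on a miss. The real `CacheManager` in mcp-toggl is not shown here, but the underlying get-or-fetch pattern can be sketched as follows (the class name, key format, and five-minute TTL are assumptions for illustration):

```typescript
type Fetcher<T> = () => Promise<T>;

// Minimal get-or-fetch cache with a time-to-live per entry.
class SimpleCache {
  private store = new Map<string, { value: unknown; expires: number }>();

  constructor(private ttlMs: number = 5 * 60_000) {}

  // Return the cached value if still fresh; otherwise fetch, cache, and return it.
  async getOrFetch<T>(key: string, fetch: Fetcher<T>): Promise<T> {
    const hit = this.store.get(key);
    if (hit && hit.expires > Date.now()) {
      return hit.value as T;
    }
    const value = await fetch();
    this.store.set(key, { value, expires: Date.now() + this.ttlMs });
    return value;
  }
}

// Usage: the second lookup for the same key never reaches the network,
// which is why warming the cache up front speeds up later tool calls.
async function demo(): Promise<number> {
  const cache = new SimpleCache();
  let apiCalls = 0;
  const fetchProjects = async () => {
    apiCalls++; // stands in for a Toggl API request
    return ['Project A'];
  };
  await cache.getOrFetch('projects:123', fetchProjects); // miss: fetches
  await cache.getOrFetch('projects:123', fetchProjects); // hit: cached
  return apiCalls;
}
```

With this shape, `warmCache()` is simply a batch of `getOrFetch` calls issued ahead of time so that subsequent tool handlers get cache hits.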
