wn01011 / llm-token-tracker

track_usage

Monitor and record token consumption for AI API calls from OpenAI, Anthropic, and Gemini models to track usage patterns and manage costs.

Instructions

Track token usage for an AI API call

Input Schema

Name            Required  Description          Default
input_tokens    Yes       Input tokens used    -
model           Yes       Model name           -
output_tokens   Yes       Output tokens used   -
provider        Yes       AI provider          -
user_id         No        Optional user ID     'current-session' (per handler)

Implementation Reference

  • The main handler function that executes the track_usage tool logic. It tracks token usage via the TokenTracker, calculates costs, and returns a formatted response.
    private trackUsage(args: any) {
      const {
        provider,
        model,
        input_tokens,
        output_tokens,
        user_id = 'current-session'
      } = args;

      const trackingId = this.tracker.startTracking(user_id);
      this.tracker.endTracking(trackingId, {
        provider: provider as 'openai' | 'anthropic' | 'gemini',
        model,
        inputTokens: input_tokens,
        outputTokens: output_tokens,
        totalTokens: input_tokens + output_tokens
      });

      const usage = this.tracker.getUserUsage(user_id);
      const totalTokens = input_tokens + output_tokens;
      const cost = usage?.totalCost || 0;

      return {
        content: [
          {
            type: 'text',
            text:
              `βœ… Tracked ${totalTokens.toLocaleString()} tokens for ${model}\n` +
              `πŸ’° Session Cost: ${formatCost(cost)}\n` +
              `πŸ“Š Total: ${usage?.totalTokens.toLocaleString() || 0} tokens`
          }
        ]
      };
    }
  • The input schema definition for the track_usage tool, specifying parameters like provider, model, input/output tokens.
    inputSchema: {
      type: 'object',
      properties: {
        provider: {
          type: 'string',
          enum: ['openai', 'anthropic', 'gemini'],
          description: 'AI provider'
        },
        model: { type: 'string', description: 'Model name' },
        input_tokens: { type: 'number', description: 'Input tokens used' },
        output_tokens: { type: 'number', description: 'Output tokens used' },
        user_id: { type: 'string', description: 'Optional user ID' }
      },
      required: ['provider', 'model', 'input_tokens', 'output_tokens']
    }
  • The tool registration in the ListTools response, including name, description, and schema.
    {
      name: 'track_usage',
      description: 'Track token usage for an AI API call',
      inputSchema: {
        type: 'object',
        properties: {
          provider: {
            type: 'string',
            enum: ['openai', 'anthropic', 'gemini'],
            description: 'AI provider'
          },
          model: { type: 'string', description: 'Model name' },
          input_tokens: { type: 'number', description: 'Input tokens used' },
          output_tokens: { type: 'number', description: 'Output tokens used' },
          user_id: { type: 'string', description: 'Optional user ID' }
        },
        required: ['provider', 'model', 'input_tokens', 'output_tokens']
      }
    },
  • The dispatch case in the CallToolRequest handler that routes to the trackUsage method.
    case 'track_usage':
      return this.trackUsage(request.params.arguments);
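Taken together, these pieces are exercised by an MCP CallToolRequest whose arguments satisfy the input schema. The following is a minimal, self-contained sketch of that flow; the request literal is illustrative, and the simplified trackUsage body only shows the token arithmetic, omitting the real handler's TokenTracker and formatCost plumbing:

```typescript
// Hypothetical sketch of the track_usage dispatch path.
// The request shape follows the MCP CallToolRequest convention.
type TrackUsageArgs = {
  provider: "openai" | "anthropic" | "gemini";
  model: string;
  input_tokens: number;
  output_tokens: number;
  user_id?: string; // optional; the real handler defaults it to "current-session"
};

// Example request whose arguments satisfy the schema's `required` list.
const request = {
  params: {
    name: "track_usage",
    arguments: {
      provider: "openai",
      model: "gpt-4o",
      input_tokens: 500,
      output_tokens: 200,
    } as TrackUsageArgs,
  },
};

// Simplified stand-in for the private trackUsage method above.
function trackUsage(args: TrackUsageArgs): string {
  const total = args.input_tokens + args.output_tokens;
  return `Tracked ${total.toLocaleString()} tokens for ${args.model}`;
}

// The dispatch case routes the request's arguments to the handler.
switch (request.params.name) {
  case "track_usage":
    console.log(trackUsage(request.params.arguments));
    break;
}
```

With the sample arguments above, the handler reports a total of 700 tokens (500 input + 200 output) for gpt-4o.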


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/wn01011/llm-token-tracker'
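The same lookup can be made from code. Below is a hedged sketch using the standard Fetch API; the function name is illustrative, and since the response shape is not documented on this page, the body is returned as-is:

```typescript
// Minimal sketch: fetch this server's directory entry from the Glama MCP API.
// Assumes a runtime with a global fetch (Node 18+, Deno, or a browser).
const url = "https://glama.ai/api/mcp/v1/servers/wn01011/llm-token-tracker";

async function fetchServerEntry(): Promise<unknown> {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  // The response shape is not documented here, so return the parsed JSON as-is.
  return res.json();
}

// Usage: fetchServerEntry().then(console.log).catch(console.error);
```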

If you have feedback or need assistance with the MCP directory API, please join our Discord server.