
get_daily_metrics

Retrieve daily usage trends and patterns from Langfuse analytics for specified date ranges, with optional environment filtering and missing data handling.

Instructions

Daily usage trends and patterns.

Input Schema

Name             Required  Description                          Default
from             Yes       Start timestamp (ISO 8601)           -
to               Yes       End timestamp (ISO 8601)             -
environment      No        Optional environment filter          -
fillMissingDays  No        Fill missing days with zero values   true

Input Schema (JSON Schema)

{ "properties": { "environment": { "description": "Optional environment filter", "type": "string" }, "fillMissingDays": { "description": "Fill missing days with zero values (default: true)", "type": "boolean" }, "from": { "description": "Start timestamp (ISO 8601)", "format": "date-time", "type": "string" }, "to": { "description": "End timestamp (ISO 8601)", "format": "date-time", "type": "string" } }, "required": [ "from", "to" ], "type": "object" }

Implementation Reference

  • The main handler function. It fetches daily metrics from the Langfuse client, filters the data to the requested date range, calculates per-day aggregates (total tokens, total cost, per-trace averages), fills missing days with zeros if requested, and returns the result as formatted JSON; the shape of that result is sketched after this list.
    export async function getDailyMetrics(
      client: LangfuseAnalyticsClient,
      args: z.infer<typeof getDailyMetricsSchema>
    ) {
      try {
        // Use the working getDailyMetrics API directly (same approach as cost_analysis)
        const dailyResponse = await client.getDailyMetrics({
          tags: args.environment ? [`environment:${args.environment}`] : undefined,
        });

        const dailyData: any[] = [];

        if (dailyResponse.data && Array.isArray(dailyResponse.data)) {
          // Filter by date range
          const fromDate = new Date(args.from);
          const toDate = new Date(args.to);
          const filteredData = dailyResponse.data.filter((day: any) => {
            const dayDate = new Date(day.date);
            return dayDate >= fromDate && dayDate <= toDate;
          });

          // Process each day's data
          filteredData.forEach((day: any) => {
            // Calculate total tokens from usage breakdown
            let totalTokens = 0;
            let totalObservations = 0;

            if (day.usage && Array.isArray(day.usage)) {
              totalTokens = day.usage.reduce((sum: number, usage: any) => {
                return sum + (usage.totalUsage || usage.inputUsage + usage.outputUsage || 0);
              }, 0);
              totalObservations = day.usage.reduce((sum: number, usage: any) => {
                return sum + (usage.countObservations || 0);
              }, 0);
            }

            dailyData.push({
              date: day.date,
              totalCost: day.totalCost || 0,
              totalTokens: totalTokens,
              totalTraces: day.countTraces || 0,
              totalObservations: totalObservations || day.countObservations || 0,
              avgCostPerTrace:
                (day.countTraces || 0) > 0
                  ? Math.round(((day.totalCost || 0) / (day.countTraces || 0)) * 10000) / 10000
                  : 0,
              avgTokensPerTrace:
                (day.countTraces || 0) > 0
                  ? Math.round((totalTokens / (day.countTraces || 0)) * 100) / 100
                  : 0,
            });
          });

          // Fill in missing days if requested
          if (args.fillMissingDays) {
            const startDate = new Date(args.from);
            const endDate = new Date(args.to);
            const dataMap = new Map(dailyData.map(d => [d.date, d]));
            dailyData.length = 0; // Clear array

            for (let date = new Date(startDate); date <= endDate; date.setDate(date.getDate() + 1)) {
              const dateStr = date.toISOString().split('T')[0];
              const existingData = dataMap.get(dateStr);

              if (existingData) {
                dailyData.push(existingData);
              } else {
                // Fill missing day with zeros
                dailyData.push({
                  date: dateStr,
                  totalCost: 0,
                  totalTokens: 0,
                  totalTraces: 0,
                  totalObservations: 0,
                  avgCostPerTrace: 0,
                  avgTokensPerTrace: 0,
                });
              }
            }
          }

          // Sort by date
          dailyData.sort((a, b) => a.date.localeCompare(b.date));
        }

        // Return the successful result
        const result: DailyMetrics = {
          projectId: client.getProjectId(),
          from: args.from,
          to: args.to,
          dailyData,
        };

        return {
          content: [
            {
              type: 'text' as const,
              text: JSON.stringify(result, null, 2),
            },
          ],
        };
      } catch (error) {
        return {
          content: [
            {
              type: 'text' as const,
              text: JSON.stringify(
                {
                  error: 'Failed to get daily metrics',
                  message: error instanceof Error ? error.message : 'Unknown error',
                  projectId: client.getProjectId(),
                  from: args.from,
                  to: args.to,
                },
                null,
                2
              ),
            },
          ],
          isError: true,
        };
      }
    }
  • Zod schema defining input parameters: from/to datetimes (required), optional environment filter, and fillMissingDays boolean (default true).
    export const getDailyMetricsSchema = z.object({
      from: z.string().datetime(),
      to: z.string().datetime(),
      environment: z.string().optional(),
      fillMissingDays: z.boolean().default(true),
    });
  • src/index.ts:1056-1059 (registration)
    Registration in the CallToolRequestSchema handler switch statement: it parses the arguments with the Zod schema and calls the getDailyMetrics handler; an end-to-end invocation from an MCP client is sketched after this list.
    case 'get_daily_metrics': {
      const args = getDailyMetricsSchema.parse(request.params.arguments);
      return await getDailyMetrics(this.client, args);
    }
  • src/index.ts:497-524 (registration)
    Tool registration in the ListToolsRequestSchema handler: defines the name, description, and inputSchema so clients can discover the tool.
    {
      name: 'get_daily_metrics',
      description: 'Daily usage trends and patterns.',
      inputSchema: {
        type: 'object',
        properties: {
          from: {
            type: 'string',
            format: 'date-time',
            description: 'Start timestamp (ISO 8601)',
          },
          to: {
            type: 'string',
            format: 'date-time',
            description: 'End timestamp (ISO 8601)',
          },
          environment: {
            type: 'string',
            description: 'Optional environment filter',
          },
          fillMissingDays: {
            type: 'boolean',
            description: 'Fill missing days with zero values (default: true)',
          },
        },
        required: ['from', 'to'],
      },
    },
  • src/index.ts:63-63 (registration)
    Import of the handler function and schema from the tools module.
    import { getDailyMetrics, getDailyMetricsSchema } from './tools/get-daily-metrics.js';
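The DailyMetrics result type is referenced by the handler but not shown in this excerpt. Based on the fields the handler populates, it presumably resembles the sketch below; the interface names and layout are assumptions, only the field names come from the code above.

// Assumed shape of the handler's result, inferred from the fields it populates.
// "DailyMetricsDay" is a hypothetical name used here for readability.
interface DailyMetricsDay {
  date: string;              // YYYY-MM-DD
  totalCost: number;
  totalTokens: number;
  totalTraces: number;
  totalObservations: number;
  avgCostPerTrace: number;
  avgTokensPerTrace: number;
}

interface DailyMetrics {
  projectId: string;
  from: string;              // ISO 8601, echoed from the request
  to: string;                // ISO 8601, echoed from the request
  dailyData: DailyMetricsDay[];
}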
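For context, invoking the registered tool end to end from a standard MCP client might look like the following minimal sketch. It is illustrative only: the launch command, build path, client name, and timestamps are assumptions, not values from this project.

import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Hypothetical launch command; adjust to how the server is actually started.
const transport = new StdioClientTransport({
  command: 'node',
  args: ['build/index.js'],
});

const client = new Client({ name: 'example-client', version: '1.0.0' }, { capabilities: {} });
await client.connect(transport);

// Calls the tool registered above; the timestamps are placeholders.
const response = await client.callTool({
  name: 'get_daily_metrics',
  arguments: {
    from: '2024-05-01T00:00:00Z',
    to: '2024-05-07T23:59:59Z',
    fillMissingDays: true,
  },
});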
