
langfuse-mcp-java

fetch_traces

Retrieve paginated Langfuse traces to monitor LLM pipeline executions with filtering by user, session, tags, and time range for observability analysis.

Instructions

Returns a paginated list of Langfuse traces.

Each trace represents one end-to-end LLM pipeline execution. The response includes: id, name, userId, sessionId, level (DEFAULT | DEBUG | WARNING | ERROR), latency (seconds), totalTokens, totalCost (USD), tags, timestamp.
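Given those field definitions, a single item in the response might look like the following (all values are illustrative, not taken from a real project):

```json
{
  "id": "trace-abc123",
  "name": "rag-answer",
  "userId": "user-123",
  "sessionId": "session-456",
  "level": "DEFAULT",
  "latency": 2.41,
  "totalTokens": 1875,
  "totalCost": 0.0042,
  "tags": ["production"],
  "timestamp": "2025-01-15T10:32:00Z"
}
```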

All filter parameters are optional. Omit any filter you do not need — omitted filters are ignored and do not narrow the result set.

Pagination: page is 1-based (default 1), limit controls page size (default 20, max 100). To page through results, increment page while keeping limit fixed.
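The pagination scheme above can be sketched in Java. This is a minimal illustration, not part of the server's API: `pageArguments` is a hypothetical helper that builds the tool-call arguments for one page, and how you actually send them depends on your MCP client.

```java
public class FetchTracesPaging {

    // Builds the JSON arguments object for one fetch_traces call.
    // Only page and limit are set; every filter is omitted, so the
    // result set is not narrowed.
    static String pageArguments(int page, int limit) {
        return String.format("{\"page\":%d,\"limit\":%d}", page, limit);
    }

    public static void main(String[] args) {
        int limit = 50; // fixed page size (max 100)
        for (int page = 1; page <= 3; page++) {
            // In a real client you would send this payload as the
            // tool-call arguments and stop once a page comes back
            // with fewer than `limit` traces.
            System.out.println(pageArguments(page, limit));
        }
    }
}
```

Keeping `limit` fixed while incrementing `page` ensures consecutive pages never overlap or skip traces.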

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| page | No | Page number, 1-based. Omit to use the default. | 1 |
| limit | No | Results per page, max 100. Omit to use the default. | 20 |
| userId | No | Filter by the Langfuse user ID attached to the trace. Omit to return traces for all users. | (none) |
| name | No | Filter by trace name; must be an exact string match. Omit to return all trace names. | (none) |
| sessionId | No | Filter by session ID to return only traces belonging to that session. Omit to return traces across all sessions. | (none) |
| tags | No | Filter by a single tag string. Omit to return traces regardless of tags. | (none) |
| fromTimestamp | No | Start of time range in ISO-8601 format, e.g. 2025-01-01T00:00:00Z. Omit to include traces from the beginning of the project. | (none) |
| toTimestamp | No | End of time range in ISO-8601 format, e.g. 2025-12-31T23:59:59Z. Omit to include traces up to the current time. | (none) |
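The "omit to ignore" semantics can be sketched as follows: only the filters you actually set appear in the arguments payload. The `toJson` helper and the specific filter values are hypothetical; a real client would use its JSON library rather than hand-rolled serialization.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FetchTracesFilters {

    // Serializes only the entries that were provided; omitted filters
    // never appear in the payload, so they do not narrow the result set.
    // Minimal hand-rolled JSON, sufficient for strings and numbers.
    static String toJson(Map<String, Object> arguments) {
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, Object> e : arguments.entrySet()) {
            if (!first) sb.append(',');
            first = false;
            sb.append('"').append(e.getKey()).append("\":");
            Object v = e.getValue();
            if (v instanceof Number) sb.append(v);
            else sb.append('"').append(v).append('"');
        }
        return sb.append('}').toString();
    }

    public static void main(String[] args) {
        Map<String, Object> arguments = new LinkedHashMap<>();
        arguments.put("userId", "user-123");                 // hypothetical user ID
        arguments.put("tags", "production");                 // single tag string
        arguments.put("fromTimestamp", "2025-01-01T00:00:00Z");
        arguments.put("limit", 100);
        // name, sessionId, toTimestamp, and page are simply not added.
        System.out.println(toJson(arguments));
    }
}
```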


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Log-LogN/langfuse-mcp-java'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.