# Langfuse MCP Server

## Server Configuration

The environment variables required to run the server:
| Name | Required | Description | Default |
|---|---|---|---|
| LANGFUSE_BASEURL | No | Your Langfuse instance URL | https://us.cloud.langfuse.com |
| LANGFUSE_PUBLIC_KEY | Yes | Your Langfuse public key | |
| LANGFUSE_SECRET_KEY | Yes | Your Langfuse secret key | |
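As a minimal sketch of how a client of this table might validate its environment before launching the server (the variable names come from the table above; the `load_config` helper is illustrative, not part of the server's actual code):

```python
import os

# Required keys per the configuration table; the server cannot run without them.
REQUIRED = ["LANGFUSE_PUBLIC_KEY", "LANGFUSE_SECRET_KEY"]


def load_config() -> dict:
    """Read Langfuse settings from the environment, applying the documented default."""
    missing = [name for name in REQUIRED if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing required environment variables: {missing}")
    return {
        # LANGFUSE_BASEURL is optional and defaults to the US cloud endpoint.
        "base_url": os.environ.get("LANGFUSE_BASEURL", "https://us.cloud.langfuse.com"),
        "public_key": os.environ["LANGFUSE_PUBLIC_KEY"],
        "secret_key": os.environ["LANGFUSE_SECRET_KEY"],
    }
```

Failing fast on missing keys surfaces configuration problems at start-up rather than on the first API call.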
## Capabilities

Server capabilities have not been inspected yet.
### Tools

Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| list_projects | List configured Langfuse projects available to this MCP server. |
| project_overview | Get a summary of total cost, tokens, and traces for a project over a time window. |
| usage_by_model | Break down usage and cost by AI model over a time period. |
| usage_by_service | Analyze usage and cost by service/feature tag over a time period. |
| top_expensive_traces | Find the most expensive traces by cost over a time period. |
| get_trace_detail | Get detailed information about a specific trace, including all observations. |
| get_projects | List available Langfuse projects (alias for list_projects). |
| get_metrics | Query aggregated metrics (costs, tokens, counts) with flexible filtering and dimensions. |
| get_traces | Fetch traces with flexible filtering options. |
| get_observations | Get LLM generations/spans with details and filtering. |
| get_cost_analysis | Specialized cost breakdowns by model, user, and daily trends. |
| get_daily_metrics | Daily usage trends and patterns. |
| get_observation_detail | Get detailed information about a specific observation by ID. |
| get_health_status | Get system health status and availability information. |
| list_models | List all available AI models in the Langfuse project. |
| get_model_detail | Get detailed information about a specific AI model. |
| list_prompts | List all prompt templates in the Langfuse project. |
| get_prompt_detail | Get detailed information about a specific prompt template. |
| list_datasets | List all datasets in the project with pagination support. |
| get_dataset | Get detailed information about a specific dataset by name. |
| list_dataset_items | List items in datasets with filtering and pagination. |
| get_dataset_item | Get detailed information about a specific dataset item. |
| list_comments | List comments with filtering options for objects and users. |
| get_comment | Get detailed information about a specific comment. |
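MCP clients invoke tools like these through the protocol's JSON-RPC 2.0 `tools/call` method. As a sketch of what such a request looks like on the wire (the `build_tool_call` helper and the example arguments are illustrative assumptions; this listing does not document each tool's argument schema):

```python
import json


def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize an MCP JSON-RPC 2.0 `tools/call` request to a JSON string."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)


# Hypothetical arguments: consult the server's tool schemas for the real parameter names.
msg = build_tool_call("project_overview", {"fromTimestamp": "2024-06-01T00:00:00Z"})
```

The serialized message would be written to the server's stdin (or sent over its transport), and the tool result comes back as a JSON-RPC response with the same `id`.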
### Prompts

Interactive templates invoked by user choice.

| Name | Description |
|---|---|
| No prompts | |
### Resources

Contextual data attached and managed by the client.

| Name | Description |
|---|---|
| No resources | |
## MCP directory API

We provide all the information about MCP servers via our MCP API:

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/therealsachin/langfuse-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.