Langfuse MCP Server

PyPI · Python 3.10–3.13 · MIT License

Model Context Protocol server for Langfuse observability. Query traces, debug errors, analyze sessions, manage prompts.

Why langfuse-mcp?

Comparison with official Langfuse MCP (as of Jan 2026):

| Feature | langfuse-mcp | Official |
| --- | --- | --- |
| Traces & Observations | Yes | No |
| Sessions & Users | Yes | No |
| Exception Tracking | Yes | No |
| Prompt Management | Yes | Yes |
| Dataset Management | Yes | No |
| Selective Tool Loading | Yes | No |

This project provides a full observability toolkit — traces, observations, sessions, exceptions, and prompts — while the official MCP focuses on prompt management.

Quick Start

Requires uv (for uvx).

Get credentials from Langfuse Cloud → Settings → API Keys. If self-hosted, use your instance URL for LANGFUSE_HOST.

```bash
# Claude Code (project-scoped, shared via .mcp.json)
claude mcp add \
  --scope project \
  --env LANGFUSE_PUBLIC_KEY=pk-... \
  --env LANGFUSE_SECRET_KEY=sk-... \
  --env LANGFUSE_HOST=https://cloud.langfuse.com \
  langfuse -- uvx --python 3.11 langfuse-mcp

# Codex CLI (user-scoped, stored in ~/.codex/config.toml)
codex mcp add langfuse \
  --env LANGFUSE_PUBLIC_KEY=pk-... \
  --env LANGFUSE_SECRET_KEY=sk-... \
  --env LANGFUSE_HOST=https://cloud.langfuse.com \
  -- uvx --python 3.11 langfuse-mcp
```

Restart your CLI, then verify with /mcp (Claude Code) or codex mcp list (Codex).

Tools (25 total)

| Category | Tools |
| --- | --- |
| Traces | fetch_traces, fetch_trace |
| Observations | fetch_observations, fetch_observation |
| Sessions | fetch_sessions, get_session_details, get_user_sessions |
| Exceptions | find_exceptions, find_exceptions_in_file, get_exception_details, get_error_count |
| Prompts | list_prompts, get_prompt, get_prompt_unresolved, create_text_prompt, create_chat_prompt, update_prompt_labels |
| Datasets | list_datasets, get_dataset, list_dataset_items, get_dataset_item, create_dataset, create_dataset_item, delete_dataset_item |
| Schema | get_data_schema |

Dataset Item Updates (Upsert)

Langfuse uses upsert for dataset items. To edit an existing item, call create_dataset_item with item_id. If the ID exists, it updates; otherwise it creates a new item.

```python
create_dataset_item(
    dataset_name="qa-test-cases",
    item_id="item_123",
    input={"question": "What is 2+2?"},
    expected_output={"answer": "4"}
)
```

Skill

This project includes a skill with debugging playbooks.

Via skild (registry-based):

```bash
npx skild install @avivsinai/langfuse
```

Via skills (GitHub-based):

```bash
npx skills add avivsinai/langfuse-mcp
```

Manual install:

```bash
cp -r skills/langfuse ~/.claude/skills/  # Claude Code
cp -r skills/langfuse ~/.codex/skills/   # Codex CLI
```

Try asking: "help me debug langfuse traces"

See skills/langfuse/SKILL.md for full documentation.

Selective Tool Loading

Load only the tool groups you need to reduce token overhead:

```bash
langfuse-mcp --tools traces,prompts
```

Available groups: traces, observations, sessions, exceptions, prompts, datasets, schema
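
When registering the server through a client, append the flag after the package name so that uvx forwards it to langfuse-mcp. A sketch based on the Claude Code command from Quick Start:

```bash
claude mcp add \
  --scope project \
  --env LANGFUSE_PUBLIC_KEY=pk-... \
  --env LANGFUSE_SECRET_KEY=sk-... \
  --env LANGFUSE_HOST=https://cloud.langfuse.com \
  langfuse -- uvx --python 3.11 langfuse-mcp --tools traces,prompts
```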

Read-Only Mode

Disable all write operations for safer read-only access:

```bash
langfuse-mcp --read-only

# Or via environment variable
LANGFUSE_MCP_READ_ONLY=true langfuse-mcp
```

This disables: create_text_prompt, create_chat_prompt, update_prompt_labels, create_dataset, create_dataset_item, delete_dataset_item
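
Assuming the two flags can be combined, a conservative debugging setup might look like this sketch:

```bash
langfuse-mcp --read-only --tools traces,observations,exceptions
```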

Other Clients

Cursor

Create .cursor/mcp.json in your project (or ~/.cursor/mcp.json for global):

{ "mcpServers": { "langfuse": { "command": "uvx", "args": ["--python", "3.11", "langfuse-mcp"], "env": { "LANGFUSE_PUBLIC_KEY": "pk-...", "LANGFUSE_SECRET_KEY": "sk-...", "LANGFUSE_HOST": "https://cloud.langfuse.com" } } } }

Docker

```bash
docker run --rm -i \
  -e LANGFUSE_PUBLIC_KEY=pk-... \
  -e LANGFUSE_SECRET_KEY=sk-... \
  -e LANGFUSE_HOST=https://cloud.langfuse.com \
  ghcr.io/avivsinai/langfuse-mcp:latest
```
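
The same environment variables apply inside the container, so a read-only container only needs one more -e flag. A sketch reusing the documented LANGFUSE_MCP_READ_ONLY variable:

```bash
docker run --rm -i \
  -e LANGFUSE_PUBLIC_KEY=pk-... \
  -e LANGFUSE_SECRET_KEY=sk-... \
  -e LANGFUSE_HOST=https://cloud.langfuse.com \
  -e LANGFUSE_MCP_READ_ONLY=true \
  ghcr.io/avivsinai/langfuse-mcp:latest
```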

Development

```bash
uv venv --python 3.11 .venv && source .venv/bin/activate
uv pip install -e ".[dev]"
pytest
```

License

MIT
