MCP Server for langfuse

by z9905080


A Model Context Protocol (MCP) server implementation for integrating AI assistants with Langfuse workspaces.

Overview

This package provides an MCP server that enables AI assistants to interact with Langfuse workspaces. It allows AI models to:

  • Query LLM metrics by time range

Installation

# Install from npm
npm install shouting-mcp-langfuse

# Or install globally
npm install -g shouting-mcp-langfuse

The package is available on npm: shouting-mcp-langfuse

Prerequisites

Before using the server, you need to create a Langfuse project and obtain its public and private keys. You can find these keys in the Langfuse dashboard.

  1. Create a Langfuse project
  2. Obtain the public and private keys
  3. Set the environment variables

Configuration

The server requires the following environment variables (a sample client configuration is shown after the list):

  • LANGFUSE_DOMAIN: the Langfuse domain (default: https://api.langfuse.com)
  • LANGFUSE_PUBLIC_KEY: your Langfuse project public key
  • LANGFUSE_PRIVATE_KEY: your Langfuse project private key
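
If you launch the server from an MCP client rather than from the command line, these variables can be passed through the client's configuration. The snippet below is a minimal sketch assuming a Claude Desktop-style claude_desktop_config.json; the exact file name, location, and schema depend on the client you use.

{
  "mcpServers": {
    "langfuse": {
      "command": "mcp-server-langfuse",
      "env": {
        "LANGFUSE_DOMAIN": "https://api.langfuse.com",
        "LANGFUSE_PUBLIC_KEY": "your-public-key",
        "LANGFUSE_PRIVATE_KEY": "your-private-key"
      }
    }
  }
}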

Usage

Run as a CLI tool

# Set environment variables
export LANGFUSE_DOMAIN="https://api.langfuse.com"
export LANGFUSE_PUBLIC_KEY="your-public-key"
export LANGFUSE_PRIVATE_KEY="your-private-key"

# Run the server
mcp-server-langfuse

Use in code

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { LangfuseClient } from "shouting-mcp-langfuse";

// Initialize the server and client
const server = new Server({...});
const langfuseClient = new LangfuseClient(
  process.env.LANGFUSE_DOMAIN,
  process.env.LANGFUSE_PUBLIC_KEY,
  process.env.LANGFUSE_PRIVATE_KEY
);

// Register your custom handlers
// ...
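
To make the "custom handlers" concrete, here is a minimal sketch of how a tool such as getLLMMetricsByTimeRange could be registered with the MCP TypeScript SDK. The LangfuseClient method name and the startTime/endTime parameters are illustrative assumptions, not the package's documented API; check the package exports for the real signatures.

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";
import { LangfuseClient } from "shouting-mcp-langfuse";

const langfuseClient = new LangfuseClient(
  process.env.LANGFUSE_DOMAIN,
  process.env.LANGFUSE_PUBLIC_KEY,
  process.env.LANGFUSE_PRIVATE_KEY
);

const server = new Server(
  { name: "mcp-server-langfuse", version: "1.0.0" },
  { capabilities: { tools: {} } }
);

// Advertise the tool so connected clients can discover it.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "getLLMMetricsByTimeRange",
      description: "Get LLM metrics by time range",
      inputSchema: {
        type: "object",
        properties: {
          startTime: { type: "string" }, // assumed parameter name
          endTime: { type: "string" },   // assumed parameter name
        },
      },
    },
  ],
}));

// Forward tool calls to the Langfuse client.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { startTime, endTime } = (request.params.arguments ?? {}) as {
    startTime?: string;
    endTime?: string;
  };
  // getLLMMetricsByTimeRange on LangfuseClient is assumed here; adjust to the real API.
  const metrics = await langfuseClient.getLLMMetricsByTimeRange(startTime, endTime);
  return { content: [{ type: "text", text: JSON.stringify(metrics) }] };
});

// Serve over stdio so an MCP client can spawn this process.
await server.connect(new StdioServerTransport());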

Available Tools

The server provides the following Langfuse integration tools (a client-side usage sketch follows the list):

  • getLLMMetricsByTimeRange: get LLM metrics for a given time range
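
As a client-side usage sketch, the following shows how an MCP client could invoke this tool over stdio. It assumes the server is started via the mcp-server-langfuse binary, and the startTime/endTime argument names are illustrative assumptions rather than the documented schema.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server as a child process and talk to it over stdio.
const transport = new StdioClientTransport({
  command: "mcp-server-langfuse",
  env: {
    LANGFUSE_DOMAIN: "https://api.langfuse.com",
    LANGFUSE_PUBLIC_KEY: "your-public-key",
    LANGFUSE_PRIVATE_KEY: "your-private-key",
  },
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Invoke the tool; the argument names are assumptions for illustration.
const result = await client.callTool({
  name: "getLLMMetricsByTimeRange",
  arguments: {
    startTime: "2024-01-01T00:00:00Z",
    endTime: "2024-01-31T23:59:59Z",
  },
});

console.log(result.content);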

License

ISC

Author

shouting.hsiao@gmail.com

Repository

https://github.com/z9905080/mcp-langfuse

  • Security: - (not tested)
  • License: A (permissive license)
  • Quality: - (not tested)

remote-capable server

The server can be hosted and run remotely because it primarily relies on remote services or has no dependency on the local environment.

An MCP server implementation that integrates AI assistants with Langfuse workspaces, allowing models to query LLM metrics by time range.


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/z9905080/mcp-langfuse'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.