Utility MCP Server
The Utility MCP Server provides AI assistants with tools for time management, data analysis, and observability integration.
Time Management Tools
- get_current_time: Retrieve the current time in any timezone (e.g., Asia/Shanghai, UTC, America/New_York), returned as a `YYYY-MM-DD HH:MM:SS timezone` string
- get_timestamp: Get the current Unix timestamp in seconds (timezone-independent)
- format_timestamp: Convert a Unix timestamp to a human-readable `YYYY-MM-DD HH:MM:SS timezone` string in a specified timezone
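The semantics of these three tools can be sketched in plain Python with the standard library. This is an illustrative approximation of the documented behavior, not the server's actual implementation:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def get_current_time(tz_name: str) -> str:
    """Current time in the given IANA timezone, as YYYY-MM-DD HH:MM:SS TZ."""
    now = datetime.now(ZoneInfo(tz_name))
    return now.strftime("%Y-%m-%d %H:%M:%S %Z")

def get_timestamp() -> int:
    """Current Unix timestamp in seconds (timezone-independent)."""
    return int(datetime.now(timezone.utc).timestamp())

def format_timestamp(ts: int, tz_name: str) -> str:
    """Convert a Unix timestamp to a readable string in the given timezone."""
    dt = datetime.fromtimestamp(ts, ZoneInfo(tz_name))
    return dt.strftime("%Y-%m-%d %H:%M:%S %Z")

print(format_timestamp(0, "UTC"))  # 1970-01-01 00:00:00 UTC
```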
ChatBI Data Analysis Tools
- schema_search: Search project schemas using vector recall and keyword retrieval (stages: tokenization, embedding, vector search, keyword search, fusion)
- execute_sql: Execute read-only PostgreSQL queries and return JSON results (with SQL Guardrail validation)
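A read-only SQL Guardrail of the kind execute_sql describes can be sketched roughly as follows. This is a hypothetical, simplified check for illustration only; the server's actual validator is not shown in this document and is likely stricter:

```python
import re

# Statement types that modify data or schema; a read-only guardrail rejects these.
FORBIDDEN = re.compile(
    r"^\s*(insert|update|delete|drop|alter|create|truncate|grant|revoke)\b",
    re.IGNORECASE,
)

def is_read_only(sql: str) -> bool:
    """Very rough read-only check: allow a single SELECT/WITH statement only."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:          # reject multi-statement queries
        return False
    if FORBIDDEN.match(stripped):
        return False
    return bool(re.match(r"^\s*(select|with)\b", stripped, re.IGNORECASE))

print(is_read_only("SELECT * FROM orders"))     # True
print(is_read_only("DROP TABLE orders"))        # False
print(is_read_only("SELECT 1; DELETE FROM t"))  # False
```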
Observability Integration
Automatically logs tool usage to Langfuse for end-to-end tracing with nested stage logging
Propagates tracing context (traceparent headers) from upstream services like ChatBI
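A traceparent header follows the W3C Trace Context format: four hyphen-separated fields (version, trace ID, parent span ID, flags). A minimal parser sketch, for illustration; in practice the server's tracing library would handle this:

```python
def parse_traceparent(header: str) -> dict:
    """Split a W3C traceparent header into its four fields."""
    version, trace_id, parent_id, flags = header.split("-")
    assert len(trace_id) == 32 and len(parent_id) == 16
    return {
        "version": version,
        "trace_id": trace_id,    # shared across ChatBI, MCP, and Langfuse spans
        "parent_id": parent_id,  # span id of the upstream caller
        "sampled": flags == "01",
    }

tp = parse_traceparent("00-0af7651916cd43dd8448eb211c80319c-b7ad6b7169203331-01")
print(tp["trace_id"])  # 0af7651916cd43dd8448eb211c80319c
```

Because the trace ID is carried through unchanged, spans emitted by the MCP server join the same Langfuse trace as the upstream ChatBI request.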
Deployment & Configuration
Runs in local (stdio) or remote (Streamable HTTP/SSE) modes
Supports Docker/Docker Compose deployment
Compatible with Cursor, Claude Desktop, and ChatBI/LangChain4j clients
Health check endpoint at /health; configurable via environment variables
Provides capabilities to execute read-only PostgreSQL queries and perform schema searches using vector recall and keyword retrieval.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Utility MCP Server What is the current time in Tokyo?"
That's it! The server will respond to your query, and you can continue using it as needed.
MCP Server - General-Purpose Utility Server
A general-purpose utility server based on the Model Context Protocol (MCP) that provides extended capabilities to AI assistants.
Supports both local mode (stdio) and remote mode (Streamable HTTP / SSE), and can link up the ChatBI -> MCP -> Langfuse observability chain.
✨ Features
🕐 Time Tools
| Tool | Description |
| --- | --- |
| get_current_time | Get the current time in a specified timezone |
| get_timestamp | Get the current Unix timestamp |
| format_timestamp | Convert a timestamp to a human-readable format |
📊 ChatBI Data Analysis Tools
| Tool | Description |
| --- | --- |
| schema_search | Return project-related schemas via vector recall + keyword retrieval |
| execute_sql | Execute read-only PostgreSQL queries and return JSON results |
🔭 Langfuse Tracing
- schema_search records nested stages: tokenization, embedding, vector retrieval, keyword retrieval, and fusion/pruning.
- execute_sql records nested stages: SQL Guardrail validation and SQL execution.
- When the upstream ChatBI service propagates context via the traceparent and x-chatbi-* headers, the MCP tool observations automatically attach to the same Langfuse trace.
🚀 Quick Start
Install dependencies:
uv sync
Run in local mode:
uv run mcp-server
Run in remote mode (recommended default):
uv run mcp-server --remote --transport streamable-http --port 8000
Once started, the following endpoints are available:
MCP: http://localhost:8000/mcp
Health: http://localhost:8000/health
SSE-compatible endpoint: http://localhost:8000/sse
SSE-compatible mode:
uv run mcp-server --remote --transport sse --port 8000
🔌 Client Integration
Local mode configuration
For Cursor / Claude Desktop:
{
"mcpServers": {
"utility-server": {
"command": "uv",
"args": ["--directory", "/path/to/mcp-server", "run", "mcp-server"]
}
}
}
Remote mode configuration
ChatBI / LangChain4j
Connect directly to the Streamable HTTP endpoint:
http://your-server:8000/mcp
SSE clients compatible with mcp-remote:
{
"mcpServers": {
"utility-server": {
"command": "npx",
"args": ["-y", "mcp-remote", "http://your-server:8000/sse"]
}
}
}
🐳 Docker Deployment
Using Docker Compose (recommended)
docker compose up -d
docker compose logs -f
docker compose down
Manual Docker commands
docker build -t mcp-server .
docker run -d -p 8000:8000 --name mcp-server mcp-server
docker run -d -p 9000:9000 -e PORT=9000 --name mcp-server mcp-server
🔐 Langfuse Configuration
Configure in .env:
LANGFUSE_ENABLED=true
LANGFUSE_HOST=https://cloud.langfuse.com
LANGFUSE_PUBLIC_KEY=pk-lf-xxxx
LANGFUSE_SECRET_KEY=sk-lf-xxxx
LANGFUSE_ENVIRONMENT=production
MCP_SERVER_RELEASE=2026.03.09
If no Langfuse credentials are configured, the server still serves its MCP tools normally; it simply does not report tracing data.
🧪 Development & Testing
uv run mcp dev src/mcp_server/server.py
uv run mcp-server --help
📖 Architecture
ChatBI Java Service
|
| traceparent + x-chatbi-* headers
v
External MCP Server (/mcp)
|
|-- schema_search
| |-- tokenize
| |-- embedding
| |-- vector_search
| |-- keyword_search
| `-- fusion
|
`-- execute_sql
|-- validate
`-- query
All observations -> Langfuse
📄 License
MIT