
eRegulations MCP Server

by unctad-ai


A Model Context Protocol (MCP) server implementation for accessing eRegulations API data. The server provides structured, AI-friendly access to an eRegulations instance, making it easier for AI models to answer users' questions about administrative procedures.

## Features

- Access to eRegulations data through a standardized protocol
- Querying of procedures, steps, requirements, and costs
- MCP prompt templates to guide LLM tool usage
- Simplified integration via a standard I/O connection

## Usage

### Running with Docker (Recommended)

The recommended way to run the server is with the Docker image published on the GitHub Container Registry (GHCR). This ensures a consistent, isolated environment.

```bash
# Pull the latest image (optional)
docker pull ghcr.io/unctad-ai/eregulations-mcp-server:latest

# Run the server, providing the target eRegulations API URL
export EREGULATIONS_API_URL="https://your-eregulations-api.com"
docker run -i --rm -e EREGULATIONS_API_URL ghcr.io/unctad-ai/eregulations-mcp-server
```

Replace `https://your-eregulations-api.com` with the actual base URL of the eRegulations instance you want to connect to (e.g., `https://api-tanzania.tradeportal.org`).

The server listens for MCP JSON requests on standard input and writes responses to standard output.
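
For illustration, here is a minimal sketch of what two newline-delimited JSON-RPC 2.0 requests on the server's stdin could look like, assuming the standard MCP initialization handshake has already completed. `tools/list` and `tools/call` are standard MCP methods; `listProcedures` is one of the tools documented under "Available Tools" below.

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
{"jsonrpc": "2.0", "id": 2, "method": "tools/call", "params": {"name": "listProcedures", "arguments": {}}}
```

The server writes the matching JSON-RPC responses to stdout, one per line.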

### Example Client Configuration

Here is an example of how to configure a client (such as Claude) to use this server via Docker:

{ "mcpServers": { "eregulations": { "command": "docker", "args": [ "run", "-i", "--rm", "-e", "EREGULATIONS_API_URL", "ghcr.io/unctad-ai/eregulations-mcp-server:latest" ], "env": { "EREGULATIONS_API_URL": "https://your-eregulations-api.com" } } } }

(Remember to also replace the `EREGULATIONS_API_URL` value in the `env` section.)

### Installing via Smithery

Alternatively, you can install and run the server using Smithery:

Visit https://smithery.ai/server/@unctad-ai/eregulations-mcp-server for the installation command.

### Installing via the npm Registry (Deprecated)

~~Running the server directly with `npx` is deprecated because of potential environment inconsistencies.~~

```bash
# Deprecated: set environment variables and run with npx
export EREGULATIONS_API_URL=https://example.com/api && export NODE_ENV=production && npx -y @unctad-ai/eregulations-mcp-server@latest
```

## Configuration

The server requires the URL of the target eRegulations API.

### Environment Variables

- `EREGULATIONS_API_URL`: **(Required)** URL of the eRegulations API to connect to (e.g., `https://api-tanzania.tradeportal.org`). Passed to the Docker container using the `-e` flag.

## Available Tools

The MCP server provides the following tools:

### `listProcedures`

Lists all available procedures in the eRegulations system.

### `getProcedureDetails`

Gets detailed information about a specific procedure by its ID.

Parameters:

- `procedureId`: ID of the procedure to retrieve

### `getProcedureStep`

Gets information about a specific step within a procedure.

Parameters:

- `procedureId`: ID of the procedure
- `stepId`: ID of the step within the procedure

### `searchProcedures`

Searches for procedures by keyword or phrase. Note: this currently searches related objectives based on the underlying API and may include results beyond direct procedure names.

Parameters:

- `keyword`: The keyword or phrase to search for

## Prompt Templates

The server provides prompt templates to guide LLMs in using the available tools correctly. These templates explain the proper format and parameters for each tool. LLM clients that support the MCP prompt templates capability will automatically receive these templates to improve their ability to work with the API.

## Development

```bash
# Run in development mode
npm run start

# Run tests
npm test

# Run tests with watch mode
npm run test:watch

# Run test client
npm run test-client
```
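
To connect the tool reference above to the wire format, here is a minimal sketch of two `tools/call` requests a client might send over stdio. The procedure ID and search keyword are made-up example values; the exact argument types are defined by the tool schemas the server advertises.

```json
{"jsonrpc": "2.0", "id": 3, "method": "tools/call", "params": {"name": "getProcedureDetails", "arguments": {"procedureId": 42}}}
{"jsonrpc": "2.0", "id": 4, "method": "tools/call", "params": {"name": "searchProcedures", "arguments": {"keyword": "import permit"}}}
```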