
MCP Server for Dify AI

mcp-server-dify

A Model Context Protocol server for Dify AI. This server enables LLMs to interact with Dify AI's chat completion capabilities through a standardized protocol.

Features

  • Integration with Dify AI's chat completion API
  • Restaurant recommendation tool (meshi-doko)
  • Conversation context support
  • Streaming response support
  • TypeScript implementation

Installation

Using Docker

# Build the Docker image
make docker

# Run with Docker
docker run -i --rm mcp/dify https://your-dify-api-endpoint your-dify-api-key

Usage

Using Claude Desktop

Add the following configuration to your claude_desktop_config.json:

{ "mcpServers": { "dify": { "command": "npx", "args": [ "-y", "@modelcontextprotocol/server-dify", "https://your-dify-api-endpoint", "your-dify-api-key" ] } } }

Replace your-dify-api-endpoint and your-dify-api-key with your actual Dify API credentials.

Tools

meshi-doko

A restaurant recommendation tool that interfaces with Dify AI (a client-side usage sketch follows the parameter list below):

Parameters:

  • LOCATION (string): Location of the restaurant
  • BUDGET (string): Budget constraints
  • query (string): Query to send to Dify AI
  • conversation_id (string, optional): Used to maintain conversation context
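
For reference, the sketch below shows one way a client could invoke this tool through the MCP TypeScript SDK. It is a minimal illustration, not part of this project: the tool name "meshi-doko" and the example argument values are assumptions based on the parameter list above, and the endpoint/key are placeholders.

// Minimal client-side sketch; tool name and argument values are assumed.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio, the same way Claude Desktop would.
const transport = new StdioClientTransport({
  command: "npx",
  args: [
    "-y",
    "@modelcontextprotocol/server-dify",
    "https://your-dify-api-endpoint",
    "your-dify-api-key",
  ],
});

const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// Call the restaurant recommendation tool with the documented parameters.
const result = await client.callTool({
  name: "meshi-doko",
  arguments: {
    LOCATION: "Tokyo",
    BUDGET: "3000 JPY",
    query: "Recommend a quiet place for lunch",
    // conversation_id (optional): pass a previous ID to keep chat context.
  },
});
console.log(result.content);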

Development

# Initial setup
make setup

# Build the project
make build

# Format code
make format

# Run linter
make lint

License

This project is released under the MIT License.

Security

This server uses the API key you provide to interact with Dify AI. Make sure to:

  • Keep your API credentials secure
  • Use HTTPS for the API endpoint
  • Never commit API keys to version control (see the launcher sketch below)
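
One way to follow the last point is a small launcher script that reads the credentials from environment variables and forwards them as the positional arguments the server expects. The sketch below is hypothetical: the variable names DIFY_API_ENDPOINT and DIFY_API_KEY are not defined by this project.

// Hypothetical launcher; env var names are assumptions, not project conventions.
// Keeps the key out of claude_desktop_config.json and version control.
import { spawn } from "node:child_process";

const endpoint = process.env.DIFY_API_ENDPOINT;
const apiKey = process.env.DIFY_API_KEY;
if (!endpoint || !apiKey) {
  throw new Error("Set DIFY_API_ENDPOINT and DIFY_API_KEY before launching");
}

// Forward the credentials as the positional arguments the server expects.
spawn("npx", ["-y", "@modelcontextprotocol/server-dify", endpoint, apiKey], {
  stdio: "inherit",
});

Claude Desktop's "command" entry could then point at this launcher instead of embedding the key directly in the configuration file.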

Contributing

Contributions are welcome! Feel free to submit a Pull Request.



