mcp-server-dify
[Badges: GitHub Actions CI status, npm package, TypeScript]
Model Context Protocol Server for Dify AI. This server enables LLMs to interact with Dify AI's chat completion capabilities through a standardized protocol.
Features
Integration with Dify AI chat completion API
Restaurant recommendation tool (meshi-doko)
Support for conversation context
Streaming response support
TypeScript implementation
Installation
Using Docker
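The server can be built as a Docker image from the repository root. The image tag below is only an example; use whatever name fits your setup:

```sh
# Build the server image (the tag name is illustrative)
docker build -t mcp-server-dify .
```

Once built, the image is usually launched by the MCP client (see the Claude Desktop configuration below) rather than run by hand.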
Usage
With Claude Desktop
Add the following configuration to your claude_desktop_config.json:
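How the endpoint and key are passed to the server (command-line arguments versus environment variables) depends on how you built and start it; the entry below is an illustrative sketch that launches the Docker image built above and passes both values as arguments:

```json
{
  "mcpServers": {
    "dify": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "mcp-server-dify",
        "your-dify-api-endpoint",
        "your-dify-api-key"
      ]
    }
  }
}
```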
Replace your-dify-api-endpoint and your-dify-api-key with your actual Dify API credentials.
Tools
meshi-doko
Restaurant recommendation tool that interfaces with Dify AI:
Parameters:
LOCATION (string): Location of the restaurant
BUDGET (string): Budget constraints
query (string): Query to send to Dify AI
conversation_id (string, optional): For maintaining chat context
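For illustration, an MCP client invoking the tool might send arguments shaped like this (all values are examples, not real IDs):

```json
{
  "name": "meshi-doko",
  "arguments": {
    "LOCATION": "Shibuya, Tokyo",
    "BUDGET": "3000 JPY per person",
    "query": "Find a quiet izakaya for four people tonight",
    "conversation_id": "an-existing-conversation-id"
  }
}
```

Omit conversation_id to start a new conversation; pass a previous conversation's ID to keep the chat context.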
Development
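A typical local workflow for a TypeScript MCP server looks like the following; the exact npm script names are assumptions and may differ from this project's package.json:

```sh
# Install dependencies and compile the TypeScript sources
npm install
npm run build
```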
License
This project is released under the MIT License.
Security
This server interacts with Dify AI using the API key you provide. Be sure to:
Keep your API credentials secure
Use HTTPS for the API endpoint
Never commit API keys to version control
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.