MCP Server for Dify AI
Remote-capable server: it can be hosted and run remotely because it relies primarily on remote services rather than the local environment.
Integrations
- GitHub Actions: CI/CD workflow status is displayed as a badge in the README.
- npm: available as a package, installable via the npm package manager.
- TypeScript: the server is implemented in TypeScript, providing type safety.
mcp-server-dify
Model Context Protocol Server for Dify AI. This server enables LLMs to interact with Dify AI's chat completion capabilities through a standardized protocol.
Features
- Integration with Dify AI chat completion API
- Restaurant recommendation tool (meshi-doko)
- Support for conversation context
- Streaming response support
- TypeScript implementation
Installation
Using NPM
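The package name below is assumed to match the repository name (mcp-server-dify); if the package is published under an npm scope, adjust accordingly. Whether the Dify endpoint and key are passed as arguments or environment variables should be checked against the server's own documentation.

```bash
# Install globally (package name assumed from the repository name)
npm install -g mcp-server-dify

# Or run on demand without installing
npx mcp-server-dify your-dify-api-endpoint your-dify-api-key
```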
Usage
With Claude Desktop
Add the following configuration to your claude_desktop_config.json.
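The snippet below is a sketch: it assumes the server is launched via npx under the package name mcp-server-dify and that the Dify endpoint and API key are passed as command-line arguments. Check the server's own documentation for the definitive shape.

```json
{
  "mcpServers": {
    "dify": {
      "command": "npx",
      "args": [
        "mcp-server-dify",
        "your-dify-api-endpoint",
        "your-dify-api-key"
      ]
    }
  }
}
```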
Replace your-dify-api-endpoint and your-dify-api-key with your actual Dify API credentials.
Tools
meshi-doko
Restaurant recommendation tool that interfaces with Dify AI:
Parameters:
- LOCATION (string): Location of the restaurant
- BUDGET (string): Budget constraints
- query (string): Query to send to Dify AI
- conversation_id (string, optional): For maintaining chat context
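For illustration, here is a sketch of calling meshi-doko from an MCP client using the official TypeScript SDK. Only the parameter names above come from this project; the launch command, argument layout, and example values are assumptions.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server over stdio (command and args are assumptions for illustration).
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["mcp-server-dify", "your-dify-api-endpoint", "your-dify-api-key"],
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Call the meshi-doko tool with the parameters documented above.
  const result = await client.callTool({
    name: "meshi-doko",
    arguments: {
      LOCATION: "Shibuya, Tokyo",
      BUDGET: "3000 JPY",
      query: "Recommend a casual ramen shop for dinner",
      // conversation_id (optional): pass a previous conversation ID to keep context.
    },
  });

  console.log(result.content);
  await client.close();
}

main().catch(console.error);
```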
Development
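The build steps are not documented in this section; the following is a typical workflow for a TypeScript npm project and is an assumption rather than the project's documented setup.

```bash
# Assumed standard workflow for a TypeScript npm package
npm install      # install dependencies
npm run build    # compile TypeScript to JavaScript
npm test         # run the test suite, if one is defined
```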
License
This project is released under the MIT License.
Security
This server interacts with Dify AI using the API key you provide. Make sure to:
- Keep your API credentials secure
- Use HTTPS for the API endpoint
- Never commit API keys to version control
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.