Licensed under the Apache License 2.0
Supports containerized deployment and orchestration of the MCP server and tool ecosystem using Docker Compose
Uses Mermaid diagrams for architectural documentation and workflow visualization
Enables integration of Python-based tools including ML capabilities like text summarization, semantic search, keyword extraction, and code interpretation
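To make the tool-integration point concrete, here is a minimal sketch of the kind of Python capability such a tool might expose, using a frequency-based keyword extractor. This is an illustrative standalone function, not the MCP-NG tool API; the stop-word list and function name are assumptions for the example.

```python
import re
from collections import Counter

# Illustrative stop-word list; a production tool would use a fuller set or an NLP library.
STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "on", "for", "is", "are", "with"}

def extract_keywords(text: str, top_n: int = 5) -> list[str]:
    """Return the top_n most frequent non-stop-word tokens in text."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(t for t in tokens if t not in STOP_WORDS)
    return [word for word, _ in counts.most_common(top_n)]

if __name__ == "__main__":
    sample = "LLM agents use tools. Agents plan, call tools, and observe tool results."
    print(extract_keywords(sample, top_n=3))
```

An MCP server would wrap a function like this behind a tool descriptor so an LLM can invoke it by name with structured arguments.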
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@MCP-NG summarize the latest research paper on LLM agents".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
For more information on how to integrate MCP-NG with an LLM and use the ReAct pattern, please see the Integration Guide.

List of available tools
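The ReAct pattern mentioned above interleaves reasoning, tool calls, and observations. The following is a minimal sketch of that loop under stated assumptions: `call_tool` stands in for an MCP tool invocation, and the scripted "thought" replaces a real LLM's reasoning step; none of these names are the MCP-NG API.

```python
def call_tool(name: str, arg: str) -> str:
    """Stand-in for invoking a tool on an MCP server (hypothetical registry)."""
    tools = {"summarize": lambda t: t.split(".")[0] + "."}
    return tools[name](arg)

def react_loop(question: str, max_steps: int = 3) -> str:
    """Toy ReAct loop: reason about the next action, act via a tool, observe the result."""
    observation = question
    for _ in range(max_steps):
        # Reason: a real agent would ask the LLM what to do next; here it is scripted.
        # Act: invoke the chosen tool through the server.
        observation = call_tool("summarize", observation)
        # Observe: stop once the result looks like a complete answer.
        if observation.endswith("."):
            return observation
    return observation

print(react_loop("LLM agents combine reasoning and tool use. They iterate until done."))
```

In a real deployment the loop's "reason" step is produced by the LLM, and each "act" step is a structured MCP tool call rather than a local function.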