- Licensed under the Apache 2.0 license
- Supports containerized deployment and orchestration of the MCP server and tool ecosystem using Docker Compose
- Uses Mermaid diagrams for architectural documentation and workflow visualization
- Enables integration of Python-based tools, including ML capabilities such as text summarization, semantic search, keyword extraction, and code interpretation
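The Docker Compose layout can be sketched roughly as follows. The service names, build paths, and image layout below are illustrative assumptions rather than the project's actual compose file; only the ports (8090 for gRPC, 8002 for HTTP/REST) come from the architecture diagram.

```yaml
# Hypothetical sketch — service names and paths are assumptions.
services:
  mcp-server:
    build: ./server
    ports:
      - "8090:8090"   # gRPC endpoint
      - "8002:8002"   # HTTP/REST endpoint
  tool-python:
    build: ./tools/python
    # Tool servers are not published externally; the main server
    # reaches them over gRPC on the compose network.
```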
```mermaid
graph TD
    subgraph "Client Applications"
        A[gRPC Client]
        H[HTTP/REST Client]
    end
    A -->|gRPC Request on port 8090| B(Main MCP Server);
    H -->|HTTP/REST Request on port 8002| B;
    B -->|gRPC Proxy| C{Tool 1 Go};
    B -->|gRPC Proxy| D{Tool 2 Go};
    B -->|gRPC Proxy| E{Tool 3 Python};
    B -->|gRPC Proxy| F[Human Bridge];
    subgraph "Tool Servers"
        C
        D
        E
        F
    end
    style B fill:#ffa500,stroke:#333,stroke-width:2px,color:#000
```
```mermaid
sequenceDiagram
    participant User
    participant LLM
    participant MCP as "MCP Server (gRPC/HTTP)"
    participant Tools
    User->>LLM: Prompt
    LLM->>MCP: ListTools() via GET /v1/tools
    MCP-->>LLM: List of available tools
    LLM->>LLM: Reason which tool to use
    LLM->>MCP: RunTool(tool_name, args) via POST /v1/tools:run
    MCP->>Tools: Execute tool via gRPC
    Tools-->>MCP: Tool output
    MCP-->>LLM: Observation (tool result)
    LLM->>User: Final Answer
```
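The ListTools/RunTool exchange above can be driven from any HTTP client. The sketch below assumes the server listens on `localhost:8002` and exposes the `GET /v1/tools` and `POST /v1/tools:run` endpoints shown in the diagram; the JSON field names (`name`, `arguments`) are assumptions, not a documented schema.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8002"  # HTTP/REST port from the architecture diagram


def list_tools(base_url=BASE_URL):
    """Fetch the tool catalogue via GET /v1/tools."""
    with urllib.request.urlopen(f"{base_url}/v1/tools") as resp:
        return json.load(resp)


def build_run_request(tool_name, args):
    """Build the JSON body for POST /v1/tools:run (field names are assumptions)."""
    return json.dumps({"name": tool_name, "arguments": args}).encode("utf-8")


def run_tool(tool_name, args, base_url=BASE_URL):
    """Execute a tool via POST /v1/tools:run and return the parsed observation."""
    req = urllib.request.Request(
        f"{base_url}/v1/tools:run",
        data=build_run_request(tool_name, args),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

In a ReAct loop, the LLM would call `list_tools()` once, reason over the returned descriptions, then call `run_tool()` and feed the result back into the prompt as an observation.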
For more information on how to integrate MCP-NG with an LLM and use the ReAct pattern, please see the Integration Guide.