
MCP Mermaid

Dynamically generate Mermaid diagrams and charts with AI via MCP. You can also use:

  • mcp-server-chart to generate charts, graphs, and maps.

  • Infographic to generate infographics, such as Timeline, Comparison, List, Process, and so on.

✨ Features

  • Fully supports all Mermaid features and syntax.

  • Supports configuring backgroundColor and theme, enabling large AI models to output rich style configurations.

  • Supports exporting to base64, svg, mermaid, file, and the remote-friendly svg_url and png_url formats, with Mermaid validation to help the model produce correct syntax and graphics across multiple rounds. Use outputType: "file" to automatically save PNG diagrams to disk for AI agents, or the URL modes to share diagrams through public mermaid.ink links (an illustrative call is sketched below).
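
For illustration only, the arguments a client might pass to this server's diagram tool could look like the sketch below. The tool's exact name and the parameter names (mermaid, theme, backgroundColor, outputType) are assumptions drawn from the feature list above, so check the server's tool schema for the authoritative names and values.

```json
{
  "mermaid": "graph TD;\n  A[Request] --> B{Valid Mermaid?};\n  B -->|Yes| C[Render diagram];\n  B -->|No| D[Fix syntax and retry];",
  "theme": "default",
  "backgroundColor": "#ffffff",
  "outputType": "png_url"
}
```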

🤖 Usage

To use with a desktop app such as Claude, VSCode, Cline, Cherry Studio, and so on, add the MCP server config below. On macOS:

{ "mcpServers": { "mcp-mermaid": { "command": "npx", "args": [ "-y", "mcp-mermaid" ] } } }

On Windows:

{ "mcpServers": { "mcp-mermaid": { "command": "cmd", "args": [ "/c", "npx", "-y", "mcp-mermaid" ] } } }

You can also use it on aliyun, modelscope, glama.ai, smithery.ai, or other platforms over the HTTP or SSE protocol; a config sketch follows.
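
As a rough sketch of remote usage, assuming your MCP client accepts URL-based server entries and your hosted instance exposes the default /sse endpoint, the config might look like this (the host below is a placeholder, not a real deployment):

```json
{
  "mcpServers": {
    "mcp-mermaid": {
      "url": "https://your-host.example.com/sse"
    }
  }
}
```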

🚰 Run with SSE or Streamable transport

Option 1: Global Installation

Install the package globally:

```bash
npm install -g mcp-mermaid
```

Run the server with your preferred transport option:

```bash
# For SSE transport (default endpoint: /sse)
mcp-mermaid -t sse

# For Streamable transport with custom endpoint
mcp-mermaid -t streamable
```

Option 2: Local Development

If you're working with the source code locally:

```bash
# Clone and setup
git clone https://github.com/hustcc/mcp-mermaid.git
cd mcp-mermaid
npm install
npm run build

# Run with npm scripts
npm run start:sse         # SSE transport on port 3033
npm run start:streamable  # Streamable transport on port 1122
```

Access Points

Then you can access the server at the following endpoints (a client config sketch follows the list):

  • SSE transport: http://localhost:3033/sse

  • Streamable transport: http://localhost:1122/mcp (local) or http://localhost:3033/mcp (global)
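
If your MCP client supports URL-based server entries, a minimal sketch for pointing it at the locally running server (assuming the global install's default SSE port 3033) could look like:

```json
{
  "mcpServers": {
    "mcp-mermaid": {
      "url": "http://localhost:3033/sse"
    }
  }
}
```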

🎮 CLI Options

You can also use the following CLI options when running the MCP server. View them by running the CLI with -h.

```
MCP Mermaid CLI Options:
  --transport, -t   Specify the transport protocol: "stdio", "sse", or "streamable" (default: "stdio")
  --port, -p        Specify the port for SSE or streamable transport (default: 3033)
  --endpoint, -e    Specify the endpoint for the transport:
                      - For SSE: default is "/sse"
                      - For streamable: default is "/mcp"
  --help, -h        Show this help message
```

🔨 Development

Install dependencies:

```bash
npm install
```

Build the server:

```bash
npm run build
```

Start the MCP server

Using MCP Inspector (for debugging):

```bash
npm run start
```

Using different transport protocols:

```bash
# SSE transport (Server-Sent Events)
npm run start:sse

# Streamable HTTP transport
npm run start:streamable
```

Direct node commands:

```bash
# SSE transport on port 3033
node build/index.js --transport sse --port 3033

# Streamable HTTP transport on port 1122
node build/index.js --transport streamable --port 1122

# STDIO transport (for MCP client integration)
node build/index.js --transport stdio
```

📄 License

MIT@hustcc.
