GLM-4.6 MCP Server

by bobvasic
GITHUB_SETUP.md
# GitHub Repository Setup Instructions

## Step 1: Add Repository Description

1. Go to https://github.com/bobvasic/glm-mcp-server
2. Click the ⚙️ gear icon next to **About** (right sidebar of the repo page)
3. In the **About** section, add:

**Description:**
```
Enterprise MCP server enabling Claude 4.5 Sonnet to consult GLM-4.6 for advanced software architecture guidance via Warp Terminal
```

**Website:**
```
https://docs.z.ai/guides/llm/glm-4.6
```

4. Check ✅ **Releases** (to show v1.0.0)
5. Check ✅ **Packages** (for future npm publishing)

---

## Step 2: Add Topics/Tags

In the same **About** section, click "Add topics" and add:

```
mcp model-context-protocol glm-4 glm-4-6 zhipu-ai warp-terminal claude architecture system-design ai-consultant enterprise typescript nodejs
```

Topics help developers discover your repository. (Steps 1 and 2 can also be scripted; see the GitHub CLI sketch after Step 3.)

---

## Step 3: Create GitHub Release (v1.0.0)

### Option A: Via GitHub Web UI (Recommended)

1. Go to https://github.com/bobvasic/glm-mcp-server/releases
2. Click **"Create a new release"**
3. Click **"Choose a tag"** → Select `v1.0.0`
4. **Release title:** `v1.0.0 - Initial Production Release`
5. **Description:** Copy from below

````markdown
# GLM-4.6 MCP Server v1.0.0

🎉 **Initial Production Release**

Enterprise-grade Model Context Protocol server enabling **Claude 4.5 Sonnet** (via Warp Terminal) to leverage **GLM-4.6** for advanced software architecture consulting.

---

## 🏗️ Features

- **4 Specialized Architectural Tools**
  - `consult_architecture` - General architectural guidance
  - `analyze_code_architecture` - Code analysis with SOLID principles
  - `design_system_architecture` - Complete system design from requirements
  - `review_technical_decision` - Technical decision evaluation

- **Enterprise Standards**
  - TypeScript strict mode
  - MCP protocol compliant
  - Zero hardcoded secrets
  - Comprehensive error handling
  - Production-ready deployment strategies

- **Integration Ready**
  - Warp Terminal MCP configuration
  - GLM-4.6 API client with authentication
  - Docker, PM2, systemd deployment options

---

## 📦 Installation

```bash
git clone https://github.com/bobvasic/glm-mcp-server.git
cd glm-mcp-server
npm install
npm run build
```

Create `.env`:

```bash
GLM_API_KEY=your_api_key_here
```

---

## 🔧 Warp Configuration

Add to Warp Terminal MCP Servers:

```json
{
  "mcpServers": {
    "glm-architecture": {
      "command": "node",
      "args": ["/absolute/path/to/glm-mcp-server/build/index.js"],
      "env": {
        "GLM_API_KEY": "your_glm_api_key_here"
      }
    }
  }
}
```

---

## 📚 Documentation

- [README](README.md) - Quick start and overview
- [ARCHITECTURE](ARCHITECTURE.md) - System architecture details
- [DEPLOYMENT](DEPLOYMENT.md) - Production deployment guide
- [SECURITY_CHECKLIST](SECURITY_CHECKLIST.md) - Security verification

---

## 🔐 Security

- MIT Licensed
- No hardcoded secrets
- Environment-based configuration
- Security-hardened .gitignore

---

## 🤝 Support

**Enterprise Support:** info@cyberlinksec.com
**Issues:** https://github.com/bobvasic/glm-mcp-server/issues

---

**Built with Enterprise Standards by CyberLink Security**
````

6. Check ✅ **"Set as the latest release"**
7. Click **"Publish release"**

### Option B: Via GitHub CLI (Alternative)

```bash
gh release create v1.0.0 \
  --title "v1.0.0 - Initial Production Release" \
  --notes-file GITHUB_SETUP.md \
  --latest
```

---
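## Optional: Apply Steps 1 and 2 via GitHub CLI

The description, website, and topics from Steps 1 and 2 can also be set from the terminal instead of the web UI. A minimal sketch using the GitHub CLI (assumes `gh` is installed and authenticated; the flag values simply mirror the text above):

```bash
# Sketch: set the repository description, homepage, and topics in one call.
# Not required if the web UI steps above were already completed.
gh repo edit bobvasic/glm-mcp-server \
  --description "Enterprise MCP server enabling Claude 4.5 Sonnet to consult GLM-4.6 for advanced software architecture guidance via Warp Terminal" \
  --homepage "https://docs.z.ai/guides/llm/glm-4.6" \
  --add-topic mcp,model-context-protocol,glm-4,glm-4-6,zhipu-ai,warp-terminal,claude,architecture,system-design,ai-consultant,enterprise,typescript,nodejs
```

---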
## Verification

After completing steps:

1. ✅ Repository description visible on main page
2. ✅ Topics appear as clickable tags
3. ✅ Release v1.0.0 shows in "Releases" section
4. ✅ Latest release badge appears

---

## Optional: Add Badges to README

Add at the top of README.md:

```markdown
[![GitHub release](https://img.shields.io/github/release/bobvasic/glm-mcp-server.svg)](https://github.com/bobvasic/glm-mcp-server/releases)
[![GitHub stars](https://img.shields.io/github/stars/bobvasic/glm-mcp-server.svg)](https://github.com/bobvasic/glm-mcp-server/stargazers)
[![GitHub issues](https://img.shields.io/github/issues/bobvasic/glm-mcp-server.svg)](https://github.com/bobvasic/glm-mcp-server/issues)
```

---

**All steps automated where possible. Manual web UI steps require GitHub login.**
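---

## Optional: Verify via GitHub CLI

As a quick spot check of the Verification items above, a minimal sketch using the GitHub CLI (assumes `gh` is installed and authenticated):

```bash
# Confirm the description and topics were applied to the repository.
gh repo view bobvasic/glm-mcp-server --json description,repositoryTopics

# Confirm the v1.0.0 release exists and is published.
gh release view v1.0.0 --repo bobvasic/glm-mcp-server
```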
