
DingTalk MCP Server V2

by wllcnm
docker.yml
name: Docker

on:
  push:
    branches: [ main ]
    tags: [ 'v*.*.*' ]
  pull_request:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - uses: actions/checkout@v2
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v2
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - name: Login to GitHub Container Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Build and push
        uses: docker/build-push-action@v3
        with:
          context: .
          platforms: linux/amd64,linux/arm64
          push: true
          tags: |
            ghcr.io/${{ github.repository_owner }}/mcp-dingding-v2:latest
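The workflow above builds a multi-arch (amd64/arm64) image and pushes it to the GitHub Container Registry on every push to main. A minimal sketch of pulling and running the published image follows; the environment variable names (DINGTALK_APP_KEY, DINGTALK_APP_SECRET) are assumptions for illustration, so check the project README for the credentials the server actually expects:

# Pull the multi-arch image published by the workflow above
docker pull ghcr.io/wllcnm/mcp-dingding-v2:latest

# Run the MCP server over stdio (-i keeps stdin open for the MCP client).
# The DINGTALK_* variable names are assumed, not taken from the source.
docker run -i --rm \
  -e DINGTALK_APP_KEY=your-app-key \
  -e DINGTALK_APP_SECRET=your-app-secret \
  ghcr.io/wllcnm/mcp-dingding-v2:latest

MCP clients typically launch such a container themselves, so this command would normally appear as the server's command/args entry in the client's configuration rather than be run by hand.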


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/wllcnm/dingding_mcp_v2'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.