n8n MCP Server

by Peakviker
build-ghcr.yml
name: Build and Push to GHCR

on:
  push:
    branches: [ "main" ]
  workflow_dispatch:

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Set up QEMU
        uses: docker/setup-qemu-action@v3

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Log in to GHCR
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Docker metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ghcr.io/${{ github.repository }}
          tags: |
            type=raw,value=latest
            type=sha

      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          context: .
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
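
Once a run on main completes, the published image can be pulled from GHCR. A minimal sketch, assuming the repository is Peakviker/MCPn8n (GHCR lowercases image paths; the second command uses metadata-action's default sha- tag prefix):

docker pull ghcr.io/peakviker/mcpn8n:latest
docker pull ghcr.io/peakviker/mcpn8n:sha-<short-commit-sha>   # pin to a specific commit build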

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Peakviker/MCPn8n'
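
The endpoint returns JSON; as a sketch (the response schema is not shown here), you can pipe it through a formatter for readability:

curl -s 'https://glama.ai/api/mcp/v1/servers/Peakviker/MCPn8n' | python3 -m json.tool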

If you have feedback or need assistance with the MCP directory API, please join our Discord server.