
Figma MCP Server

by bbalakriz
build-container.sh (1.63 kB)
```bash
#!/bin/bash
# Build multi-arch (arm64 and amd64) Figma MCP Server container images.

set -e

IMAGE_NAME="quay.io/balki404/figma-mcp-server"
TAG="1.0"
PLATFORMS="linux/amd64,linux/arm64"

echo "🏗️ Building multi-architecture container for Figma MCP Server..."
echo "Image: ${IMAGE_NAME}:${TAG}"
echo "Platforms: ${PLATFORMS}"

# Check that Podman is available (the script currently supports Podman only).
if command -v podman &> /dev/null; then
    CONTAINER_CMD="podman"
    echo "--- Using Podman"
else
    echo "xxx Podman not found. Please install Podman first."
    exit 1
fi

build_with_podman() {
    echo "*** Building with Podman..."

    # AMD64
    echo "Building for linux/amd64..."
    $CONTAINER_CMD build \
        --platform linux/amd64 \
        -t "${IMAGE_NAME}:${TAG}-amd64" \
        .

    # ARM64
    echo "Building for linux/arm64..."
    $CONTAINER_CMD build \
        --platform linux/arm64 \
        -t "${IMAGE_NAME}:${TAG}-arm64" \
        .

    echo "--- Multi-architecture build completed with Podman!"
    echo "Available images:"
    echo "  - ${IMAGE_NAME}:${TAG}-amd64"
    echo "  - ${IMAGE_NAME}:${TAG}-arm64"
}

# Build based on the available container runtime.
if [ "$CONTAINER_CMD" = "podman" ]; then
    build_with_podman
fi

echo ""
echo "--- Build completed successfully!"
echo ""
echo "To run the container with Podman:"
echo "  podman run -p 3333:3333 -e FIGMA_API_KEY=your_api_key_here ${IMAGE_NAME}:${TAG}"
echo ""
echo "Access the SSE endpoint at: http://localhost:3333/sse"
echo ""
echo "To use docker-compose:"
echo "  1. Create a .env file with: FIGMA_API_KEY=your_api_key_here"
echo "  2. Run: docker-compose up -d"
```
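The script tags each architecture separately (`-amd64` and `-arm64`) but does not combine them, so `podman run … :1.0` as suggested in its final output would only work once a manifest list with that bare tag exists. A minimal sketch of how the two images could be combined and pushed with Podman's manifest tooling follows; this step is not part of the script above, and the use of the local `containers-storage` transport is an assumption about where the per-arch images live after the build:

```shell
# Create a manifest list under the bare tag, then attach both
# per-arch images produced by build-container.sh (assumed to be
# in local container storage).
podman manifest create quay.io/balki404/figma-mcp-server:1.0
podman manifest add quay.io/balki404/figma-mcp-server:1.0 \
    containers-storage:quay.io/balki404/figma-mcp-server:1.0-amd64
podman manifest add quay.io/balki404/figma-mcp-server:1.0 \
    containers-storage:quay.io/balki404/figma-mcp-server:1.0-arm64

# Push the manifest list and both referenced images to the registry.
podman manifest push --all quay.io/balki404/figma-mcp-server:1.0 \
    docker://quay.io/balki404/figma-mcp-server:1.0
```

An alternative is to pass `--manifest quay.io/balki404/figma-mcp-server:1.0` to each `podman build` invocation, which appends each per-arch image to the manifest list as it is built.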

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bbalakriz/figma-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.