figma-mcp-server

This project runs a Model Context Protocol (MCP) server for Figma integration using the figma-developer-mcp package.

Prerequisites

  • Node.js v20 or higher (see the version check below)

  • A Figma Personal Access Token (API key)
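
A quick way to confirm the prerequisites are in place is to check the installed tool versions; any Node.js v20 or newer release will do, and npx is bundled with npm:

node --version
npx --version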

Installation

Install dependencies:

npm install

Running the Server

Option 1: Using Podman Compose

This project includes containerization support using the Red Hat Universal Base Image (UBI), with images for both ARM64 (Apple Silicon) and AMD64 (x86_64) architectures.

Prerequisites

  • Podman installed

  • A Figma Personal Access Token (API key)

Steps

  1. Create a .env file in the project root:

    FIGMA_API_KEY=your_figma_personal_access_token_here

  2. Start the service (a sketch of the compose file it uses follows these steps):

    podman-compose up -d

  3. Stop the service:

    podman-compose down
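
For reference, the compose file that podman-compose picks up is expected to look roughly like this minimal sketch; the image tag, port mapping, and restart policy are assumptions inferred from the podman run examples in Option 2, so treat the repository's own compose file as authoritative:

services:
  figma-mcp:
    image: quay.io/balki404/figma-mcp-server:1.0
    env_file:
      - .env
    ports:
      - "3333:3333"
    restart: unless-stopped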

Option 2: Using Podman Directly

Building the Container

Using the Build Script

./build-container.sh

Build with Podman

# For current architecture
podman build -t quay.io/balki404/figma-mcp-server:1.0 .

# For specific architectures
podman build --platform linux/amd64 -t quay.io/balki404/figma-mcp-server:1.0-amd64 .
podman build --platform linux/arm64 -t quay.io/balki404/figma-mcp-server:1.0-arm64 .
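
After a build you can confirm that the image exists locally and, if you have push access to the quay.io/balki404 repository, publish it; the push step is optional and assumes you are logged in to the registry:

# List locally built images
podman images quay.io/balki404/figma-mcp-server

# Optionally publish the image (requires registry access)
podman login quay.io
podman push quay.io/balki404/figma-mcp-server:1.0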

Running the Container

# Run with environment variable
podman run -e FIGMA_API_KEY=your_figma_api_key_here quay.io/balki404/figma-mcp-server:1.0-arm64

# Run in detached mode with port mapping
podman run -d --name figma-mcp -p 3333:3333 -e FIGMA_API_KEY=your_figma_api_key_here quay.io/balki404/figma-mcp-server:1.0-arm64

# Run with restart policy and port mapping
podman run -d --restart=unless-stopped --name figma-mcp -p 3333:3333 -e FIGMA_API_KEY=your_figma_api_key_here quay.io/balki404/figma-mcp-server:1.0-arm64
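
Once a detached container is running, you can confirm it is up and inspect its output; the container name figma-mcp matches the examples above:

# Check the container status
podman ps --filter name=figma-mcp

# Follow the server logs (Ctrl+C to stop following)
podman logs -f figma-mcp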

Option 3: Direct Node.js

You can start the server directly with your Figma API key using the following command:

npx figma-developer-mcp --figma-api-key <YOUR_FIGMA_API_KEY>

Replace <YOUR_FIGMA_API_KEY> with your actual Figma Personal Access Token.
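
Many MCP clients can launch this command for you from their configuration instead of you running it manually. The snippet below is a hypothetical entry in the common mcpServers format used by clients such as Claude Desktop and Cursor; the server name "figma" is arbitrary, and the --stdio flag is an assumption for clients that communicate over stdio rather than the SSE endpoint, so check your client's documentation and the figma-developer-mcp options for the exact settings:

{
  "mcpServers": {
    "figma": {
      "command": "npx",
      "args": ["figma-developer-mcp", "--figma-api-key=<YOUR_FIGMA_API_KEY>", "--stdio"]
    }
  }
}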

Features

  • Environment-based configuration: Easy API key management through environment variables

  • SSE endpoint: Server-Sent Events available at http://localhost:3333/sse (a quick connectivity check is shown below)
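
A quick way to confirm the SSE endpoint is reachable once the server is running is to open a streaming request against it; the connection stays open and prints events as they arrive (press Ctrl+C to close it):

curl -N http://localhost:3333/sse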
