
Figma MCP Server

This is a custom Model Context Protocol (MCP) server for Figma. It connects your Figma designs directly to AI-powered IDEs (like Cursor, Windsurf, or any editor that supports MCP).

Instead of manually copy-pasting CSS, hex codes, or dimensions from Figma into your prompts, you can just give the AI your Figma file ID. The AI will use this server to read your designs, figure out how screens scale from desktop to mobile, and write accurate frontend code.

How it works

The project has two ways to run:

  1. MCP Agent (mcp-agent.js): This is the main use case. It connects directly to your IDE over stdio, giving your AI assistant custom tools to query your Figma files.

  2. Standard Web Server (server.js): An Express server you can run if you just want a plain REST API for pulling your Figma data into other scripts or apps.

Under the hood, the server handles all the annoying parts of dealing with the Figma API. It batches requests to avoid hitting rate limits, caches responses so your AI doesn't wait around for data, and handles background retries if the network drops.
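The caching-plus-retry idea can be sketched in a few lines. Note these helper names are hypothetical, not the actual internals of `mcp-agent.js`; this just illustrates the pattern:

```javascript
// In-memory cache keyed by request (e.g. a Figma file ID).
const cache = new Map();

// Retry a failing async call a few times with linear backoff.
async function fetchWithRetry(fn, retries = 3, delayMs = 100) {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt === retries) throw err; // out of retries: surface the error
      await new Promise((resolve) => setTimeout(resolve, delayMs * attempt));
    }
  }
}

// Serve repeated requests for the same key from the cache,
// so the AI isn't blocked on a second round-trip to Figma.
async function cachedFetch(key, fn) {
  if (cache.has(key)) return cache.get(key);
  const value = await fetchWithRetry(fn);
  cache.set(key, value);
  return value;
}
```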

Tools it gives your AI

Once connected, your AI assistant will be able to use three new tools:

  • fetch_file: Pulls the raw JSON structure of a Figma file.

  • list_screens: Finds all the frames in the file and groups them by page.

  • build_responsive_screens: This is the most useful one. It looks at your design and figures out which mobile screens belong to which desktop screens. When the AI uses this, it knows which desktop and mobile variants pair up, so it can write responsive CSS with the right breakpoints instead of guessing.
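Behind the scenes, your IDE calls these tools using the standard MCP `tools/call` request. For example, a call to `build_responsive_screens` looks roughly like this on the wire (the argument name `fileId` matches the example prompt below):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "build_responsive_screens",
    "arguments": { "fileId": "YOUR_FIGMA_FILE_ID_HERE" }
  }
}
```

You normally never write this yourself; the IDE handles it once the server is registered.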

Setup Instructions

1. What you need first

  • Node.js installed (v18 or newer should be fine).

  • A Figma Personal Access Token (you can get this from your Figma profile settings).

2. Install dependencies

Clone the repo and install the packages:

npm install

3. Add your Figma token

Create a .env file in the project root, copy the contents of example.env into it, and add your token:

FIGMA_TOKEN=your_token_here

4. Running the server

For AI IDEs (MCP Mode): To plug this into your IDE, run:

npm run mcp

To actually connect it, open your editor's MCP settings, add a new MCP server, choose the command option, and point it at node with the absolute path to the mcp-agent.js file in this folder.

Here's an example of how to connect it in Antigravity:

{
  "mcpServers": {
    "figma-mcp-v2": {
      "command": "node",
      "args": ["/absolute/path/to/figma_mcp/mcp-agent.js"],
      "env": {
        "FIGMA_TOKEN": "your_token_here",
        "NODE_ENV": "production",
        "DOTENV_QUIET": "true"
      }
    }
  }
}

For the REST API: If you just want the local web server:

npm run dev

It runs on http://localhost:3010 by default.

Example Prompt

Use figma-mcp-v2 tool: call build_responsive_screens with fileId "YOUR_FIGMA_FILE_ID_HERE"
Then generate clean responsive React + Tailwind code for all returned screens, one by one.

Where to get the file ID

  1. Open figma.com in your web browser

  2. Open any of your Figma files

  3. Look at the URL in your browser's address bar. It will look something like this: https://www.figma.com/file/FILE_ID_HERE/file-name

  4. The FILE_ID_HERE part is the file ID you need.
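The steps above can be automated with a small helper. This is just a sketch, not part of this repo; the `/design/` form is included because newer Figma links use it in place of `/file/`:

```javascript
// Extract the file ID from a Figma URL.
// Handles both https://www.figma.com/file/<id>/... and
// the newer https://www.figma.com/design/<id>/... forms.
function extractFileId(url) {
  const match = url.match(/figma\.com\/(?:file|design)\/([^/?#]+)/);
  return match ? match[1] : null; // null if it's not a Figma file URL
}
```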
