🎨 Fal.ai MCP Server

A Model Context Protocol (MCP) server that enables Claude Desktop (and other MCP clients) to generate images, videos, music, and audio using Fal.ai models.

✨ Features

πŸš€ Performance

  • Native Async API - Uses fal_client.run_async() for non-blocking calls (see the sketch after this list)

  • Queue Support - Long-running tasks (video/music) use queue API with progress updates

  • Non-blocking - All operations are truly asynchronous

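For illustration, here is a minimal, standalone sketch of the two call patterns above using the public fal_client Python library. The model IDs, argument names, and result fields are assumptions based on Fal.ai's public examples, not this server's internals; check the relevant model pages on fal.ai before relying on them.

import asyncio

import fal_client  # pip install fal-client


async def generate_image() -> str:
    # Direct async call: run_async() waits until the model finishes
    # (suits fast image models such as Flux Schnell).
    result = await fal_client.run_async(
        "fal-ai/flux/schnell",                        # assumed model ID
        arguments={"prompt": "a sunset over the ocean"},
    )
    return result["images"][0]["url"]                 # assumed result shape


async def generate_long_running() -> dict:
    # Queue pattern for long-running jobs (video/music): submit the request,
    # stream progress events, then fetch the final result.
    handle = await fal_client.submit_async(
        "fal-ai/flux/dev",                            # placeholder model ID
        arguments={"prompt": "waves rolling onto a beach"},
    )
    async for event in handle.iter_events(with_logs=True):
        print(event)                                  # progress updates
    return await handle.get()


if __name__ == "__main__":
    print(asyncio.run(generate_image()))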
🌐 Transport Modes (New!)

  • STDIO - Traditional Model Context Protocol communication

  • HTTP/SSE - Web-based access via Server-Sent Events

  • Dual Mode - Run both transports simultaneously

🎨 Media Generation

  • πŸ–ΌοΈ Image Generation - Create images using Flux, SDXL, and other models

  • 🎬 Video Generation - Generate videos from images or text prompts

  • 🎡 Music Generation - Create music from text descriptions

  • πŸ—£οΈ Text-to-Speech - Convert text to natural speech

  • πŸ“ Audio Transcription - Transcribe audio using Whisper

  • ⬆️ Image Upscaling - Enhance image resolution

  • πŸ”„ Image-to-Image - Transform existing images with prompts

πŸš€ Quick Start

Prerequisites

  • Python 3.10 or higher

  • Fal.ai API key (free tier available)

  • Claude Desktop (or any MCP-compatible client)

Installation

Option 1: Docker (Recommended)

An official Docker image is available on the GitHub Container Registry:

# Pull the latest image
docker pull ghcr.io/raveenb/fal-mcp-server:latest

# Run with your API key
docker run -d \
  --name fal-mcp \
  -e FAL_KEY=your-api-key \
  -p 8080:8080 \
  ghcr.io/raveenb/fal-mcp-server:latest

Or use Docker Compose:

curl -O https://raw.githubusercontent.com/raveenb/fal-mcp-server/main/docker-compose.yml
echo "FAL_KEY=your-api-key" > .env
docker-compose up -d

Option 2: Install from PyPI

pip install fal-mcp-server

Or with uv:

uv pip install fal-mcp-server

Option 3: Install from source

git clone https://github.com/raveenb/fal-mcp-server.git
cd fal-mcp-server
pip install -e .

Configuration

  1. Get your Fal.ai API key from fal.ai

  2. Configure Claude Desktop by adding the server to its configuration file:

    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json

    • Windows: %APPDATA%\Claude\claude_desktop_config.json

For Docker Installation:

{ "mcpServers": { "fal-ai": { "command": "curl", "args": ["-N", "http://localhost:8080/sse"] } } }

For PyPI Installation:

{ "mcpServers": { "fal-ai": { "command": "python", "args": ["-m", "fal_mcp_server.server"], "env": { "FAL_KEY": "your-fal-api-key" } } } }

For Source Installation:

{ "mcpServers": { "fal-ai": { "command": "python", "args": ["/path/to/fal-mcp-server/src/fal_mcp_server/server.py"], "env": { "FAL_KEY": "your-fal-api-key" } } } }
  1. Restart Claude Desktop

πŸ’¬ Usage

With Claude Desktop

Once configured, ask Claude to:

  • "Generate an image of a sunset"

  • "Create a video from this image"

  • "Generate 30 seconds of ambient music"

  • "Convert this text to speech"

  • "Transcribe this audio file"

HTTP/SSE Transport (New!)

Run the server with HTTP transport for web-based access:

# Using Docker (recommended)
docker run -d -e FAL_KEY=your-key -p 8080:8080 ghcr.io/raveenb/fal-mcp-server:latest

# Using pip installation
fal-mcp-http --host 0.0.0.0 --port 8000

# Or dual mode (STDIO + HTTP)
fal-mcp-dual --transport dual --port 8000

Connect from web clients via Server-Sent Events (a minimal client sketch follows the endpoint list):

  • SSE endpoint: http://localhost:8080/sse (Docker) or http://localhost:8000/sse (pip)

  • Message endpoint: POST http://localhost:8080/messages/

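For programmatic access over SSE, the snippet below is a minimal client sketch that assumes the MCP Python SDK (the mcp package) and the pip-install port above; the tool name generate_image and its arguments are hypothetical, since this page does not list the server's tool names.

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # Open the SSE stream and the paired message channel.
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover which tools the server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Hypothetical tool call; substitute a name from the list above.
            result = await session.call_tool(
                "generate_image", {"prompt": "a sunset over the ocean"}
            )
            print(result)


if __name__ == "__main__":
    asyncio.run(main())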
See Docker Documentation and HTTP Transport Documentation for details.

πŸ“¦ Supported Models

Image Models

  • flux_schnell - Fast high-quality generation

  • flux_dev - Development version with more control

  • sdxl - Stable Diffusion XL

Video Models

  • svd - Stable Video Diffusion

  • animatediff - Text-to-video animation

Audio Models

  • musicgen - Music generation

  • bark - Text-to-speech

  • whisper - Audio transcription (see the sketch below)

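As a rough illustration of reaching one of these models directly (outside the MCP server), the sketch below sends an audio URL to Fal.ai's Whisper endpoint with fal_client; the endpoint ID, argument name, and result field are assumptions drawn from Fal.ai's public docs and should be verified there.

import asyncio

import fal_client  # pip install fal-client


async def transcribe(audio_url: str) -> str:
    # Assumed endpoint ID and argument name; verify on fal.ai.
    result = await fal_client.run_async(
        "fal-ai/whisper",
        arguments={"audio_url": audio_url},
    )
    return result["text"]  # assumed result field


if __name__ == "__main__":
    print(asyncio.run(transcribe("https://example.com/sample.mp3")))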
🀝 Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.

Local Development

We support local CI testing with act:

# Quick setup
make ci-local  # Run CI locally before pushing

# See detailed guide
cat docs/LOCAL_TESTING.md

πŸ“ License

MIT License - see LICENSE file for details.

πŸ™ Acknowledgments
