fal MCP Server

A Model Context Protocol (MCP) server for interacting with fal.ai models and services. This project was inspired by am0y's MCP server, but updated to use the latest streaming MCP support.

Features

  • List all available fal.ai models
  • Search for specific models by keywords
  • Get model schemas
  • Generate content using any fal.ai model
  • Support for both direct and queued model execution
  • Queue management (status checking, getting results, cancelling requests)
  • File upload to fal.ai CDN
  • Full streaming support via HTTP transport

Requirements

  • Python 3.12+
  • fastmcp
  • httpx
  • aiofiles
  • A fal.ai API key

Installation

  1. Clone this repository:

```sh
git clone https://github.com/derekalia/fal.git
cd fal
```

  2. Install the required packages:

```sh
# Using uv (recommended)
uv sync

# Or using pip
pip install fastmcp httpx aiofiles
```

Usage

Running the Server Locally

  1. Get your fal.ai API key from fal.ai
  2. Start the MCP server with HTTP transport:
```sh
./run_http.sh YOUR_FAL_API_KEY
```

The server will start and display connection information in your terminal.

  3. Connect to it from your LLM IDE (Claude Code or Cursor) by adding it to your configuration:

```json
{
  "Fal": {
    "url": "http://127.0.0.1:6274/mcp/"
  }
}
```

Development Mode (with MCP Inspector)

For testing and debugging, you can run the server in development mode:

```sh
fastmcp dev main.py
```

This will:

  • Start the server on a random port
  • Launch the MCP Inspector web interface in your browser
  • Allow you to test all tools interactively with a web UI

The Inspector URL will be displayed in the terminal (typically http://localhost:PORT).

Environment Variables

The run_http.sh script automatically handles all environment variables for you. If you need to customize:

  • `PORT`: Server port for HTTP transport (default: 6274)

Setting API Key Permanently

If you prefer to set your API key permanently instead of passing it each time:

  1. Create a .env file in the project root:

```sh
echo 'FAL_KEY="YOUR_FAL_API_KEY_HERE"' > .env
```

  2. Then run the server without the API key argument:

```sh
./run_http.sh
```

For manual setup:

  • `FAL_KEY`: Your fal.ai API key (required)
  • `MCP_TRANSPORT`: Transport mode, either `stdio` (default) or `http`
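A manual launch without run_http.sh might look like the following. This is a sketch, assuming main.py is the entry point and reads these variables (as the Requirements and Usage sections suggest); adjust to your setup:

```shell
# Manual setup: export the variables that run_http.sh would otherwise set
export FAL_KEY="YOUR_FAL_API_KEY_HERE"   # required
export MCP_TRANSPORT=http                # stdio is the default
export PORT=6274                         # HTTP port for the /mcp/ endpoint
python main.py
```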

Available Tools

  • `models(page=None, total=None)` - List available models with optional pagination
  • `search(keywords)` - Search for models by keywords
  • `schema(model_id)` - Get OpenAPI schema for a specific model
  • `generate(model, parameters, queue=False)` - Generate content using a model
  • `result(url)` - Get result from a queued request
  • `status(url)` - Check status of a queued request
  • `cancel(url)` - Cancel a queued request
  • `upload(path)` - Upload a file to fal.ai CDN
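Queued execution ties three of these tools together: `generate(..., queue=True)` returns a request URL, which you poll with `status` and then fetch with `result`. A minimal sketch of that loop is below; `wait_for_result`, the callables, and the `"COMPLETED"` state string are illustrative assumptions, not part of this server's API:

```python
import time

def wait_for_result(status_fn, result_fn, request_url,
                    poll_interval=1.0, timeout=60.0):
    """Poll a queued fal.ai request until it completes, then fetch its result.

    status_fn and result_fn stand in for calls to this server's `status`
    and `result` tools; the "COMPLETED" state string is an assumption
    about the status payload.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if status_fn(request_url) == "COMPLETED":
            return result_fn(request_url)
        time.sleep(poll_interval)
    raise TimeoutError(f"{request_url} did not complete within {timeout}s")
```

In practice the two callables would wrap MCP tool calls made by your client; separating the polling policy from the transport keeps the loop easy to test with stubs.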

License

MIT
