upload_file

Upload local files to Fal.ai storage and get back URLs that other Fal.ai tools, such as image-to-video conversion and audio transformation, can consume.

Instructions

Upload a local file to Fal.ai storage and get a URL. Use this to upload images, videos, or audio files that can then be used with other Fal.ai tools (e.g., image-to-video, audio transform).

Input Schema

Name: file_path
Required: Yes
Default: (none)
Description: Absolute path to the local file to upload (e.g., '/path/to/image.png')
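
For orientation, a call to this tool needs only the file_path string above; under the hood the handler performs a single fal_client.upload_file call. A minimal sketch, assuming the fal-client Python package is installed and a FAL_KEY credential is configured in the environment (the path is illustrative):

    import fal_client

    # Example arguments payload for the upload_file tool (illustrative path).
    arguments = {"file_path": "/path/to/image.png"}

    # The equivalent underlying call: uploads the file to Fal.ai storage and
    # returns a hosted URL that other Fal.ai tools can consume.
    url = fal_client.upload_file(arguments["file_path"])
    print(url)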

Implementation Reference

  • The core handler function that implements the upload_file tool. It takes a file_path argument, checks that the file exists, uploads it using fal_client.upload_file wrapped in asyncio.to_thread, and returns the resulting URL or an error message. A usage sketch for this handler appears after this list.
    async def handle_upload_file(
        arguments: Dict[str, Any],
        registry: ModelRegistry,  # Not used but kept for consistency
    ) -> List[TextContent]:
        """Handle the upload_file tool."""
        file_path = arguments.get("file_path")
        if not file_path:
            return [
                TextContent(
                    type="text",
                    text="❌ No file path specified. Provide the absolute path to the file.",
                )
            ]

        try:
            # Use the synchronous upload wrapped in asyncio
            import asyncio
            import os

            if not os.path.exists(file_path):
                return [
                    TextContent(
                        type="text",
                        text=f"❌ File not found: {file_path}",
                    )
                ]

            # Upload using fal_client
            url = await asyncio.to_thread(fal_client.upload_file, file_path)

            return [
                TextContent(
                    type="text",
                    text=f"✅ File uploaded successfully!\n\n**URL**: {url}\n\nYou can use this URL with image-to-video, image-to-image, or other tools.",
                )
            ]
        except Exception as e:
            logger.error("File upload failed: %s", e)
            return [
                TextContent(
                    type="text",
                    text=f"❌ Upload failed: {e}",
                )
            ]
  • The MCP Tool schema definition for the upload_file tool, specifying the inputSchema with the required file_path parameter.
    Tool(
        name="upload_file",
        description="Upload a local file to Fal.ai storage and get a URL. Use this to upload images, videos, or audio files that can then be used with other Fal.ai tools (e.g., image-to-video, audio transform).",
        inputSchema={
            "type": "object",
            "properties": {
                "file_path": {
                    "type": "string",
                    "description": "Absolute path to the local file to upload (e.g., '/path/to/image.png')",
                },
            },
            "required": ["file_path"],
        },
    ),
    ]
  • Registration of the upload_file handler in the TOOL_HANDLERS dictionary used by the stdio MCP server.
    TOOL_HANDLERS = {
        # Utility tools (no queue needed)
        "list_models": handle_list_models,
        "recommend_model": handle_recommend_model,
        "get_pricing": handle_get_pricing,
        "get_usage": handle_get_usage,
        "upload_file": handle_upload_file,
        # Image generation tools
        "generate_image": handle_generate_image,
        "generate_image_structured": handle_generate_image_structured,
        "generate_image_from_image": handle_generate_image_from_image,
        # Image editing tools
        "remove_background": handle_remove_background,
        "upscale_image": handle_upscale_image,
        "edit_image": handle_edit_image,
        "inpaint_image": handle_inpaint_image,
        "resize_image": handle_resize_image,
        "compose_images": handle_compose_images,
        # Video tools
        "generate_video": handle_generate_video,
        "generate_video_from_image": handle_generate_video_from_image,
        "generate_video_from_video": handle_generate_video_from_video,
        # Audio tools
        "generate_music": handle_generate_music,
    }
  • Registration of the upload_file handler in the TOOL_HANDLERS dictionary used by the HTTP/SSE MCP server, together with the NO_QUEUE_TOOLS set that lets upload_file bypass the queue strategy; a dispatch sketch using both appears after this list.
    TOOL_HANDLERS = {
        # Utility tools (no queue needed)
        "list_models": handle_list_models,
        "recommend_model": handle_recommend_model,
        "get_pricing": handle_get_pricing,
        "get_usage": handle_get_usage,
        "upload_file": handle_upload_file,
        # Image tools
        "generate_image": handle_generate_image,
        "generate_image_structured": handle_generate_image_structured,
        "generate_image_from_image": handle_generate_image_from_image,
        # Video tools
        "generate_video": handle_generate_video,
        "generate_video_from_image": handle_generate_video_from_image,
        "generate_video_from_video": handle_generate_video_from_video,
        # Audio tools
        "generate_music": handle_generate_music,
    }

    # Tools that don't require a queue strategy
    NO_QUEUE_TOOLS = {
        "list_models",
        "recommend_model",
        "get_pricing",
        "get_usage",
        "upload_file",
    }
  • Re-export of the handle_upload_file function in handlers/__init__.py for easy import in server files.
    from fal_mcp_server.handlers.utility_handlers import (
        handle_get_pricing,
        handle_get_usage,
        handle_list_models,
        handle_recommend_model,
        handle_upload_file,
    )
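
A hedged sketch of exercising the handler directly, for example in a quick manual test. It assumes handle_upload_file is importable from fal_mcp_server.handlers (per the re-export above); because the registry argument is unused by this handler, None stands in for a real ModelRegistry here:

    import asyncio

    from fal_mcp_server.handlers import handle_upload_file

    async def main() -> None:
        # registry is ignored by this handler, so None is enough for a smoke
        # test (assumption; the server passes a real ModelRegistry).
        result = await handle_upload_file({"file_path": "/tmp/example.png"}, None)
        print(result[0].text)  # either the ✅ message with the URL or a ❌ error

    asyncio.run(main())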
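
A hedged sketch, not the server's actual dispatch code, of how the TOOL_HANDLERS mapping and the NO_QUEUE_TOOLS set above could be consulted when a tool call arrives; dispatch_tool is a hypothetical helper:

    from typing import Any, Dict, List

    from mcp.types import TextContent

    async def dispatch_tool(name: str, arguments: Dict[str, Any], registry) -> List[TextContent]:
        # Hypothetical dispatch helper: look up the registered handler by name.
        handler = TOOL_HANDLERS.get(name)
        if handler is None:
            return [TextContent(type="text", text=f"❌ Unknown tool: {name}")]
        if name in NO_QUEUE_TOOLS:
            # upload_file and the other utility tools run directly.
            return await handler(arguments, registry)
        # Generation tools would first be routed through the server's queue
        # strategy; that path is omitted from this sketch.
        raise NotImplementedError("queued dispatch not shown here")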
