
Luma MCP Server

A server that exposes Luma AI's video generation API through the Model Context Protocol (MCP).

🌟 Overview

Luma MCP Server exposes Luma AI's video generation capabilities as an MCP server, allowing you to generate videos from text and images, extend existing videos, and interpolate between two videos.

🏗️ Project structure

src/
├── types/              - Type definitions
│   ├── schemas.ts      - Input schemas
│   └── types.ts        - Common type definitions
├── services/           - Business logic
├── handlers/           - Request handlers
│   └── tool-handlers.ts
├── clients/            - External API clients
│   └── luma-client.ts
├── utils/              - Utilities
│   └── error-handler.ts
├── config/             - Configuration
│   └── server-config.ts
└── index.ts            - Entry point

📦 Installation

npm install @sunwood-ai-labs/luma-mcp-server
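
To use the server from an MCP client, register it in the client's MCP server configuration. The exact file and field names depend on your client; the snippet below is a sketch that assumes the package can be launched with npx, and the entry name luma is arbitrary:

{
  "mcpServers": {
    "luma": {                                          // entry name is your choice
      "command": "npx",                                // assumes the package provides a CLI entry point
      "args": ["-y", "@sunwood-ai-labs/luma-mcp-server"],
      "env": {
        "LUMA_API_KEY": "your_api_key_here"
      }
    }
  }
}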

⚙️ Configuration

  1. Obtain a Luma API key
  2. Set the environment variable:
    export LUMA_API_KEY=your_api_key_here
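
The server reads the key from the environment at startup. As a rough sketch of what the check in config/server-config.ts might look like (illustrative, not the actual source):

// Illustrative sketch only; the real config/server-config.ts may differ.
const apiKey = process.env.LUMA_API_KEY;
if (!apiKey) {
  // Fail fast with a clear message when the key is missing.
  throw new Error("LUMA_API_KEY environment variable is not set");
}
export const serverConfig = { lumaApiKey: apiKey };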

🛠️ Available tools

generate_video

Generate videos from text prompts.

{
  name: 'generate_video',
  arguments: {
    prompt: "A teddy bear in sunglasses playing electric guitar and dancing",
    loop: true,                                      // optional
    callback_url: "https://your-callback-url.com"    // optional
  }
}

generate_video_from_image

Generate a video using the image as the starting frame.

{
  name: 'generate_video_from_image',
  arguments: {
    prompt: "Low-angle shot of a majestic tiger prowling through a snowy landscape",
    image_url: "https://your-image-url.com/start-frame.jpg",
    loop: true,                                      // optional
    callback_url: "https://your-callback-url.com"    // optional
  }
}

extend_video

Extend an existing video.

{
  name: 'extend_video',
  arguments: {
    prompt: "Continue the dance sequence",
    source_generation_id: "existing-video-generation-id",
    loop: true,                                      // optional
    callback_url: "https://your-callback-url.com"    // optional
  }
}

interpolate_videos

Smoothly interpolate between two existing videos.

{
  name: 'interpolate_videos',
  arguments: {
    prompt: "Create a smooth transition between the videos",
    start_generation_id: "first-video-generation-id",
    end_generation_id: "second-video-generation-id",
    callback_url: "https://your-callback-url.com"    // optional
  }
}
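
Each of these tools is invoked as a standard MCP tool call. The sketch below uses the MCP TypeScript SDK from a client; the stdio launch command and the client name are assumptions for illustration, not taken from this project's documentation:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio (the npx command is an assumption about how the package is published).
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@sunwood-ai-labs/luma-mcp-server"],
  env: { LUMA_API_KEY: process.env.LUMA_API_KEY ?? "" },
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Call generate_video with the same arguments shown above.
const result = await client.callTool({
  name: "generate_video",
  arguments: {
    prompt: "A teddy bear in sunglasses playing electric guitar and dancing",
    loop: true,
  },
});
console.log(result.content);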

🔧 Developer Information

Architecture

  • Type definitions (types/):
    • schemas.ts: Input validation schemas using Zod
    • types.ts: Common type definitions and interfaces
  • Handlers (handlers/):
    • tool-handlers.ts: MCP tool request handling
  • Clients (clients/):
    • luma-client.ts: Handles communication with the Luma AI API
  • Utilities (utils/):
    • error-handler.ts: Unified error handling
  • Configuration (config/):
    • server-config.ts: Centralized server configuration
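
As an illustration of the schemas.ts layer, an input schema for generate_video might look roughly like the following; the field names come from the tool examples above, but the exact schema in this repository may differ:

import { z } from "zod";

// Illustrative sketch of a generate_video input schema (not the actual source).
export const generateVideoSchema = z.object({
  prompt: z.string().min(1, "prompt is required"),
  loop: z.boolean().optional(),
  callback_url: z.string().url().optional(),
});

// Derive a TypeScript type from the schema for type-safe handlers.
export type GenerateVideoInput = z.infer<typeof generateVideoSchema>;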

Error Handling

  • Unified Error Handling System
  • Proper mapping to MCP error codes
  • Detailed error messages and logging
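
For example, mapping a failed Luma API call onto an MCP error code could look roughly like the sketch below, using the SDK's McpError and ErrorCode types; the function name and details are illustrative, not taken from error-handler.ts:

import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";

// Illustrative sketch: normalize unknown failures into MCP errors (not the actual error-handler.ts).
export function toMcpError(error: unknown): McpError {
  if (error instanceof McpError) {
    return error;
  }
  const message = error instanceof Error ? error.message : String(error);
  console.error("[luma-mcp-server] request failed:", message);
  return new McpError(ErrorCode.InternalError, `Luma API request failed: ${message}`);
}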

📝 Notes

  • Please write your prompts in English
  • Video generation may take some time
  • Please be aware of API usage restrictions

🤝 Contributions

  1. Fork this repository
  2. Create a new branch ( git checkout -b feature/amazing-feature )
  3. Commit the changes ( git commit -m '✨ feat: Add amazing feature' )
  4. Push the branch ( git push origin feature/amazing-feature )
  5. Create a pull request

📄 License

MIT License - see the LICENSE file for details.