Manim MCP Server

A Model Context Protocol (MCP) server for compiling and serving Manim animations.

🎯 Two Server Modes

  1. HTTP API Server (app/server.py) - For REST API calls, testing, and web integration

  2. Standard MCP Server (mcp_server.py) - For Claude Desktop, Dify, and other MCP clients

See MCP_SETUP.md for detailed MCP configuration instructions.

A FastAPI-based Model Context Protocol (MCP) server that provides two main tools:

  1. Manim Compile: Compile Manim code and return a video ID

  2. Video Download: Download a compiled Manim video by ID
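
An end-to-end client for the HTTP API might look like the following minimal sketch (written with the third-party requests library; it assumes the compile response exposes the file_id referenced in the download examples below, so treat the exact response schema as an assumption):

    # Sketch: compile a scene over the HTTP API, then download the resulting video.
    # Assumes the compile response contains a top-level "file_id" field (see the
    # download examples later in this README); the exact schema is an assumption.
    import requests

    BASE_URL = "http://localhost:8000"

    SCENE_CODE = (
        "from manim import *\n"
        "class Example(Scene):\n"
        "    def construct(self):\n"
        "        circle = Circle()\n"
        "        self.play(Create(circle))\n"
    )

    # 1. Compile the Manim code.
    compile_resp = requests.post(
        f"{BASE_URL}/tools/manim_compile",
        json={"parameters": {"code": SCENE_CODE, "scene_name": "Example"}},
        timeout=300,
    )
    compile_resp.raise_for_status()
    file_id = compile_resp.json()["file_id"]

    # 2. Download the rendered video by ID.
    video_resp = requests.get(f"{BASE_URL}/videos/{file_id}", timeout=300)
    video_resp.raise_for_status()
    with open("output.mp4", "wb") as f:
        f.write(video_resp.content)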

Features

  • Secure authentication using JWT tokens

  • LangGraph integration for workflow management

  • Support for different video qualities and resolutions

  • Simple API endpoints for integration

Prerequisites

  • Python 3.8+

  • Manim Community Edition (v0.19.0 or later)

  • FFmpeg

  • Required Python packages (see requirements.txt)

Installation

  1. Clone the repository:

    git clone <repository-url>
    cd manim-mcp-server
  2. Create a virtual environment and activate it:

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
  3. Install the required packages:

    pip install -r requirements.txt
  4. Install Manim and its dependencies:

    pip install manim

Configuration

  1. Set up environment variables (create a .env file):

    SECRET_KEY=your-secret-key-here
    ACCESS_TOKEN_EXPIRE_MINUTES=30

Running the Server

Option 1: Using the startup script (recommended)

./start_server.sh

Option 2: Using uvicorn directly

uvicorn app.server:app --reload

The server will be available at http://localhost:8000

API Documentation

Once the server is running, you can access the interactive API documentation (FastAPI's defaults) at:

  • Swagger UI: http://localhost:8000/docs

  • ReDoc: http://localhost:8000/redoc

API Endpoints

Root

  • GET / - Get server information and available tools

Manim Compilation

  • POST /tools/manim_compile - Compile Manim code

    { "parameters": { "code": "from manim import *\nclass Example(Scene):\n def construct(self):\n circle = Circle()\n self.play(Create(circle))", "scene_name": "Example" } }

    Parameters:

    • code (required): The Manim Python code to compile

    • scene_name (required): Name of the specific scene class to compile
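
    Unescaped, the code string above corresponds to the Manim file below; scene_name ("Example") must match the name of a Scene subclass defined in the submitted code:

        # Unescaped equivalent of the "code" string in the request body above.
        # scene_name="Example" selects this class for rendering.
        from manim import *

        class Example(Scene):
            def construct(self):
                circle = Circle()
                self.play(Create(circle))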

Video Download

  • GET /videos/{file_id} - Download a compiled video by ID

LangGraph Compatible Endpoints

  • GET /v1/tools - List all available tools

  • POST /v1/tools/call - Call a tool (LangGraph compatible)

    { "tool": "manim_compile", "parameters": { "code": "from manim import *\nclass Example(Scene):\n def construct(self):\n circle = Circle()\n self.play(Create(circle))" } }

Example Usage

1. Check server status

curl http://localhost:8000/

2. Compile Manim code

curl -X 'POST' \
  'http://localhost:8000/tools/manim_compile' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
    "parameters": {
      "code": "from manim import *\nclass Example(Scene):\n    def construct(self):\n        circle = Circle()\n        self.play(Create(circle))"
    }
  }'

3. Download the compiled video

# Replace VIDEO_ID with the file_id from the compile response
curl -X 'GET' \
  'http://localhost:8000/videos/VIDEO_ID' \
  --output output.mp4

4. Compile a specific scene by name

curl -X 'POST' \
  'http://localhost:8000/tools/manim_compile' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
    "parameters": {
      "code": "from manim import *\nclass Scene1(Scene):\n    def construct(self):\n        circle = Circle()\n        self.play(Create(circle))\n\nclass Scene2(Scene):\n    def construct(self):\n        square = Square()\n        self.play(Create(square))",
      "scene_name": "Scene1"
    }
  }'

5. List available tools

curl http://localhost:8000/v1/tools

6. Run the example script

python example_usage.py

Testing

See TESTING.md for detailed testing instructions.

Quick test:

# Run tool tests (no server needed)
python test_tools.py

# Run API tests (server must be running)
python test_api.py

Security

  • Always use HTTPS in production

  • Consider adding authentication for production deployments

  • Validate and sanitize all user inputs

  • Set appropriate CORS policies for your use case
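
As one example of the last point, a minimal sketch of a restrictive CORS policy in FastAPI (the origin shown is hypothetical, and it assumes the application instance is the one created in app/server.py):

    # Sketch: restrict CORS to known origins instead of allowing everything.
    # The origin below is hypothetical; in this project the FastAPI instance
    # is assumed to live in app/server.py.
    from fastapi import FastAPI
    from fastapi.middleware.cors import CORSMiddleware

    app = FastAPI()

    app.add_middleware(
        CORSMiddleware,
        allow_origins=["https://your-frontend.example.com"],  # hypothetical origin
        allow_methods=["GET", "POST"],
        allow_headers=["Authorization", "Content-Type"],
    )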

License

This project is licensed under the MIT License - see the LICENSE file for details.
