Animagine MCP

FastMCP server for the Animagine XL 4.0 image generation experience, providing prompt validation, optimization, explanation, and checkpoint/LoRA management tools.

Overview

  • Exposes validate_prompt, optimize_prompt, explain_prompt, list_models, load_checkpoint, and unload_loras through FastMCP.

  • Normalizes prompts for consistent structure, category coverage, and tag ordering before handing them to the diffusion pipeline (see the sketch after this list).

  • Integrates with local checkpoint and LoRA assets stored under checkpoints/ and loras/.

  • Encourages responsible use: the platform can technically generate NSFW material, but choosing to do so and owning the resulting content is the caller's responsibility.
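
The normalization itself lives inside the package; as a rough illustration of the idea (the category names and helper below are assumptions, not the project's actual internals), tag reordering with trailing quality tags can be sketched like this:

    # Hypothetical sketch of tag normalization; categories and helper name are
    # illustrative assumptions, not the real animagine_mcp internals.
    CATEGORY_ORDER = ["subject", "character", "style", "detail", "quality"]

    def reorder_tags(tags_by_category: dict[str, list[str]]) -> str:
        """Join tags so that quality tags end up trailing."""
        ordered: list[str] = []
        for category in CATEGORY_ORDER:
            ordered.extend(tags_by_category.get(category, []))
        return ", ".join(ordered)

    print(reorder_tags({
        "quality": ["masterpiece", "high score"],
        "subject": ["1girl", "solo"],
        "style": ["watercolor"],
    }))
    # -> 1girl, solo, watercolor, masterpiece, high score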

Requirements

  • Python >= 3.10

  • GPU with CUDA support for production-grade generation (or compatible Accelerate/torch backends); a quick availability check is sketched after this list

  • git plus a package tool such as pip, poetry, or hatch
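
To confirm the CUDA requirement before starting the server, a quick check with PyTorch (assuming torch is already installed as part of the diffusion stack) looks like this:

    import torch  # assumed to be installed alongside the project's diffusion dependencies

    if torch.cuda.is_available():
        print("CUDA device:", torch.cuda.get_device_name(0))
    else:
        print("No CUDA device detected; generation will be slow or unavailable.")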

Installation and usage

  1. Clone the repository and create a virtual environment:

    python -m venv .venv
    source .venv/bin/activate
  2. Install the main dependencies:

    pip install -e .
  3. (Optional) Install development dependencies:

    pip install -e .[dev]
  4. Start the MCP server:

    animagine-mcp

    This registers the FastMCP tools defined in src/animagine_mcp/server.py and exposes them to MCP clients.
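
The exact registration code lives in src/animagine_mcp/server.py; the sketch below shows the usual FastMCP pattern, with an assumed import path and a placeholder tool body rather than the project's real implementation:

    from fastmcp import FastMCP  # assumes the standalone fastmcp package

    mcp = FastMCP("animagine-mcp")

    @mcp.tool()
    def validate_prompt(prompt: str, width: int = 832, height: int = 1216,
                        negative_prompt: str | None = None) -> dict:
        """Placeholder body; the real checks live in the project's server module."""
        return {"prompt": prompt, "valid": True}

    if __name__ == "__main__":
        mcp.run()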

Core tools

  • validate_prompt(prompt, width=832, height=1216, negative_prompt=None) – enforces quality rules, tag ordering, resolution compatibility, and other prompt health checks (see the example call after this list).

  • optimize_prompt(description=None, prompt=None) – restructures tags, fills missing categories, and keeps quality tags trailing.

  • explain_prompt(prompt) – breaks down each tag by category, intent, and effect while presenting canonically ordered prompts.

  • list_models() – lists available checkpoints, LoRAs, and currently loaded weights.

  • load_checkpoint(checkpoint=None) – preloads a specific checkpoint (or uses the Animagine XL 4.0 default) to reduce latency.

  • unload_loras() – strips LoRAs from the pipeline so the base checkpoint styling can be restored quickly.
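
From an MCP client these tools are invoked by name. A minimal sketch using the fastmcp client, where the connection target and argument values are illustrative assumptions:

    import asyncio
    from fastmcp import Client  # assumes the standalone fastmcp package

    async def main() -> None:
        # Point the client at the server script; stdio vs. HTTP depends on your setup.
        async with Client("src/animagine_mcp/server.py") as client:
            result = await client.call_tool(
                "validate_prompt",
                {"prompt": "1girl, solo, watercolor, masterpiece", "width": 832, "height": 1216},
            )
            print(result)

    asyncio.run(main())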

Repository layout

  • src/animagine_mcp/ – core package with contracts, prompt processing, diffusion wiring, and the FastMCP server.

  • checkpoints/ – optional .safetensors/.ckpt files referenced by load_checkpoint.

  • loras/ – LoRA modifiers for stylistic tweaks and performance-aligned variants.

  • pyproject.toml – metadata, dependencies, scripts, and build configuration.

  • 02-behavior/ through 05-implementation/ – documentation, standards, and implementation notes that guide the MCP and Codex behaviors.

Suggested workflow

  1. Run animagine-mcp so the tools become available (an end-to-end sketch follows this list).

  2. Use validate_prompt to inspect the user prompt for issues before generation.

  3. Apply optimize_prompt or explain_prompt as needed to refine or understand prompt structure.

  4. Load a checkpoint/LoRA with load_checkpoint/unload_loras before invoking downstream generation.

  5. Reference list_models whenever you need to know what weights are available or loaded.
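
Chained together, the workflow might look roughly like the following client-side sketch (the connection target and prompt values are illustrative assumptions; tool names match the Core tools list):

    import asyncio
    from fastmcp import Client  # assumes the standalone fastmcp package

    async def workflow() -> None:
        async with Client("src/animagine_mcp/server.py") as client:
            # See which checkpoints and LoRAs are available or loaded.
            print(await client.call_tool("list_models", {}))

            # Preload the default checkpoint and clear any stale LoRAs.
            await client.call_tool("load_checkpoint", {})
            await client.call_tool("unload_loras", {})

            # Turn a rough description into a structured prompt.
            print(await client.call_tool(
                "optimize_prompt", {"description": "a girl reading under a tree at dusk"}
            ))

            # Check the prompt before handing it to generation.
            print(await client.call_tool(
                "validate_prompt", {"prompt": "1girl, reading, tree, dusk, watercolor, masterpiece"}
            ))

    asyncio.run(workflow())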

Development notes

  • Run tests (once they exist) with pytest tests/, adjusting the path if the suite lives elsewhere.

  • Apply formatting and linting tools (e.g., ruff, black) as configured in your workflow.

  • Keep documentation, README, and inline comments aligned with code changes.

Contributions & behavior

  • Open clear pull requests that describe the issue, resolution, and linked issues when applicable.

  • Include tests and documentation updates for new tools, contracts, or behaviors.

  • Promote responsible use; the server enables NSFW generation only when the caller deliberately requests it.

Support

Open an issue describing the desired workflow, including relevant prompts and logs (omit sensitive content). We aim for transparent, responsible guidance.
