
crabsmadethis/d2r-horadric-tools

d2r_mod_build

Build a Diablo II Resurrected mod by combining vanilla game files with custom overlays and scripts into a single build directory. Optionally detect conflicts, skip regeneration, or set game directory.

Instructions

Build mod from vanilla + overlays + scripts into build/.

Input Schema

Name            Required  Default  Description
warn_conflicts  No        false    Report cells modified by more than one overlay
no_regen        No        false    Skip chargen data regeneration
game_dir        No        (none)   Path to the D2R installation directory
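All three parameters are optional, so a minimal call can pass no arguments at all. As a sketch, a tool-call payload might look like the following (field names come from the schema above; the surrounding envelope shape is an assumption about the MCP client, not something this page specifies):

```python
# Hypothetical MCP tool-call payload for d2r_mod_build.
# Omitted fields fall back to the handler defaults (False, False, None).
import json

call = {
    "name": "d2r_mod_build",
    "arguments": {
        "warn_conflicts": True,   # report cells touched by multiple overlays
        "no_regen": False,        # still regenerate chargen data
        # "game_dir" omitted -> handler uses DEFAULT_GAME_DIR
    },
}
print(json.dumps(call, indent=2))
```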

Implementation Reference

  • The @mcp.tool() decorator registers the d2r_mod_build tool on the FastMCP server. The handler delegates to _mod_build (imported from d2r_mcp.mod).
    @mcp.tool()
    async def d2r_mod_build(warn_conflicts: bool = False, no_regen: bool = False,
                            game_dir: str | None = None) -> dict:
        """Build mod from vanilla + overlays + scripts into build/."""
        return _mod_build(warn_conflicts=warn_conflicts, no_regen=no_regen,
                          game_dir=game_dir)
  • The build() function in d2r_mcp/mod.py is the actual handler. It calls d2r_mod.build.build_mod() with vanilla_dir, overlays_dir, scripts_dir, build_dir, and returns an envelope with warnings.
    def build(warn_conflicts: bool = False, no_regen: bool = False,
              game_dir: str | None = None) -> dict:
        """Build mod from vanilla + overlays + scripts.
    
        Returns envelope with captured warnings from build_mod.
        """
        from d2r_mod.build import build_mod, DEFAULT_GAME_DIR as _DEFAULT
        root = _project_root()
        try:
            warnings = build_mod(
                vanilla_dir=os.path.join(root, "vanilla"),
                overlays_dir=os.path.join(root, "overlays"),
                scripts_dir=os.path.join(root, "scripts"),
                build_dir=os.path.join(root, "build"),
                regen=not no_regen,
                game_dir=game_dir or _DEFAULT,
                warn_conflicts=warn_conflicts,
            )
        except FileNotFoundError as ex:
            return error("missing_dir", str(ex))
        except Exception as ex:
            return error("build_exception", f"{type(ex).__name__}: {ex}")
        return ok(warnings=list(warnings) if warnings else [],
                  build_dir=os.path.join(root, "build"))
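The ok() and error() helpers that build() returns through are not shown in this listing. A plausible reconstruction follows; only the call signatures are taken from the code above, and the envelope field names ("status", "error_code", "message") are assumptions:

```python
# Hypothetical reconstruction of the envelope helpers used by build().
# Signatures match the call sites above; field names are assumptions.
def ok(**fields) -> dict:
    """Success envelope: a status marker plus any keyword payload fields."""
    return {"status": "ok", **fields}

def error(code: str, message: str) -> dict:
    """Failure envelope carrying a machine-readable code and a message."""
    return {"status": "error", "error_code": code, "message": message}

# Mirrors the two outcomes in build() (paths here are illustrative):
success = ok(warnings=[], build_dir="/project/build")
failure = error("missing_dir", "vanilla/ not found at /project/vanilla")
```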
  • The core build_mod() function orchestrates the full build pipeline: loads .txt files, applies YAML overlays, runs Python scripts, writes modified .txt files, copies non-.txt files, runs JSON patch scripts, patches .tbl string tables, registers custom unique display names, builds string registry, patches JSON string files, writes modinfo.json/dataversionbuild.txt, and optionally regenerates chargen data.
    def build_mod(
        vanilla_dir: str,
        overlays_dir: str,
        scripts_dir: str,
        build_dir: str,
        regen: bool = True,
        game_dir: str = DEFAULT_GAME_DIR,
        warn_conflicts: bool = False,
    ) -> list[str]:
        """Run the full build pipeline. Returns list of warning strings."""
        warnings = []
    
        if not os.path.isdir(vanilla_dir):
            raise FileNotFoundError(f"vanilla/ not found at {vanilla_dir}")
        txt_files = _find_txt_files(vanilla_dir)
        if not txt_files:
            raise FileNotFoundError(f"No .txt files in {vanilla_dir}")
    
        if not os.path.isdir(overlays_dir):
            warnings.append(f"Overlays directory not found: {overlays_dir}")
        if not os.path.isdir(scripts_dir):
            warnings.append(f"Scripts directory not found: {scripts_dir}")
    
        stale_warning = check_stale(vanilla_dir, game_dir)
        if stale_warning:
            warnings.append(stale_warning)
            print(f"WARNING: {stale_warning}")
    
        # Step 1: Load all .txt files
        tables: dict[str, list[dict]] = {}
        headers: dict[str, list[str]] = {}
        for rel_path, abs_path in txt_files.items():
            rows = read_tsv_file(abs_path)
            tables[rel_path] = rows
            if rows:
                headers[rel_path] = list(rows[0].keys())
    
        # Step 2: Apply overlays
        overlay_paths = sorted(glob.glob(os.path.join(overlays_dir, "*.yaml")))
        touched_cells: dict[tuple, str] = {}
    
        for ov_path in overlay_paths:
            overlay = load_overlay_file(ov_path)
            target = overlay["target"]
            if target not in tables:
                raise FileNotFoundError(
                    f"Overlay {os.path.basename(ov_path)} targets {target} "
                    f"which does not exist in vanilla/"
                )
    
            if warn_conflicts:
                for change in overlay.get("changes", []):
                    selector_key = tuple(sorted(change["row"].items()))
                    for op in ("set", "multiply", "add"):
                        if op in change:
                            for col in change[op]:
                                cell_key = (target, selector_key, col)
                                if cell_key in touched_cells:
                                    prev = touched_cells[cell_key]
                                    warnings.append(
                                        f"Conflict: {target} [{dict(selector_key)}] "
                                        f"column '{col}' touched by both "
                                        f"{prev} and {os.path.basename(ov_path)}"
                                    )
                                touched_cells[cell_key] = os.path.basename(ov_path)
    
            ov_warnings = apply_overlay(tables[target], overlay)
            warnings.extend(ov_warnings)
    
        # Step 3: Run scripts
        # Scripts named new_*.py are allowed to append rows (allow_add=True).
        # Scripts matching *-ui-json.py are JSON patch scripts (run in Step 5b).
        script_paths = sorted(glob.glob(os.path.join(scripts_dir, "*.py")))
        for script_path in script_paths:
            basename = os.path.basename(script_path)
            if basename.endswith("-ui-json.py"):
                continue  # handled in Step 5b
            allow_add = basename.startswith("new_")
            script_warnings = run_script(script_path, tables, allow_add=allow_add)
            warnings.extend(script_warnings)
    
        # Step 4: Write modified .txt to build/
        if os.path.exists(build_dir):
            shutil.rmtree(build_dir)
    
        for rel_path, rows in tables.items():
            out_path = os.path.join(build_dir, rel_path)
            write_tsv_file(out_path, rows, headers.get(rel_path))
    
        # Step 5: Copy non-.txt files through
        all_files = _find_all_files(vanilla_dir)
        for rel_path, abs_path in all_files.items():
            if rel_path in tables:
                continue
            out_path = os.path.join(build_dir, rel_path)
            os.makedirs(os.path.dirname(out_path), exist_ok=True)
            shutil.copy2(abs_path, out_path)
    
        # Step 5b: Run JSON patch scripts against build/
        json_scripts = sorted(glob.glob(os.path.join(scripts_dir, "*-ui-json.py")))
        for script_path in json_scripts:
            spec = importlib.util.spec_from_file_location("json_patch", script_path)
            mod = importlib.util.module_from_spec(spec)
            spec.loader.exec_module(mod)
            if hasattr(mod, "apply"):
                result = mod.apply(build_dir)
                if result:
                    warnings.extend(result)
    
        # Step 5c: Patch .tbl string tables
        from d2r_mod.assets.tbl import patch_tbl
        patches_strings_dir = os.path.join(
            os.path.dirname(overlays_dir), "patches", "strings"
        )
        if os.path.isdir(patches_strings_dir):
            string_yamls = sorted(glob.glob(
                os.path.join(patches_strings_dir, "*.yaml")
            ))
            for yaml_path in string_yamls:
                basename = os.path.basename(yaml_path)
                if basename.startswith("_"):
                    continue  # skip smoke tests / templates
                with open(yaml_path) as f:
                    config = yaml.safe_load(f)
                target = config.get("target", "")
                entries_list = config.get("entries", [])
                if not target or not entries_list:
                    continue
                overrides = {e["key"]: e["value"] for e in entries_list}
                # Find all matching .tbl files in build output (one per language)
                tbl_paths = []
                for root, _, files in os.walk(build_dir):
                    for fn in files:
                        if fn == target:
                            tbl_paths.append(os.path.join(root, fn))
                if not tbl_paths:
                    warnings.append(f"StringPatch: target not found: {target}")
                    continue
                for tbl_path in tbl_paths:
                    patch_tbl(tbl_path, overrides, tbl_path)
                warnings.append(
                    f"StringPatch: patched {len(overrides)} strings in {target} "
                    f"({len(tbl_paths)} files) ({basename})"
                )
    
        # Step 5d: Auto-register custom unique display names in expansionstring.tbl
        # Any UniqueItems.txt index that is absent from the vanilla key corpus is
        # added as a name→name entry so D2R can resolve the display string.
        # Names already served by JSON (vanilla item-names.json or any
        # patches/json_strings/ patch) are skipped — D2R reads item names from JSON
        # not TBL, so a TBL write for a JSON-served key is dead weight
        # (feedback_strings_json_vs_tbl.md).
        from d2r_mod.build_steps.register_custom_uniques import (
            run as _register_custom_uniques,
            load_vanilla_keys as _load_vanilla_keys,
            DEFAULT_TARGET_TBL as _CUSTOM_UNIQUES_TBL,
        )
        _unique_items_build_path = os.path.join(
            build_dir, "data", "global", "excel", "UniqueItems.txt"
        )
        if os.path.exists(_unique_items_build_path):
            _vanilla_keys = _load_vanilla_keys()
            from tools.audit_string_registry import (
                _load_vanilla_index as _load_json_vanilla_index,
                _load_patch_keys as _load_json_patch_keys,
            )
            _json_vanilla_keys = set(
                _load_json_vanilla_index(
                    os.path.join(vanilla_dir, "data", "local", "lng", "strings")
                )
            )
            _json_patch_keys = _load_json_patch_keys(
                os.path.join(os.path.dirname(overlays_dir), "patches", "json_strings")
            )
            _json_served = _json_vanilla_keys | _json_patch_keys
            # Register into eng only (English); multi-lang extension is a future concern.
            _target_tbl_path = os.path.join(
                build_dir, "data", "local", "lng", "eng",
                f"{_CUSTOM_UNIQUES_TBL}.tbl"
            )
            _reg_result = _register_custom_uniques(
                _unique_items_build_path, _target_tbl_path, _vanilla_keys,
                json_served_names=_json_served,
            )
            _msg = (
                f"CustomUniques: registered {_reg_result['added']} new name(s) in "
                f"eng/{_CUSTOM_UNIQUES_TBL}.tbl "
                f"(skipped {_reg_result['skipped']} vanilla/existing"
            )
            if _reg_result.get("skipped_json"):
                _msg += f", skipped {_reg_result['skipped_json']} json-served"
            _msg += ")"
            warnings.append(_msg)
        else:
            warnings.append("CustomUniques: UniqueItems.txt not found in build — skipping")
    
        # Step 5e: Build string registry (custom strings for runtime injection)
        # Diffs built .tbl files against vanilla to produce a flat key→value
        # registry consumed by the runtime string injector.
        from d2r_mod.build_steps.build_string_registry import run as _build_string_registry
        _str_registry = _build_string_registry(
            build_dir=build_dir,
            vanilla_dir=vanilla_dir,
            write=True,
        )
        _total_custom = sum(len(v) for v in _str_registry.values())
        if _total_custom:
            warnings.append(
                f"StringRegistry: {_total_custom} custom string(s) across "
                f"{len(_str_registry)} table(s) → string_registry.json"
            )
        else:
            warnings.append("StringRegistry: no custom strings detected")
    
        # Step 5f: Patch JSON string files (new keys for D2R's JSON string system)
        _json_patches_dir = os.path.join(
            os.path.dirname(overlays_dir), "patches", "json_strings"
        )
        if os.path.isdir(_json_patches_dir):
            from d2r_mod.build_steps.patch_json_strings import run as _patch_json_strings
            _json_result = _patch_json_strings(
                patches_dir=_json_patches_dir,
                vanilla_dir=vanilla_dir,
                build_dir=build_dir,
            )
            if _json_result["added"]:
                warnings.append(
                    f"JsonStrings: added {_json_result['added']} new key(s) to "
                    f"{', '.join(_json_result['files'])}"
                )
            if _json_result.get("overridden"):
                warnings.append(
                    f"JsonStrings: overrode {_json_result['overridden']} existing key(s)"
                )
        else:
            warnings.append("JsonStrings: no patches/json_strings/ directory — skipping")
    
        # Step 6: Write modinfo.json (required for D2R to load mod .txt files)
        import json
        modinfo_path = os.path.join(build_dir, "modinfo.json")
        with open(modinfo_path, "w") as f:
            json.dump({"name": "rebalance", "savepath": "../"}, f, indent=2)
            f.write("\n")
    
        # Step 6b: Write dataversionbuild.txt (prevents "Data version mismatch" warning)
        # Read build number from .build.info, or copy from vanilla/ if already extracted
        dvb_vanilla = os.path.join(vanilla_dir, "data", "global", "dataversionbuild.txt")
        dvb_out = os.path.join(build_dir, "data", "global", "dataversionbuild.txt")
        if os.path.exists(dvb_vanilla):
            os.makedirs(os.path.dirname(dvb_out), exist_ok=True)
            shutil.copy2(dvb_vanilla, dvb_out)
        elif game_dir is not None:
            # Generate from .build.info
            build_info_path = os.path.join(game_dir, ".build.info")
            if os.path.exists(build_info_path):
                with open(build_info_path, "r") as f:
                    lines = f.read().strip().split("\n")
                headers = [h.split("!")[0] for h in lines[0].split("|")]
                values = lines[1].split("|")
                for h, v in zip(headers, values):
                    if h == "Version":
                        # Version is like "3.1.92198" — take the last component
                        build_num = v.strip().rsplit(".", 1)[-1]
                        os.makedirs(os.path.dirname(dvb_out), exist_ok=True)
                        with open(dvb_out, "w") as f:
                            f.write(build_num)
                        break
    
        # Step 7: Regen chargen data
        if regen:
            from d2r_mod.regen import regen_all
            regen_all(build_dir)
    
        return warnings
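The conflict detection in Step 2 keys every modified cell by the tuple (target file, row selector, column), so two overlays that touch the same cell collide regardless of operation. A self-contained sketch of that bookkeeping, with invented overlay contents for illustration:

```python
# Standalone sketch of the Step 2 conflict check: two overlays that both
# set the same column on the same row of the same target file.
# The overlay filenames and contents below are invented for illustration.
overlays = [
    ("10-buff.yaml", {"target": "UniqueItems.txt",
                      "changes": [{"row": {"index": "The Grandfather"},
                                   "set": {"mindam": "200"}}]}),
    ("20-nerf.yaml", {"target": "UniqueItems.txt",
                      "changes": [{"row": {"index": "The Grandfather"},
                                   "set": {"mindam": "150"}}]}),
]

touched_cells: dict[tuple, str] = {}
conflicts: list[str] = []
for name, overlay in overlays:
    target = overlay["target"]
    for change in overlay.get("changes", []):
        # Sort the selector so equivalent row matchers hash identically.
        selector_key = tuple(sorted(change["row"].items()))
        for op in ("set", "multiply", "add"):
            for col in change.get(op, {}):
                cell_key = (target, selector_key, col)
                if cell_key in touched_cells:
                    conflicts.append(
                        f"Conflict: {target} column '{col}' touched by both "
                        f"{touched_cells[cell_key]} and {name}"
                    )
                touched_cells[cell_key] = name  # last writer wins

print(conflicts)  # one conflict: mindam set by both overlays
```

Note that the later overlay still applies; warn_conflicts only reports the overlap, it does not block the build.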
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

No annotations provided, and the description only states the action without revealing behavioral traits such as side effects (e.g., file creation), permissions needed, or idempotency. The mention of 'into build/' implies file output but lacks specificity.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 4/5

Is the description appropriately sized, front-loaded, and free of redundancy?

Extremely concise single sentence with no wasted words. However, it could be slightly expanded to improve completeness without losing conciseness.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 2/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given 3 optional parameters and no output schema, the description is too minimal. It fails to explain concepts like overlays/scripts, the structure of build/, or return value. Sibling tools provide context but the description itself is incomplete.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 1/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

The schema defines three parameters, none of which are documented in the description. The description adds no meaning beyond the schema field names, so an agent gets no help on what these parameters do or how to set them.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

Clearly states the verb 'Build' and the resource 'mod' with inputs (vanilla, overlays, scripts) and output destination (build/). Distinguishes from sibling tools like d2r_mod_clean and d2r_mod_deploy.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 2/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

No guidance on when to use this tool versus alternatives, no prerequisites or conditions mentioned. Relies solely on the tool name to imply usage.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
