
validate_ifc_model

Check IFC building models against IDS specifications to verify compliance with required data standards and identify validation issues.

Instructions

Validate an IFC model against the current session's IDS specifications.

This bonus feature leverages IfcTester's IFC validation capabilities.

Args:
    ifc_file_path: Path to IFC file
    ctx: FastMCP Context (auto-injected)
    report_format: "console", "json", or "html"

Returns (json format):

    {
      "status": "validation_complete",
      "total_specifications": 3,
      "passed_specifications": 2,
      "failed_specifications": 1,
      "report": {
        "specifications": [
          {
            "name": "Wall Fire Rating",
            "status": "passed",
            "applicable_entities": 25,
            "passed_entities": 25,
            "failed_entities": 0
          },
          ...
        ]
      }
    }
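The status reported for each specification can be derived from its entity counts alone. A minimal sketch of that rule, using a hypothetical helper name that is not part of the tool's API:

```python
def spec_status(passed_entities: int, failed_entities: int) -> str:
    """Derive a specification's status from its pass/fail entity counts.

    Any failed entity fails the specification; passed entities with no
    failures pass it; with no applicable entities, neither applies.
    """
    if failed_entities > 0:
        return "failed"
    if passed_entities > 0:
        return "passed"
    return "no_applicable_entities"
```

For example, `spec_status(25, 0)` yields `"passed"`, matching the "Wall Fire Rating" entry in the sample report above.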

Input Schema

Name           Required  Description                                  Default
ifc_file_path  Yes       Path to the IFC file to validate             -
report_format  No        Report format: "console", "json", or "html"  json
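Given that schema, the arguments of a tool call might look like the following sketch (the file path is a placeholder, not a real file):

```python
# Hypothetical arguments for a validate_ifc_model tool call.
arguments = {
    "ifc_file_path": "/path/to/model.ifc",  # required
    "report_format": "json",                # optional; defaults to "json"
}
```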

Implementation Reference

  • The core handler function for the 'validate_ifc_model' tool. It loads the IFC file using ifcopenshell, validates it against the current IDS session using IfcTester's ids.validate method, and returns a report in the specified format (json, console, html).
    async def validate_ifc_model(
        ifc_file_path: str,
        ctx: Context,
        report_format: str = "json"
    ) -> Dict[str, Any]:
        """
        Validate an IFC model against the current session's IDS specifications.

        This bonus feature leverages IfcTester's IFC validation capabilities.

        Args:
            ifc_file_path: Path to IFC file
            ctx: FastMCP Context (auto-injected)
            report_format: "console", "json", or "html"

        Returns (json format):
            {
                "status": "validation_complete",
                "total_specifications": 3,
                "passed_specifications": 2,
                "failed_specifications": 1,
                "report": {
                    "specifications": [
                        {
                            "name": "Wall Fire Rating",
                            "status": "passed",
                            "applicable_entities": 25,
                            "passed_entities": 25,
                            "failed_entities": 0
                        },
                        ...
                    ]
                }
            }
        """
        import json as json_lib

        try:
            ids_obj = await get_or_create_session(ctx)
            await ctx.info(f"Validating IFC model: {ifc_file_path}")

            # Validate file exists
            if not Path(ifc_file_path).exists():
                raise ToolError(f"IFC file not found: {ifc_file_path}")

            # Check has specifications
            if not ids_obj.specifications:
                raise ToolError("IDS has no specifications to validate against")

            # Load IFC file
            await ctx.info("Loading IFC file...")
            ifc_file = ifcopenshell.open(ifc_file_path)

            # Validate
            await ctx.info("Running validation...")
            ids_obj.validate(ifc_file)

            # Generate report
            if report_format == "console":
                reporter.Console(ids_obj).report()
                return {"status": "validation_complete", "format": "console"}

            elif report_format == "json":
                json_reporter = reporter.Json(ids_obj)
                json_reporter.report()
                raw_report = json_reporter.to_string()

                # Parse the JSON report to extract structured data
                try:
                    report_data = json_lib.loads(raw_report)

                    # Extract specification-level summary
                    specifications_summary = []
                    passed_count = 0
                    failed_count = 0

                    for spec in ids_obj.specifications:
                        # Count applicable, passed, and failed entities for this spec
                        applicable = 0
                        passed = 0
                        failed = 0

                        # IfcTester stores results in spec after validation
                        if hasattr(spec, 'requirements'):
                            for req in spec.requirements:
                                if hasattr(req, 'failed_entities'):
                                    failed += len(req.failed_entities) if req.failed_entities else 0
                                if hasattr(req, 'passed_entities'):
                                    passed += len(req.passed_entities) if req.passed_entities else 0

                        applicable = passed + failed
                        spec_status = (
                            "passed" if failed == 0 and applicable > 0
                            else "failed" if failed > 0
                            else "no_applicable_entities"
                        )

                        if spec_status == "passed":
                            passed_count += 1
                        elif spec_status == "failed":
                            failed_count += 1

                        specifications_summary.append({
                            "name": spec.name if spec.name else f"Specification {len(specifications_summary)}",
                            "status": spec_status,
                            "applicable_entities": applicable,
                            "passed_entities": passed,
                            "failed_entities": failed
                        })

                    return {
                        "status": "validation_complete",
                        "total_specifications": len(ids_obj.specifications),
                        "passed_specifications": passed_count,
                        "failed_specifications": failed_count,
                        "report": {
                            "specifications": specifications_summary,
                            "raw_json": report_data  # Include original report
                        }
                    }

                except Exception as parse_error:
                    # Fallback if parsing fails - return raw report
                    await ctx.warning(f"Could not parse report structure: {parse_error}")
                    return {
                        "status": "validation_complete",
                        "total_specifications": len(ids_obj.specifications),
                        "format": "json",
                        "report": raw_report
                    }

            elif report_format == "html":
                html_reporter = reporter.Html(ids_obj)
                html_reporter.report()
                return {
                    "status": "validation_complete",
                    "total_specifications": len(ids_obj.specifications),
                    "format": "html",
                    "html": html_reporter.to_string()
                }

            else:
                raise ToolError(f"Invalid report format: {report_format}. Must be 'console', 'json', or 'html'")

        except FileNotFoundError as e:
            await ctx.error(f"File not found: {str(e)}")
            raise ToolError(f"File not found: {str(e)}")
        except Exception as e:
            await ctx.error(f"IFC validation error: {str(e)}")
            raise ToolError(f"IFC validation error: {str(e)}")
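The counting logic of the JSON branch can be exercised in isolation. The sketch below reproduces the same aggregation over plain (name, passed, failed) tuples; `summarize` is a hypothetical helper for illustration, not part of the server:

```python
def summarize(specs):
    """Aggregate (name, passed, failed) tuples into the tool's JSON
    summary shape. Mirrors the handler's per-spec counting logic."""
    summary = []
    passed_count = 0
    failed_count = 0
    for name, passed, failed in specs:
        applicable = passed + failed
        status = (
            "failed" if failed > 0
            else "passed" if applicable > 0
            else "no_applicable_entities"
        )
        if status == "passed":
            passed_count += 1
        elif status == "failed":
            failed_count += 1
        summary.append({
            "name": name,
            "status": status,
            "applicable_entities": applicable,
            "passed_entities": passed,
            "failed_entities": failed,
        })
    return {
        "status": "validation_complete",
        "total_specifications": len(specs),
        "passed_specifications": passed_count,
        "failed_specifications": failed_count,
        "report": {"specifications": summary},
    }
```

Feeding it three specs where one has failing entities reproduces the 3/2/1 totals shown in the docstring's example return value.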
  • Registration of the 'validate_ifc_model' tool in the FastMCP server instance.
    mcp_server.tool(validation.validate_ifc_model)

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/vinnividivicci/ifc-ids-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.