Glama › pzfreo › build123d-mcp

shape_compare

Compare two 3D shapes by volume, bounding box, topology, and center offsets to verify matching or quantify modifications.

Instructions

Compare two named shapes (from show()) by geometry metrics: volume delta, bbox delta, topology delta (faces/edges/vertices), and center offset. Useful when you have an intended design and a reference/test shape and want to verify they match — or to quantify how a modification changed the geometry.
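For illustration, a call to this tool might look like the sketch below. The argument names come from the input schema; the shape names are placeholders for whatever identifiers were previously registered via show():

```json
{
  "name": "shape_compare",
  "arguments": {
    "object_a": "intended_bracket",
    "object_b": "reference_bracket"
  }
}
```

The result would carry the volume, bounding-box, topology, and center-offset deltas, though the exact field names are not documented in the description.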

Input Schema

Name       Required   Description   Default
object_a   Yes
object_b   Yes

Output Schema

Name       Required   Description   Default
result     Yes
Behavior 3/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations provided, the description carries the full burden. It lists the metrics computed but does not state whether the tool mutates state, has side effects, or requires preconditions. Its read-only nature is implied but never stated explicitly.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.
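One way to make the read-only nature explicit is through MCP tool annotations. The field names below come from the MCP specification; the values are assumptions about this tool's behavior, not confirmed by the listing:

```json
{
  "annotations": {
    "readOnlyHint": true,
    "idempotentHint": true,
    "destructiveHint": false
  }
}
```

Annotations are hints rather than guarantees, so the description should still state the read-only behavior in prose.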

Conciseness 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

Two concise sentences: the first defines the function and metrics, the second explains usage. No wasted words, front-loaded with the action verb 'Compare'.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness 3/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Despite having an output schema (not shown), the description covers the metrics but does not address prerequisites, such as whether shapes must be loaded or come from the same session. Because none of the schema fields carry descriptions (0% coverage), more of the burden falls on the description, which it only partially meets.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters 4/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

The input schema only has 'Object A' and 'Object B' with no descriptions. The description adds that these are 'two named shapes (from show())', clarifying they are shape identifiers. This adds meaning beyond the schema, though format details are missing.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.
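The missing format details could be supplied by per-parameter descriptions in the input schema itself. A sketch, with wording invented for illustration:

```json
{
  "type": "object",
  "properties": {
    "object_a": {
      "type": "string",
      "description": "Name of the first shape, as previously registered via show()"
    },
    "object_b": {
      "type": "string",
      "description": "Name of the second shape, as previously registered via show()"
    }
  },
  "required": ["object_a", "object_b"]
}
```

With descriptions at the schema level, the tool description itself could stay as short as it is now.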

Purpose 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description explicitly states 'Compare two named shapes (from show()) by geometry metrics: volume delta, bbox delta, topology delta (faces/edges/vertices), and center offset.' This provides a specific verb and resource, clearly differentiating from sibling tools like diff_snapshot or interference.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines 4/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description gives clear use cases: 'Useful when you have an intended design and a reference/test shape and want to verify they match — or to quantify how a modification changed the geometry.' It lacks explicit when-not-to-use guidance or alternative recommendations, but the context it does give is strong.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
