tresor4k

macalc

calculate_torus

Calculate volume and surface area of a torus (ring shape) using major and minor radii. Ideal for donuts, inner tubes, and other ring-shaped objects.

Instructions

Compute torus volume V = 2π²Rr² and surface area A = 4π²Rr. Use for ring-shaped objects (donuts, inner tubes). Inputs: major radius R, minor radius r. Returns volume and area. See list_bundles for related 'math' calculators.
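The stated formulas are easy to check directly. A minimal sketch (not the tool's actual implementation, which is not shown):

```python
import math

def torus_metrics(major_r: float, minor_r: float) -> dict:
    """Volume and surface area of a ring torus with major radius R, minor radius r."""
    volume = 2 * math.pi**2 * major_r * minor_r**2     # V = 2*pi^2*R*r^2
    surface_area = 4 * math.pi**2 * major_r * minor_r  # A = 4*pi^2*R*r
    return {"volume": volume, "surface_area": surface_area}

# A donut-sized torus with R = 3, r = 1:
# volume ≈ 59.22, surface_area ≈ 118.44
```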

Input Schema

| Name    | Required | Description                          | Default |
| ------- | -------- | ------------------------------------ | ------- |
| major_r | Yes      | Major radius (center to tube center) |         |
| minor_r | Yes      | Minor radius (tube radius)           |         |

Output Schema

| Name          | Required | Description                                                                                                                            | Default |
| ------------- | -------- | -------------------------------------------------------------------------------------------------------------------------------------- | ------- |
| result        | No       | Computed result. Object whose fields depend on the tool (e.g. {tax, marginal_rate, brackets} for tax tools, {volume_l, gallons} for volume tools). |         |
| formula       | No       | Human-readable formula or method used (e.g. "I=P·r·t", "Magnus formula").                                                               |         |
| source        | No       | Authoritative source for the rule or formula (e.g. "Article 197 CGI", "NF DTU 21").                                                     |         |
| reference_url | No       | Link to a calcul2 page documenting the calculation in detail.                                                                           |         |
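Mapped onto this schema, a response from the torus tool might look like the sketch below; the nested result keys, the formula string, and the source text are assumptions, since the schema leaves them tool-dependent:

```python
# Hypothetical calculate_torus response for R = 3, r = 1; only the four
# top-level keys are fixed by the output schema, the rest is assumed.
example_response = {
    "result": {"volume": 59.22, "surface_area": 118.44},  # assumed keys
    "formula": "V = 2*pi^2*R*r^2; A = 4*pi^2*R*r",
    "source": "standard solid-geometry formulas",
    "reference_url": None,  # a calcul2 page would go here
}
```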
Behavior: 4/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

No annotations are provided, so the description bears the full burden. It discloses that it performs computation and returns volume and area. No destructive or rate-limit concerns exist for a calculator, so this is adequate.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

A few short sentences, front-loaded with the key computation, efficiently providing all necessary information without redundancy.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 5/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given that the tool has an output schema and the description mentions the return values, it is complete. No additional information is needed for a straightforward mathematical calculation.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 4/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema coverage is 100%, so the baseline score is 3. The description adds value by explaining 'major radius R' and 'minor radius r' in the context of a torus, tying each parameter to the shape it describes.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states that the tool computes torus volume and surface area, specifies the formula, and identifies the shape (ring-shaped objects). This distinguishes it from the many sibling calculators.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 4/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

It provides a clear usage case ('Use for ring-shaped objects') and directs to list_bundles for related math calculators, but does not explicitly state when not to use or provide detailed alternatives.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
