Given the tool's complexity, does the description cover enough for an agent to succeed on the first attempt?
Given the complexity of compliance-results retrieval, the lack of annotations and an output schema, and two undocumented parameters, the description is inadequate. It doesn't explain what 'results' means (e.g., a report, a status, a list of violations), how they're formatted, or any behavioral aspects. For a tool that likely returns structured compliance data, this leaves significant gaps.
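To make the gaps concrete, here is a hedged sketch of what a more complete definition for such a tool might look like. All names, parameters, and schema fields below are hypothetical illustrations, not taken from the tool under review:

```python
# Illustrative sketch of a compliance-results tool definition that closes
# the gaps above: every parameter documented, an output schema that says
# what "results" contains, and behavioral annotations. All identifiers
# here (get_compliance_results, resource_id, framework, ...) are invented.
get_compliance_results = {
    "name": "get_compliance_results",
    "description": (
        "Retrieve the latest compliance scan results for a resource. "
        "Returns a structured report: overall status (pass/fail/error), "
        "a list of rule violations, and the scan timestamp."
    ),
    "inputSchema": {
        "type": "object",
        "properties": {
            "resource_id": {
                "type": "string",
                "description": "ID of the resource to check.",
            },
            "framework": {
                "type": "string",
                "description": "Compliance framework to report against.",
                "enum": ["cis", "pci-dss", "hipaa"],
            },
        },
        "required": ["resource_id"],
    },
    "outputSchema": {
        "type": "object",
        "properties": {
            "status": {"type": "string", "enum": ["pass", "fail", "error"]},
            "violations": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "rule_id": {"type": "string"},
                        "severity": {"type": "string"},
                        "message": {"type": "string"},
                    },
                },
            },
            "scanned_at": {
                "type": "string",
                "description": "ISO 8601 timestamp of the scan.",
            },
        },
    },
    # Behavioral hints: the tool only reads state and repeat calls are safe.
    "annotations": {"readOnlyHint": True, "idempotentHint": True},
}

# With this much in place, an agent knows the shape of "results" and the
# tool's side-effect profile before its first call.
print("outputSchema" in get_compliance_results)
```

The key point is not this exact schema but that the output structure, every parameter, and the tool's behavioral guarantees are stated up front rather than left for the agent to discover by trial and error.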
Complex tools with many parameters or behaviors need more documentation; simple tools need less. This dimension scales expectations accordingly.