detect_nsfw
Classify image safety as normal, suggestive, or explicit. Returns a classification label and an is_nsfw flag for content moderation; paid per image via Bitcoin Lightning.
Instructions
Classify image safety as normal, suggestive, or explicit using Falcons.ai NSFW detection, 100x cheaper and faster than asking an LLM. Returns a classification label and a boolean is_nsfw flag; essential for content moderation pipelines. Costs 2 sats per image, paid per request with Bitcoin Lightning, with no API key or signup needed. Requires a prior create_payment call with toolName='detect_nsfw' (see the sketch below).
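A minimal sketch of the pay-then-call flow, assuming a generic `callTool(name, args)` helper for whatever MCP client or HTTP wrapper you use; the `invoice` and `paymentId` field names on the create_payment result are assumptions and may differ in practice.

```typescript
// Hypothetical helper: callTool(name, args) invokes a tool and resolves its result.
type CallTool = (name: string, args: Record<string, unknown>) => Promise<any>;

async function moderateImage(callTool: CallTool, imageBase64: string) {
  // 1. Request a Lightning invoice scoped to this tool (2 sats per image).
  const payment = await callTool("create_payment", { toolName: "detect_nsfw" });

  // 2. Pay payment.invoice with a Lightning wallet (out of band).
  //    Field names here are assumptions, not a confirmed response shape.

  // 3. Call detect_nsfw with the paid paymentId and the image.
  const result = await callTool("detect_nsfw", {
    paymentId: payment.paymentId,
    imageBase64,
  });

  // result.classification: "normal" | "suggestive" | "explicit"
  // result.is_nsfw: boolean moderation flag
  return result;
}
```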
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| paymentId | Yes | Valid payment ID (must be paid) | |
| imageBase64 | Yes | Base64-encoded image (PNG, JPEG, WEBP) or data URI | |
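A short sketch of building the request arguments from a local file, assuming a Node.js environment; the file path and the placeholder paymentId are illustrative only.

```typescript
import { readFileSync } from "node:fs";

// Encode a local image (PNG, JPEG, or WEBP) as base64 for the imageBase64 field.
const imageBase64 = readFileSync("photo.jpg").toString("base64");

const args = {
  paymentId: "payment_abc123", // placeholder: a paymentId from create_payment that has been paid
  imageBase64,                 // raw base64 is accepted...
  // ...or a data URI instead: `data:image/jpeg;base64,${imageBase64}`
};
```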