ask_duck

Explain problems to AI assistants for research and diverse perspectives, like rubber duck debugging where the duck actually responds.

Instructions

Ask a question to a specific LLM provider (duck)

Input Schema

Name         Required  Description                                                                Default
prompt       Yes       The question or prompt to send to the duck
provider     No        The provider name (optional, uses default if not specified)
model        No        Specific model to use (optional, uses provider default if not specified)
temperature  No        Temperature for response generation (0-2)
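As an illustration, a client can invoke this tool with a standard MCP tools/call request. The sketch below uses the official TypeScript SDK (@modelcontextprotocol/sdk); the command used to launch the server, the provider name, and the prompt are assumptions, so adjust them to your own setup.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio. The command/args are an assumption; start
// nesquikm/mcp-rubber-duck however your configuration runs it.
const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"],
});

const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// Call ask_duck with the arguments defined by the input schema above.
// "openai" is a placeholder provider name; omit it to use the default duck.
const result = await client.callTool({
  name: "ask_duck",
  arguments: {
    prompt: "Why does my recursive function overflow the stack?",
    provider: "openai",
    temperature: 0.7,
  },
});

console.log(result.content);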

MCP directory API

We provide all of the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/nesquikm/mcp-rubber-duck'
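The same request can be made from code. A minimal sketch in TypeScript; the response shape is not documented here, so it is logged as-is:

// Fetch the mcp-rubber-duck entry from the Glama MCP directory API.
const response = await fetch(
  "https://glama.ai/api/mcp/v1/servers/nesquikm/mcp-rubber-duck",
);
if (!response.ok) {
  throw new Error(`Directory API request failed: ${response.status}`);
}
// The exact response fields are not specified here, so treat them as unknown JSON.
const server = await response.json();
console.log(server);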

If you have feedback or need assistance with the MCP directory API, please join our Discord server.