
mcp-rubber-duck

by nesquikm

Server Configuration

Describes the environment variables used to configure the server. All variables are optional; defaults are shown where they apply.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| LOG_LEVEL | No | Log level setting | info |
| GROQ_API_KEY | No | Your Groq API key | |
| GROQ_NICKNAME | No | Optional: defaults to "Groq Duck" | Groq Duck |
| CUSTOM_API_KEY | No | Your custom provider API key | |
| GEMINI_API_KEY | No | Your Google Gemini API key | |
| OPENAI_API_KEY | No | Your OpenAI API key | |
| CUSTOM_BASE_URL | No | Custom provider base URL | |
| CUSTOM_NICKNAME | No | Optional: defaults to "Custom Duck" | Custom Duck |
| GEMINI_NICKNAME | No | Optional: defaults to "Gemini Duck" | Gemini Duck |
| OLLAMA_BASE_URL | No | Ollama base URL | http://localhost:11434/v1 |
| OLLAMA_NICKNAME | No | Optional: defaults to "Local Duck" | Local Duck |
| OPENAI_NICKNAME | No | Optional: defaults to "GPT Duck" | GPT Duck |
| DEFAULT_PROVIDER | No | Default provider to use | openai |
| TOGETHER_API_KEY | No | Your Together AI API key | |
| GROQ_DEFAULT_MODEL | No | Optional: defaults to llama-3.3-70b-versatile | llama-3.3-70b-versatile |
| DEFAULT_TEMPERATURE | No | Default temperature setting | 0.7 |
| CUSTOM_DEFAULT_MODEL | No | Optional: defaults to custom-model | custom-model |
| GEMINI_DEFAULT_MODEL | No | Optional: defaults to gemini-2.5-flash | gemini-2.5-flash |
| OLLAMA_DEFAULT_MODEL | No | Optional: defaults to llama3.2 | llama3.2 |
| OPENAI_DEFAULT_MODEL | No | Optional: defaults to gpt-4o-mini | gpt-4o-mini |
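Since every variable is optional, a minimal setup only needs the provider you intend to use. A sketch of an OpenAI-only configuration, exported as environment variables (the API key value is a placeholder, not a real key):

```shell
# Minimal configuration: use OpenAI as the only provider.
export OPENAI_API_KEY="sk-..."              # placeholder; substitute your real key
export OPENAI_NICKNAME="GPT Duck"           # matches the default shown above
export OPENAI_DEFAULT_MODEL="gpt-4o-mini"   # matches the default shown above
export DEFAULT_PROVIDER="openai"
export DEFAULT_TEMPERATURE="0.7"
export LOG_LEVEL="info"
```

The same key/value pairs could equally live in a `.env` file or in an MCP client's `env` block, depending on how the server is launched.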

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

No tools

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/nesquikm/mcp-rubber-duck'
```
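The endpoint in the curl example follows a `servers/<owner>/<server>` pattern; a small shell sketch that builds the same URL from its two components (variable names are illustrative):

```shell
# Build the MCP directory API URL for a given server
# (pattern taken from the curl example above).
OWNER="nesquikm"
SERVER="mcp-rubber-duck"
URL="https://glama.ai/api/mcp/v1/servers/${OWNER}/${SERVER}"
echo "$URL"
# → https://glama.ai/api/mcp/v1/servers/nesquikm/mcp-rubber-duck
```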

If you have feedback or need assistance with the MCP directory API, please join our Discord server.