
mcp-rubber-duck

.env.template (5.97 kB)
# MCP Rubber Duck - Universal Configuration Template
# Copy this file to .env and fill in your API keys
# For platform-specific examples, see .env.pi.example or .env.desktop.example

# =============================================================================
# BASIC CONFIGURATION
# =============================================================================

# Docker image to use (multi-platform: AMD64, ARM64)
# Works on: macOS, Linux, Windows, Raspberry Pi 3+
DOCKER_IMAGE=ghcr.io/nesquikm/mcp-rubber-duck:latest

# Default provider to use when none is specified
DEFAULT_PROVIDER=openai

# Default temperature for LLM responses (0.0 - 2.0)
DEFAULT_TEMPERATURE=0.7

# Logging level (error, warn, info, debug)
LOG_LEVEL=info

# =============================================================================
# NODE.JS & SYSTEM OPTIMIZATION
# =============================================================================

# Node.js memory optimization for Raspberry Pi (in MB)
# Adjust based on your Pi model:
# - Pi 3 (1GB RAM):  --max-old-space-size=256
# - Pi 4 (2GB RAM):  --max-old-space-size=512
# - Pi 4 (4GB+ RAM): --max-old-space-size=1024
NODE_OPTIONS=--max-old-space-size=256

# Node environment
NODE_ENV=production

# =============================================================================
# MCP SERVER CONFIGURATION
# =============================================================================

# Enable MCP server mode (required for Claude Desktop integration)
MCP_SERVER=true

# =============================================================================
# AI PROVIDER API KEYS
# Replace the placeholder values with your actual API keys
# =============================================================================

# OpenAI
# Get your API key from: https://platform.openai.com/api-keys
OPENAI_API_KEY=sk-your-openai-key-here
OPENAI_DEFAULT_MODEL=gpt-4o-mini

# Google Gemini
# Get your API key from: https://aistudio.google.com/apikey
GEMINI_API_KEY=your-gemini-key-here
GEMINI_DEFAULT_MODEL=gemini-2.5-flash

# Groq
# Get your API key from: https://console.groq.com/keys
GROQ_API_KEY=gsk_your-groq-key-here
GROQ_DEFAULT_MODEL=llama-3.3-70b-versatile

# Together AI (https://api.together.xyz/)
# Uncomment and fill if you have a Together AI account
# TOGETHER_API_KEY=your-together-key-here

# Perplexity AI (https://perplexity.ai/)
# Uncomment and fill if you have a Perplexity account
# PERPLEXITY_API_KEY=your-perplexity-key-here

# Anyscale (https://app.endpoints.anyscale.com/)
# Uncomment and fill if you have an Anyscale account
# ANYSCALE_API_KEY=your-anyscale-key-here

# Azure OpenAI (https://azure.microsoft.com/en-us/products/cognitive-services/openai-service/)
# Uncomment and fill if you have Azure OpenAI
# AZURE_OPENAI_API_KEY=your-azure-key-here
# AZURE_OPENAI_ENDPOINT=your-resource-name.openai.azure.com

# =============================================================================
# LOCAL AI PROVIDERS (RASPBERRY PI)
# =============================================================================

# Ollama (local AI on Raspberry Pi)
# Only enable if you're running Ollama locally or in Docker
# OLLAMA_BASE_URL=http://localhost:11434/v1
# OLLAMA_DEFAULT_MODEL=llama3.2

# LM Studio (local AI)
# Only enable if you're running LM Studio locally
# LMSTUDIO_BASE_URL=http://localhost:1234/v1
# LMSTUDIO_DEFAULT_MODEL=local-model

# =============================================================================
# CUSTOM PROVIDERS
# You can add multiple custom providers using the CUSTOM_{NAME}_* format
# =============================================================================

# Example: custom OpenAI-compatible API
# CUSTOM_MYAPI_API_KEY=your-custom-key
# CUSTOM_MYAPI_BASE_URL=https://api.example.com/v1
# CUSTOM_MYAPI_DEFAULT_MODEL=custom-model
# CUSTOM_MYAPI_NICKNAME=My Custom Duck

# =============================================================================
# MCP BRIDGE CONFIGURATION (Advanced)
# Allows ducks to access external MCP servers
# =============================================================================

# Enable MCP Bridge (allows ducks to use external MCP tools)
MCP_BRIDGE_ENABLED=false

# Approval mode for MCP tool usage: always, trusted, or never
MCP_APPROVAL_MODE=trusted

# Timeout for approval requests (seconds)
MCP_APPROVAL_TIMEOUT=300

# Example: Context7 documentation server
# Uncomment to enable Context7 MCP server integration
# MCP_SERVER_CONTEXT7_TYPE=http
# MCP_SERVER_CONTEXT7_URL=https://mcp.context7.com/mcp
# MCP_SERVER_CONTEXT7_ENABLED=true
# MCP_TRUSTED_TOOLS_CONTEXT7=*

# =============================================================================
# DUCK NICKNAMES (Optional Fun Customization)
# =============================================================================

# Customize your duck names (optional)
# OPENAI_NICKNAME=DUCK-4
# GEMINI_NICKNAME=Duckmini
# GROQ_NICKNAME=Quackers
# OLLAMA_NICKNAME=Local Quacker

# =============================================================================
# RASPBERRY PI SPECIFIC SETTINGS
# =============================================================================

# Cache TTL for responses (seconds)
# Lower values use less memory but cause more API calls
CACHE_TTL=300

# Enable automatic failover to other providers
ENABLE_FAILOVER=true

# Maximum retries for failed API calls
MAX_RETRIES=3

# Request timeout (milliseconds)
# Increase for slower internet connections
REQUEST_TIMEOUT=30000

# =============================================================================
# MONITORING & DEBUGGING
# =============================================================================

# Enable performance monitoring
ENABLE_PERFORMANCE_MONITORING=false

# Enable detailed request logging (increases log size)
ENABLE_REQUEST_LOGGING=false

# Enable memory usage reporting
ENABLE_MEMORY_REPORTING=true
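A typical way to use this template is to copy it to .env, fill in at least one provider key, and pass the file to the container. This is a minimal sketch: it assumes Docker is installed, and the flags below (`-i`, `--rm`, `--env-file`) are standard Docker CLI options rather than anything project-specific.

```shell
# Copy the template and fill in your API keys
cp .env.template .env
# ...edit .env: set OPENAI_API_KEY (or another provider's key)...

# Start the server, passing every variable in .env into the container.
# -i keeps stdin open (MCP servers communicate over stdio);
# --rm removes the container when it exits.
docker run -i --rm --env-file .env ghcr.io/nesquikm/mcp-rubber-duck:latest
```

Keeping secrets in .env and injecting them with `--env-file` avoids baking API keys into the image or leaking them in shell history via `-e` flags.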
