
translator-ai

by DatanoiseTV
.env.example
# Translation Provider Configuration

# Google Gemini API Key (for cloud-based translation)
# Get your API key from https://aistudio.google.com/app/apikey
GEMINI_API_KEY=your_gemini_api_key_here

# Ollama Configuration (for local translation)
# No API key needed - runs locally
# Default: http://localhost:11434
OLLAMA_API_URL=http://localhost:11434

# Default translation provider: gemini or ollama
# Can be overridden with --provider flag
TRANSLATOR_PROVIDER=gemini

# Enable verbose output for debugging
# Set to 'true' to see detailed Ollama requests/responses
OLLAMA_VERBOSE=false
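The file above selects between a cloud provider (Gemini) and a local one (Ollama). A minimal TypeScript sketch of how a CLI might read these variables and apply the --provider override is shown below; the variable names come from the .env file, but the loadConfig helper and TranslatorConfig type are hypothetical and not part of translator-ai's actual code.

// config.ts - hypothetical sketch of reading the .env values above
import * as process from "node:process";

type Provider = "gemini" | "ollama";

interface TranslatorConfig {
  provider: Provider;
  geminiApiKey?: string;
  ollamaApiUrl: string;
  ollamaVerbose: boolean;
}

// cliProvider would come from a --provider flag, which overrides the env default
export function loadConfig(cliProvider?: Provider): TranslatorConfig {
  const envProvider = process.env.TRANSLATOR_PROVIDER as Provider | undefined;
  return {
    provider: cliProvider ?? envProvider ?? "gemini",
    geminiApiKey: process.env.GEMINI_API_KEY,
    ollamaApiUrl: process.env.OLLAMA_API_URL ?? "http://localhost:11434",
    ollamaVerbose: process.env.OLLAMA_VERBOSE === "true",
  };
}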

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/DatanoiseTV/translator-ai'
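The same endpoint can be queried programmatically. Here is a minimal TypeScript sketch using Node's built-in fetch (Node 18+); only the URL comes from the curl example above, and the response schema is assumed to be JSON but is not documented in this section.

// fetch-server.ts - hypothetical sketch of calling the MCP directory API
const SERVER_URL =
  "https://glama.ai/api/mcp/v1/servers/DatanoiseTV/translator-ai";

async function getServerInfo(): Promise<unknown> {
  const res = await fetch(SERVER_URL, { method: "GET" });
  if (!res.ok) {
    throw new Error(`Directory API request failed: ${res.status}`);
  }
  // The exact response shape is not documented here, so return it as unknown JSON.
  return res.json();
}

getServerInfo()
  .then((info) => console.log(JSON.stringify(info, null, 2)))
  .catch((err) => console.error(err));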

If you have feedback or need assistance with the MCP directory API, please join our Discord server.