
MCP Tailwind Gemini Server

by Tai-DT
.env.example (777 B)

```
# Environment variables for MCP Tailwind Gemini Docker Deployment
# Copy this file to .env and fill in your API keys

# Required: Gemini AI API Key (Get from https://makersuite.google.com/app/apikey)
GEMINI_API_KEY=your_gemini_api_key_here

# Optional: OpenAI API Key (for additional AI features)
OPENAI_API_KEY=your_openai_api_key_here

# Optional: Claude API Key (for Anthropic Claude integration)
CLAUDE_API_KEY=your_claude_api_key_here

# Optional: Figma Access Token (for design-to-code features)
FIGMA_ACCESS_TOKEN=your_figma_access_token_here

# Application Configuration
NODE_ENV=production
MCP_PORT=3000
LOG_LEVEL=info

# AI Model Preferences
DEFAULT_AI_MODEL=gemini-pro
FALLBACK_AI_MODEL=gpt-4

# Performance Settings
MAX_CONCURRENT_REQUESTS=10
REQUEST_TIMEOUT=30000
```
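As a rough sketch of how a Node/TypeScript process might consume these variables at startup: the variable names, placeholders, and defaults below come from the file above, but the loading and validation logic is illustrative and not taken from this project's source.

```typescript
// Illustrative startup check for the variables listed in .env.example.
// Only GEMINI_API_KEY is marked as required in the file; everything else
// falls back to the defaults shown there.
const geminiKey = process.env.GEMINI_API_KEY;
if (!geminiKey || geminiKey === "your_gemini_api_key_here") {
  throw new Error("GEMINI_API_KEY is missing or still set to its placeholder value");
}

const config = {
  geminiApiKey: geminiKey,
  port: Number(process.env.MCP_PORT ?? 3000),
  logLevel: process.env.LOG_LEVEL ?? "info",
  defaultModel: process.env.DEFAULT_AI_MODEL ?? "gemini-pro",
  fallbackModel: process.env.FALLBACK_AI_MODEL ?? "gpt-4",
  maxConcurrentRequests: Number(process.env.MAX_CONCURRENT_REQUESTS ?? 10),
  requestTimeoutMs: Number(process.env.REQUEST_TIMEOUT ?? 30000),
};

console.log(`Config loaded: port ${config.port}, default model ${config.defaultModel}`);
```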

MCP directory API

We provide all the information about MCP servers via our MCP API.

```
curl -X GET 'https://glama.ai/api/mcp/v1/servers/Tai-DT/mcp-tailwind-gemini'
```
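The same endpoint can be queried from code. A minimal TypeScript sketch using Node's built-in fetch; the response shape is not documented here, so it is left as untyped JSON.

```typescript
// Fetch this server's directory entry from the Glama MCP API.
// The URL comes from the curl example above; the response body is
// treated as unknown JSON since its schema is not shown here.
const url = "https://glama.ai/api/mcp/v1/servers/Tai-DT/mcp-tailwind-gemini";

async function fetchServerInfo(): Promise<unknown> {
  const res = await fetch(url);
  if (!res.ok) {
    throw new Error(`MCP API request failed: ${res.status} ${res.statusText}`);
  }
  return res.json();
}

fetchServerInfo()
  .then((info) => console.log(info))
  .catch((err) => console.error(err));
```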

If you have feedback or need assistance with the MCP directory API, please join our Discord server.