
Frontend Test Generation & Code Review MCP Server

.env.example
# OpenAI API Configuration
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_BASE_URL=https://api.openai.com/v1
OPENAI_MODEL=gpt-4

# Embedding Configuration (optional, defaults to OpenAI)
EMBEDDING_BASE_URL=https://api.openai.com/v1
EMBEDDING_MODEL=text-embedding-3-small

# Optional Settings
MODEL_TEMPERATURE=0
MODEL_TOP_P=1
CACHE_DIR=.cache
STATE_DIR=.state
LOG_LEVEL=info

# Worker configuration (optional)
WORKER_ENABLED=true
WORKER_MAX_POOL=3

# Workspace configuration (optional)
WORKSPACE_CLEANUP_INTERVAL=600000
WORKSPACE_MAX_AGE=3600000

# Test fix configuration (optional)
FIX_MAX_ATTEMPTS=3
FIX_CONFIDENCE_THRESHOLD=0.7

# Config file path (optional, defaults to project root config.yaml)
# CONFIG_PATH=/path/to/custom/config.yaml
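As a rough illustration, the sketch below shows one way these variables could be read and validated at startup in a Node/TypeScript MCP server. The variable names and defaults come from .env.example; the loading code itself, the use of the dotenv package, and the config object layout are assumptions for illustration, not the project's actual implementation.

```typescript
// Minimal sketch: load .env via dotenv and map the variables from
// .env.example into a typed config object. Defaults mirror the example file.
import "dotenv/config";

const config = {
  openai: {
    apiKey: process.env.OPENAI_API_KEY ?? "",
    baseUrl: process.env.OPENAI_BASE_URL ?? "https://api.openai.com/v1",
    model: process.env.OPENAI_MODEL ?? "gpt-4",
    temperature: Number(process.env.MODEL_TEMPERATURE ?? 0),
    topP: Number(process.env.MODEL_TOP_P ?? 1),
  },
  embedding: {
    baseUrl: process.env.EMBEDDING_BASE_URL ?? "https://api.openai.com/v1",
    model: process.env.EMBEDDING_MODEL ?? "text-embedding-3-small",
  },
  worker: {
    enabled: process.env.WORKER_ENABLED !== "false",
    maxPool: Number(process.env.WORKER_MAX_POOL ?? 3),
  },
  fix: {
    maxAttempts: Number(process.env.FIX_MAX_ATTEMPTS ?? 3),
    confidenceThreshold: Number(process.env.FIX_CONFIDENCE_THRESHOLD ?? 0.7),
  },
};

// The API key is the only value without a usable default.
if (!config.openai.apiKey) {
  throw new Error("OPENAI_API_KEY is required");
}

export default config;
```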

MCP directory API

All information about MCP servers is available via the MCP directory API. For example:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/NorthSeacoder/fe-testgen-mcp'
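For completeness, a hypothetical TypeScript equivalent of the curl call above, using the built-in fetch available in Node 18+. The response shape is not documented on this page, so it is simply printed as JSON.

```typescript
// Fetch this server's entry from the Glama MCP directory API and print it.
async function main(): Promise<void> {
  const res = await fetch(
    "https://glama.ai/api/mcp/v1/servers/NorthSeacoder/fe-testgen-mcp",
  );
  if (!res.ok) {
    throw new Error(`Request failed with status ${res.status}`);
  }
  const server = await res.json();
  console.log(JSON.stringify(server, null, 2));
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```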

If you have feedback or need assistance with the MCP directory API, please join our Discord server.