MCP Spec Generator

by Huxley-Brown
env.example

```
# Copy this file to .env (or export in your shell) and restart Cursor

# Provider selection (default: openai)
LLM_PROVIDER=openai

# OpenAI settings
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=gpt-5-2025-08-07
OPENAI_MAX_TOKENS=20000

# Anthropic settings (used only if LLM_PROVIDER=anthropic)
ANTHROPIC_API_KEY=your_anthropic_api_key_here
ANTHROPIC_MODEL=claude-sonnet-4-20250514
ANTHROPIC_TIMEOUT_MS=15000
ANTHROPIC_MAX_RETRIES=3
ANTHROPIC_BASE_DELAY_MS=500
ANTHROPIC_MAX_TOKENS=20000

# Writing behavior
# Set to 1 to enable automatic writes; otherwise the tool returns proposals
ALLOW_WRITE=0

# Where to write files (defaults to the server process CWD if unset)
# Use this to force writing into your current project when the server runs elsewhere
PROJECT_ROOT=/absolute/path/to/your/project

# Misc
MCP_AUDIT_FILE=.mcp-audit.log

# Optional: explicitly point to an env file (useful for global server)
# ENV_FILE=/absolute/path/to/your/project/.env
```
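The sketch below shows one way a server could consume these variables: choose a provider from LLM_PROVIDER, pick the matching key/model/token settings, and gate writes on ALLOW_WRITE. It is an assumption-laden illustration, not the MCP Spec Generator's actual code; the ServerConfig type and loadConfig function are hypothetical names.

```typescript
// Illustrative sketch only: reading the env.example variables above.
// Names and structure are assumptions, not the server's real implementation.
type Provider = "openai" | "anthropic";

interface ServerConfig {
  provider: Provider;
  apiKey: string;
  model: string;
  maxTokens: number;
  allowWrite: boolean;
  projectRoot: string;
}

function loadConfig(env: NodeJS.ProcessEnv = process.env): ServerConfig {
  // LLM_PROVIDER defaults to "openai" when unset or unrecognized.
  const provider: Provider = env.LLM_PROVIDER === "anthropic" ? "anthropic" : "openai";
  const isAnthropic = provider === "anthropic";
  return {
    provider,
    apiKey: (isAnthropic ? env.ANTHROPIC_API_KEY : env.OPENAI_API_KEY) ?? "",
    model: (isAnthropic ? env.ANTHROPIC_MODEL : env.OPENAI_MODEL) ?? "",
    maxTokens: Number((isAnthropic ? env.ANTHROPIC_MAX_TOKENS : env.OPENAI_MAX_TOKENS) ?? 20000),
    // ALLOW_WRITE=1 enables automatic writes; anything else keeps the tool in proposal mode.
    allowWrite: env.ALLOW_WRITE === "1",
    // PROJECT_ROOT overrides the server process CWD as the write destination.
    projectRoot: env.PROJECT_ROOT ?? process.cwd(),
  };
}
```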

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Huxley-Brown/Project-Setup-MCP'
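The same request can be made from TypeScript. This is a minimal sketch using the built-in fetch API (Node 18+); the response schema is not documented here, so the result is treated as untyped JSON.

```typescript
// Fetch this server's entry from the Glama MCP directory API.
const url = "https://glama.ai/api/mcp/v1/servers/Huxley-Brown/Project-Setup-MCP";

async function getServerEntry(): Promise<unknown> {
  const res = await fetch(url);
  if (!res.ok) {
    throw new Error(`Directory API request failed: ${res.status} ${res.statusText}`);
  }
  return res.json();
}

getServerEntry().then((entry) => console.log(entry)).catch(console.error);
```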

If you have feedback or need assistance with the MCP directory API, please join our Discord server.