
Chain of Draft Prompt Tool

.env.example
# LLM Provider Configuration
# Choose which provider to use: 'anthropic', 'openai', 'mistral', or 'ollama'
LLM_PROVIDER=anthropic

# Default model to use (provider-specific)
# Anthropic: claude-3-7-sonnet-latest, claude-3-opus-20240229
# OpenAI: gpt-4-turbo-preview, gpt-4, gpt-3.5-turbo
# Mistral: mistral-tiny, mistral-small, mistral-medium
# Ollama: llama2, mistral, codellama
LLM_MODEL=claude-3-7-sonnet-latest

# Anthropic Configuration
ANTHROPIC_API_KEY=your_anthropic_api_key_here
ANTHROPIC_BASE_URL=https://api.anthropic.com

# OpenAI Configuration
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_BASE_URL=https://api.openai.com

# Mistral Configuration
MISTRAL_API_KEY=your_mistral_api_key_here

# Ollama Configuration (local deployment)
OLLAMA_BASE_URL=http://localhost:11434

# Chain of Draft Settings
# These are optional and will use defaults if not set
# MAX_WORDS_PER_STEP=5
# ENFORCE_FORMAT=true
# ADAPTIVE_WORD_LIMIT=true

# Database Settings
COD_DB_URL=sqlite:///cod_analytics.db
COD_EXAMPLES_DB=cod_examples.db

# Default Settings
COD_DEFAULT_MODEL=claude-3-7-sonnet-latest
COD_MAX_TOKENS=500
COD_MAX_WORDS_PER_STEP=5
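To illustrate how these settings might be consumed, here is a minimal sketch of reading the provider configuration from the environment. The helper function is hypothetical (it is not part of the tool's actual codebase); it only assumes the variable names and defaults documented in the `.env.example` above.

```python
import os

# Providers named in .env.example.
PROVIDERS = {"anthropic", "openai", "mistral", "ollama"}

def load_llm_config(env=None):
    """Return (provider, model) from environment variables.

    Hypothetical helper: defaults mirror the values shown in .env.example
    (LLM_PROVIDER=anthropic, COD_DEFAULT_MODEL=claude-3-7-sonnet-latest).
    """
    env = os.environ if env is None else env
    provider = env.get("LLM_PROVIDER", "anthropic").lower()
    if provider not in PROVIDERS:
        raise ValueError(f"Unsupported LLM_PROVIDER: {provider!r}")
    # Fall back to the documented default model when LLM_MODEL is unset.
    model = env.get("LLM_MODEL",
                    env.get("COD_DEFAULT_MODEL", "claude-3-7-sonnet-latest"))
    return provider, model
```

For example, `load_llm_config({"LLM_PROVIDER": "ollama", "LLM_MODEL": "mistral"})` selects the local Ollama deployment, while an empty environment falls back to the Anthropic defaults.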

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/brendancopley/mcp-chain-of-draft-prompt-tool'
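The same endpoint pattern can be built programmatically. This is a sketch based only on the URL in the curl example above; the response schema is not documented here, so the example stops at constructing the request URL.

```python
# Base path taken from the curl example; the owner/repo segments are the
# server's directory slug (here, brendancopley/mcp-chain-of-draft-prompt-tool).
GLAMA_MCP_API = "https://glama.ai/api/mcp/v1"

def server_info_url(owner: str, repo: str) -> str:
    """Build the MCP directory API URL for a given server slug."""
    return f"{GLAMA_MCP_API}/servers/{owner}/{repo}"
```

A GET request to `server_info_url("brendancopley", "mcp-chain-of-draft-prompt-tool")` reproduces the curl call shown above.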

If you have feedback or need assistance with the MCP directory API, please join our Discord server.