
Server Configuration

Describes the environment variables used to configure the server. All of them are optional.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| LLM_KEY | No | The name of the model you want to use (e.g., OPENAI_GPT4_1, ANTHROPIC_CLAUDE4_SONNET, etc.). | |
| BROWSER_TYPE | No | The type of browser connection, e.g., 'cdp-connect'. | |
| ENABLE_AZURE | No | Set to 'true' to register Azure OpenAI models. | |
| AZURE_API_KEY | No | Azure deployment API key. | |
| ENABLE_GEMINI | No | Set to 'true' to register Gemini models. | |
| ENABLE_OLLAMA | No | Set to 'true' to register local models via Ollama. | |
| ENABLE_OPENAI | No | Set to 'true' to register OpenAI models. | |
| ENABLE_BEDROCK | No | Set to 'true' to register AWS Bedrock models. | |
| GEMINI_API_KEY | No | Gemini API key. | |
| OPENAI_API_KEY | No | OpenAI API key. | |
| ENABLE_ANTHROPIC | No | Set to 'true' to register Anthropic models. | |
| ANTHROPIC_API_KEY | No | Anthropic API key. | |
| ENABLE_OPENROUTER | No | Set to 'true' to register OpenRouter models. | |
| OLLAMA_SERVER_URL | No | URL of your Ollama server. | |
| SECONDARY_LLM_KEY | No | The name of the model to use for the mini agents Skyvern runs. | |
| SKYVERN_TELEMETRY | No | Skyvern collects basic usage statistics by default; set to 'false' to opt out. | true |
| OPENROUTER_API_KEY | No | OpenRouter API key. | |
| LLM_CONFIG_MAX_TOKENS | No | Override the maximum number of tokens used by the LLM. | |
| CHROME_EXECUTABLE_PATH | No | The path to your Chrome browser executable. | |
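As a minimal sketch, a configuration that enables OpenAI models and opts out of telemetry could look like this. The variable names come from the table above; the API key value is a placeholder, not a real key:

```shell
# Enable OpenAI as the model provider and pick a primary model.
export ENABLE_OPENAI=true
export OPENAI_API_KEY="sk-your-key-here"   # placeholder, not a real key
export LLM_KEY=OPENAI_GPT4_1               # primary model name
export SKYVERN_TELEMETRY=false             # opt out of usage statistics
```

Only the providers you enable need their corresponding API key set.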

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions


No tools

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Skyvern-AI/skyvern'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.
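As a quick connectivity check, the same endpoint can be probed for its HTTP status code. This assumes only that curl is available; nothing about the response schema is assumed:

```shell
# Request the server record and print only the HTTP status code.
# -s silences progress output, -o /dev/null discards the body,
# -w '%{http_code}' writes the status (curl prints 000 if unreachable).
status=$(curl -s -o /dev/null -w '%{http_code}' \
  'https://glama.ai/api/mcp/v1/servers/Skyvern-AI/skyvern')
echo "$status"
```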