Server Configuration
Describes the environment variables and command-line arguments used to configure the server. All settings are optional.
| Name | Required | Description | Default |
|---|---|---|---|
| DEFAULT_LLM_PROVIDER | No | The AI provider to use. Valid values: openai, anthropic, or gemini. | |
| OPENAI_API_KEY | No | API key for OpenAI. | |
| OPENAI_MODEL | No | The OpenAI model to use (e.g., gpt-4o). | gpt-4o |
| ANTHROPIC_API_KEY | No | API key for Anthropic. | |
| ANTHROPIC_MODEL | No | The Anthropic model to use (e.g., claude-3-5-sonnet-20241022). | claude-3-5-sonnet-20241022 |
| GEMINI_API_KEY | No | API key for Gemini. | |
| GEMINI_MODEL | No | The Gemini model to use (e.g., gemini-2.0-flash-exp). | gemini-2.0-flash-exp |
| GEMINI_CLI_PATH | No | Path to the Gemini CLI. | |
| BROWSER_CHANNEL | No | The browser channel to use (e.g., chrome). | |
| BROWSER_EXECUTABLE_PATH | No | The file path to the Chrome/Chromium executable. | |
| BROWSER_HEADLESS | No | Whether to run the browser in headless mode (true/false). | true |
| BROWSER_ISOLATED | No | Whether to run the browser in isolated mode (true/false). | true |
| USE_STEALTH_SCRIPTS | No | Whether to use stealth scripts to avoid detection (true/false). | false |
| REMOTE_DEBUGGING_URL | No | The URL for remote debugging an existing Chrome instance. | http://localhost:9222 |
| REMOTE_DEBUGGING_PORT | No | The port for remote debugging. | 9222 |
| DEBUG | No | Debug log settings (e.g., mcp:*). | |
| browserUrl | No | Command-line argument (not an environment variable) to connect to a browser via URL (e.g., http://127.0.0.1:9222). | |
| wsEndpoint | No | Command-line argument (not an environment variable) to connect to a browser via WebSocket endpoint. | |
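To illustrate how these settings and their defaults fit together, the sketch below shows one way a Node.js/TypeScript process could resolve them at startup. This is a minimal sketch, not the server's actual implementation; the resolveConfig function, the ServerConfig shape, and the asBool helper are names introduced here for illustration only.

```typescript
// Minimal sketch (an assumption, not the server's real code) of resolving
// the documented environment variables with the defaults from the table.

type LlmProvider = "openai" | "anthropic" | "gemini";

interface ServerConfig {
  provider?: LlmProvider;
  model?: string;
  headless: boolean;
  isolated: boolean;
  useStealthScripts: boolean;
  remoteDebuggingUrl: string;
  remoteDebuggingPort: number;
  browserExecutablePath?: string;
}

// Boolean settings are documented as the strings "true"/"false".
const asBool = (value: string | undefined, fallback: boolean): boolean =>
  value === undefined ? fallback : value.toLowerCase() === "true";

export function resolveConfig(env: NodeJS.ProcessEnv = process.env): ServerConfig {
  const provider = env.DEFAULT_LLM_PROVIDER as LlmProvider | undefined;

  // Model per provider, falling back to the documented defaults.
  const modelFor: Record<LlmProvider, string> = {
    openai: env.OPENAI_MODEL ?? "gpt-4o",
    anthropic: env.ANTHROPIC_MODEL ?? "claude-3-5-sonnet-20241022",
    gemini: env.GEMINI_MODEL ?? "gemini-2.0-flash-exp",
  };

  return {
    provider,
    model: provider ? modelFor[provider] : undefined,
    headless: asBool(env.BROWSER_HEADLESS, true),
    isolated: asBool(env.BROWSER_ISOLATED, true),
    useStealthScripts: asBool(env.USE_STEALTH_SCRIPTS, false),
    remoteDebuggingUrl: env.REMOTE_DEBUGGING_URL ?? "http://localhost:9222",
    remoteDebuggingPort: Number(env.REMOTE_DEBUGGING_PORT ?? "9222"),
    browserExecutablePath: env.BROWSER_EXECUTABLE_PATH,
  };
}
```

For example, launching with DEFAULT_LLM_PROVIDER=anthropic and no ANTHROPIC_MODEL set would resolve the model to claude-3-5-sonnet-20241022, matching the default in the table above.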
Capabilities
Server capabilities have not been inspected yet.
Tools
Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| No tools | |
Prompts
Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| No prompts | |
Resources
Contextual data attached and managed by the client.
| Name | Description |
|---|---|
| No resources | |