Server Configuration
Describes the environment variables used to configure the server. All of them are optional; defaults are listed below where applicable.
Name | Required | Description | Default |
---|---|---|---|
OPENAI_API_KEY | No | API key for OpenAI, required if using OpenAI providers in PEEKABOO_AI_PROVIDERS. | |
PEEKABOO_CLI_PATH | No | Optional override for the Swift peekaboo CLI executable path. | |
PEEKABOO_LOG_FILE | No | Path to the server's log file. If the specified directory is not writable, falls back to the system temp directory. | ~/Library/Logs/peekaboo-mcp.log |
PEEKABOO_LOG_LEVEL | No | Logging level (trace, debug, info, warn, error, fatal). | info |
PEEKABOO_AI_PROVIDERS | No | Comma-separated list of provider_name/default_model_for_provider pairs (e.g., "openai/gpt-4o,ollama/llava:7b"). Determines which AI backends are available for the analyze tool and the image tool (when a question is provided). | "" |
PEEKABOO_CONSOLE_LOGGING | No | Boolean ("true"/"false") for development console logs. | "false" |
PEEKABOO_OLLAMA_BASE_URL | No | Base URL for the Ollama API server. Only needed if Ollama is running on a non-default address. | http://localhost:11434 |
PEEKABOO_DEFAULT_SAVE_PATH | No | Default absolute base path under which images captured by the image tool are saved. If the image tool's path argument is provided, it takes precedence over this setting. | |
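To make the defaults and the PEEKABOO_AI_PROVIDERS format concrete, here is a minimal, hypothetical TypeScript sketch of how a launcher might read these variables and apply the documented fallbacks. The names `PeekabooConfig`, `parseProviders`, and `loadConfig` are illustrative only and are not the server's actual API.

```typescript
// Hypothetical sketch, not the server's actual implementation:
// reads the environment variables documented above and applies
// the defaults listed in the table.
import * as os from "node:os";
import * as path from "node:path";

interface ProviderSpec {
  provider: string;      // e.g. "openai" or "ollama"
  defaultModel: string;  // e.g. "gpt-4o" or "llava:7b"
}

interface PeekabooConfig {
  logFile: string;
  logLevel: string;
  aiProviders: ProviderSpec[];
  consoleLogging: boolean;
  ollamaBaseUrl: string;
  cliPath?: string;
  defaultSavePath?: string;
}

// Parse "openai/gpt-4o,ollama/llava:7b" into provider/model pairs.
function parseProviders(raw: string): ProviderSpec[] {
  return raw
    .split(",")
    .map((entry) => entry.trim())
    .filter((entry) => entry.length > 0)
    .map((entry) => {
      const [provider, ...model] = entry.split("/");
      return { provider, defaultModel: model.join("/") };
    });
}

function loadConfig(env: NodeJS.ProcessEnv = process.env): PeekabooConfig {
  return {
    logFile:
      env.PEEKABOO_LOG_FILE ??
      path.join(os.homedir(), "Library", "Logs", "peekaboo-mcp.log"),
    logLevel: env.PEEKABOO_LOG_LEVEL ?? "info",
    aiProviders: parseProviders(env.PEEKABOO_AI_PROVIDERS ?? ""),
    consoleLogging: env.PEEKABOO_CONSOLE_LOGGING === "true",
    ollamaBaseUrl: env.PEEKABOO_OLLAMA_BASE_URL ?? "http://localhost:11434",
    cliPath: env.PEEKABOO_CLI_PATH,
    defaultSavePath: env.PEEKABOO_DEFAULT_SAVE_PATH,
  };
}

// Quick check: prints the resolved configuration for the current environment.
console.log(loadConfig());
```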
Schema
Prompts
Interactive templates invoked by user choice
Name | Description |
---|---|
No prompts |
Resources
Contextual data attached and managed by the client
Name | Description |
---|---|
No resources |
Tools
Functions exposed to the LLM to take actions
Name | Description |
---|---|
No tools |