## Server Configuration

Environment variables recognized by the server. All are optional; unset variables fall back to the defaults shown.
| Name | Required | Description | Default |
|---|---|---|---|
| FOUNDRY_MODE | No | Server mode: full (16 tools) or minimal (1 wake tool) | full |
| OPENAI_API_KEY | No | OpenAI API key for LLM provider credentials | |
| ANTHROPIC_API_KEY | No | Anthropic API key for LLM provider credentials | |
| FOUNDRY_MCP_API_KEYS | No | Comma-separated API keys required for tool access (disabled by default) | |
| FOUNDRY_MCP_LOG_LEVEL | No | Logging level (DEBUG, INFO, etc.) | INFO |
| FOUNDRY_MCP_SPECS_DIR | No | Path to specs directory (auto-detected from workspace if not provided) | |
| FOUNDRY_MCP_FEATURE_FLAGS | No | Additional feature flags to enable (e.g., planning_tools); by default, flags follow the spec rollout | |
| FOUNDRY_MCP_WORKFLOW_MODE | No | Execution mode: single, autonomous, or batch | single |
| FOUNDRY_MCP_RESPONSE_CONTRACT | No | Force a response contract version (v2). Auto-negotiated by default | |
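
As an illustration, the variables above can be exported in the shell before launching the server. The launch command itself is not documented here, so it is shown only as a hypothetical placeholder:

```shell
# All variables are optional; provider API keys are needed only if you
# use the corresponding LLM provider.
export FOUNDRY_MODE=full                    # or: minimal
export FOUNDRY_MCP_LOG_LEVEL=DEBUG          # default: INFO
export FOUNDRY_MCP_SPECS_DIR="$HOME/project/specs"
export FOUNDRY_MCP_WORKFLOW_MODE=single     # or: autonomous, batch

# Hypothetical entry point -- replace with your actual server command.
# foundry-mcp-server
```

Keys such as `OPENAI_API_KEY` are best loaded from a secrets manager or an untracked env file rather than exported in a checked-in script.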
## Capabilities
Server capabilities have not been inspected yet.
### Tools

Functions the LLM can invoke to take actions
| Name | Description |
|---|---|
| No tools | |
### Prompts
Interactive templates invoked by user choice
| Name | Description |
|---|---|
| No prompts | |
### Resources
Contextual data attached and managed by the client
| Name | Description |
|---|---|
| No resources | |