## Server Configuration

Environment variables used to configure the server. None are strictly required; where a default is listed, it is used when the variable is unset.
| Name | Required | Description | Default |
|---|---|---|---|
| STEAM_ID | No | Your Steam ID | |
| GITHUB_TOKEN | No | GitHub Personal Access Token (PAT) with repo read scope | |
| STEAM_API_KEY | No | Your Steam API key | |
| LM_STUDIO_MODEL | No | The model name to use in LM Studio | |
| LM_STUDIO_API_KEY | No | The API key for LM Studio | lm-studio |
| LM_STUDIO_BASE_URL | No | The base URL for LM Studio local server (OpenAI-compatible) | http://localhost:1234 |
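As a minimal sketch, the variables above can be supplied as shell exports before launching the server. All values below are placeholder examples, not real credentials, and the LM Studio defaults simply restate the table:

```shell
# Hypothetical example values -- substitute your own credentials.
export STEAM_ID="76561198000000000"
export STEAM_API_KEY="your-steam-api-key"
export GITHUB_TOKEN="ghp_your_token_here"

# LM Studio settings; these match the documented defaults for a local server.
export LM_STUDIO_BASE_URL="http://localhost:1234"
export LM_STUDIO_API_KEY="lm-studio"
export LM_STUDIO_MODEL="your-model-name"
```

Variables left unset fall back to the defaults in the table (for example, `LM_STUDIO_API_KEY` defaults to `lm-studio`).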
## Tools

Functions exposed to the LLM so it can take actions.
| Name | Description |
|---|---|
| No tools | |
## Prompts

Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| No prompts | |
## Resources

Contextual data attached to and managed by the client.
| Name | Description |
|---|---|
| No resources | |