Server Configuration
Describes the environment variables used to configure and run the server.
Name | Required | Description | Default |
---|---|---|---|
DIR | No | Directory path | |
MODEL | No | Model to use | |
MCP_MODE | No | MCP transport mode; one of streamable-http, sse, or stdio | streamable-http |
MAX_TOKENS | No | Maximum number of tokens | |
TEMPERATURE | No | Temperature setting for the model | |
USER_DATA_DIR | No | User data directory | |
MAX_ITERATIONS | No | Maximum number of iterations | |
OPENAI_API_KEY | Yes | Your OpenAI API key | |
REQUEST_TIMEOUT | No | Timeout for requests | |
MAX_INPUT_LENGTH | No | Maximum input length | |
PROFILE_DIRECTORY | No | Profile directory | |
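As a rough illustration, the sketch below shows how a launcher script might read these variables in Python. The numeric types (integers for token and iteration counts, floats for temperature and timeout) are assumptions; only the MCP_MODE default of streamable-http comes from the table above.

```python
import os

# Hypothetical configuration loader; variable names mirror the table above,
# while the numeric parsing is an assumption.
config = {
    "dir": os.environ.get("DIR"),
    "model": os.environ.get("MODEL"),
    # streamable-http is the documented default transport
    "mcp_mode": os.environ.get("MCP_MODE", "streamable-http"),
    "max_tokens": int(os.environ["MAX_TOKENS"]) if "MAX_TOKENS" in os.environ else None,
    "temperature": float(os.environ["TEMPERATURE"]) if "TEMPERATURE" in os.environ else None,
    "user_data_dir": os.environ.get("USER_DATA_DIR"),
    "max_iterations": int(os.environ["MAX_ITERATIONS"]) if "MAX_ITERATIONS" in os.environ else None,
    "openai_api_key": os.environ["OPENAI_API_KEY"],  # required; raises KeyError if unset
    "request_timeout": float(os.environ["REQUEST_TIMEOUT"]) if "REQUEST_TIMEOUT" in os.environ else None,
    "max_input_length": int(os.environ["MAX_INPUT_LENGTH"]) if "MAX_INPUT_LENGTH" in os.environ else None,
    "profile_directory": os.environ.get("PROFILE_DIRECTORY"),
}

print(config)
```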
Schema
Prompts
Interactive templates invoked by user choice.
No prompts are defined by this server.
Resources
Contextual data attached and managed by the client.
No resources are defined by this server.
Tools
Functions exposed to the LLM to take actions.
No tools are defined by this server.
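For reference, a minimal client sketch is shown below. It assumes the official MCP Python SDK (the mcp package) and uses a placeholder URL for the streamable-http endpoint; it simply connects and lists the (currently empty) prompts, resources, and tools.

```python
import asyncio

# Assumes the official MCP Python SDK ("mcp" package) with streamable HTTP support.
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

# Placeholder endpoint; substitute the actual host, port, and path of your deployment.
SERVER_URL = "http://localhost:8000/mcp"


async def main() -> None:
    async with streamablehttp_client(SERVER_URL) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # This server currently exposes no prompts, resources, or tools,
            # so each listing below is expected to be empty.
            print(await session.list_prompts())
            print(await session.list_resources())
            print(await session.list_tools())


asyncio.run(main())
```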