## Server Configuration

The environment variables below configure the server. All of them are optional; when a variable is unset, the default shown applies.
Name | Required | Description | Default |
---|---|---|---|
SEED | No | Random seed for reproducible results | 42 |
DEBUG | No | Enable debug mode | false |
TOP_K | No | Top-k sampling parameter (1-100) | 40 |
TOP_P | No | Top-p sampling parameter (0.0-1.0) | 0.9 |
CACHE_TTL | No | Cache time-to-live in seconds | 3600 |
PYTHONPATH | No | Python path for module imports | |
TEMPERATURE | No | Model temperature setting (0.0-2.0) | 0.7 |
ENABLE_VISION | No | Enable vision processing capabilities | false |
LLAMA_API_KEY | No | API key for Ollama authentication, if the Ollama server requires one | |
LLAMA_API_URL | No | URL for the Ollama API server | http://localhost:11434 |
MCP_LOG_LEVEL | No | Logging level for the MCP server | INFO |
CACHE_MAX_SIZE | No | Maximum cache size (number of entries) | 1000 |
REPEAT_PENALTY | No | Repetition penalty for text generation | 1.1 |
MCP_SERVER_HOST | No | Host address for the MCP server | localhost |
MCP_SERVER_PORT | No | Port number for the MCP server | 3000 |
VERBOSE_LOGGING | No | Enable verbose logging | false |
ENABLE_STREAMING | No | Enable streaming support for real-time token generation | true |
LLAMA_MODEL_NAME | No | Name of the Llama model to use | llama3:latest |
ALLOW_FILE_WRITES | No | Allow file write operations | true |
ENABLE_WEB_SEARCH | No | Enable web search functionality | true |
MAX_CONTEXT_LENGTH | No | Maximum context length for the model, in tokens | 128000 |
REQUEST_TIMEOUT_MS | No | Request timeout in milliseconds | 30000 |
ENABLE_CODE_EXECUTION | No | Enable code execution (security risk) | false |
FILE_SYSTEM_BASE_PATH | No | Base path for file system operations | |
ENABLE_FUNCTION_CALLING | No | Enable function calling capabilities | true |
MAX_CONCURRENT_REQUESTS | No | Maximum number of concurrent requests | 10 |
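As a minimal sketch, the table above could be applied by exporting the variables in a shell (or placing the same `NAME=value` lines in a `.env` file). The values here are illustrative, not recommendations:

```shell
# All variables are optional; the defaults from the table apply when unset.
export LLAMA_API_URL=http://localhost:11434   # Ollama API endpoint
export LLAMA_MODEL_NAME=llama3:latest         # model to use
export MCP_SERVER_HOST=localhost
export MCP_SERVER_PORT=3000
export MCP_LOG_LEVEL=DEBUG                    # raise verbosity while developing
export TEMPERATURE=0.7                        # sampling settings (0.0-2.0)
export TOP_P=0.9
export TOP_K=40
export REQUEST_TIMEOUT_MS=30000
export ENABLE_CODE_EXECUTION=false            # keep disabled unless sandboxed (security risk)
```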
## Schema
### Prompts
Interactive templates invoked by user choice
_No prompts are defined._
### Resources
Contextual data attached and managed by the client
_No resources are defined._
### Tools
Functions exposed to the LLM to take actions
_No tools are defined._