Server Configuration
Describes the environment variables used to configure the server. All are optional; the defaults below apply when a variable is unset.
| Name | Required | Description | Default |
|---|---|---|---|
| HOST | No | Host address the server binds to | 0.0.0.0 |
| PORT | No | Port number the server listens on | 8000 |
| DEBUG | No | Enable debug mode (True/False) | False |
| API_PREFIX | No | Path prefix for API endpoints | /link-scan |
| OLLAMA_MODEL | No | Name of the Ollama model to use | llama3:latest |
| OLLAMA_API_URL | No | URL of the Ollama API server | http://localhost:11434 |
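The variables above can be read with plain environment lookups. The sketch below is a minimal, hypothetical example of loading this configuration in Python, using the documented defaults; the `load_config` helper is an illustration, not part of the server's actual code.

```python
import os


def load_config() -> dict:
    """Read server settings from the environment, falling back to
    the documented defaults when a variable is unset."""
    return {
        "host": os.getenv("HOST", "0.0.0.0"),
        "port": int(os.getenv("PORT", "8000")),
        # Any casing of "true" enables debug mode; everything else disables it.
        "debug": os.getenv("DEBUG", "False").lower() == "true",
        "api_prefix": os.getenv("API_PREFIX", "/link-scan"),
        "ollama_model": os.getenv("OLLAMA_MODEL", "llama3:latest"),
        "ollama_api_url": os.getenv("OLLAMA_API_URL", "http://localhost:11434"),
    }
```

To override a default, set the variable before launching the server, e.g. `PORT=9000 DEBUG=True` in the process environment.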
Capabilities
Server capabilities have not been inspected yet.
Tools
Functions exposed to the LLM to take actions
| Name | Description |
|---|---|
| No tools | |
Prompts
Interactive templates invoked by user choice
| Name | Description |
|---|---|
| No prompts | |
Resources
Contextual data attached and managed by the client
| Name | Description |
|---|---|
| No resources | |