# Server Configuration

This section describes the environment variables used to configure the server.
| Name | Required | Description | Default |
|---|---|---|---|
| LLM_API_KEY | Yes | Your API key for the LLM service | (none) |
| LLM_MODEL | No | The LLM model to use | gpt-4o-mini |
| LLM_API_BASE_URL | No | Base URL for the LLM API | https://api.openai.com/v1 |
| LLM_TEMPERATURE | No | Temperature setting for LLM responses | 0.0 |
| LLM_REQUEST_TIMEOUT | No | Request timeout for LLM API calls, in seconds | 15 |
| LOG_LEVEL | No | Logging level for the application | INFO |
| FUN_FACTS_API_URL | No | URL for the fun facts API | https://uselessfacts.jsph.pl/random.json?language=en |
| GITHUB_TRENDING_URL | No | URL for the GitHub trending repositories API | https://api.ossinsight.io/v1/trends/repos/ |
| TECH_TRIVIA_API_URL | No | URL for the tech trivia API | https://opentdb.com/api.php?amount=1&category=18&type=multiple |
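As a sketch of how these settings might be consumed (the variable names and defaults come from the table above, but the loading code itself is illustrative, not the server's actual implementation), the configuration could be read in Python like so:

```python
import os

# Placeholder so this example runs standalone; in a real deployment the
# actual key must be supplied in the environment.
os.environ.setdefault("LLM_API_KEY", "sk-example")

# Required setting: fail fast with a KeyError if it is missing.
LLM_API_KEY = os.environ["LLM_API_KEY"]

# Optional settings, falling back to the defaults listed in the table.
LLM_MODEL = os.environ.get("LLM_MODEL", "gpt-4o-mini")
LLM_API_BASE_URL = os.environ.get("LLM_API_BASE_URL", "https://api.openai.com/v1")
LLM_TEMPERATURE = float(os.environ.get("LLM_TEMPERATURE", "0.0"))
LLM_REQUEST_TIMEOUT = int(os.environ.get("LLM_REQUEST_TIMEOUT", "15"))
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")
```

Note that numeric values arrive as strings and need explicit conversion (`float`, `int`) before use.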
## Schema

### Prompts

Interactive templates invoked by user choice.
No prompts are currently defined.
### Resources

Contextual data attached and managed by the client.

No resources are currently defined.
### Tools

Functions exposed to the LLM to take actions.

No tools are currently defined.