# Server Configuration

Environment variables used to configure the server. All variables are optional except `MCP_AUTH_TOKEN`.
| Name | Required | Description | Default |
|---|---|---|---|
| MCP_AUTH_TOKEN | Yes | Bearer token clients must present to authenticate | |
| HOST | No | Server host | 0.0.0.0 |
| PORT | No | Server port | 3100 |
| MCP_ALLOWED_IPS | No | Comma-separated list of allowed client IPs | 127.0.0.1,::1 |
| DB_PATH | No | SQLite database path | ./data/context-hub.db |
| LOG_LEVEL | No | Log level (debug, info, warn, error) | info |
| CACHE_TTL_MS | No | Cache TTL in milliseconds | 300000 (5 minutes) |
| CACHE_MAX_ENTRIES | No | Maximum number of cache entries | 100 |
| OLLAMA_BASE_URL | No | Ollama API URL | http://localhost:11434 |
| PRIMARY_MODEL | No | Primary chat model | llama3.1:8b-instruct-q4_K_M |
| FALLBACK_MODEL | No | Fallback chat model | qwen2.5:7b-instruct-q4_K_M |
| EMBEDDING_MODEL | No | Embedding model | nomic-embed-text:v1.5 |
| PROXY_SERVERS | No | Sub-MCP server configurations (JSON string) | {} |
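As a sketch of how these variables might be consumed, the loader below reads each one, applies the documented defaults, and fails fast when `MCP_AUTH_TOKEN` is missing. The `ServerConfig` shape and the `loadConfig` name are illustrative assumptions, not the server's actual API.

```typescript
// Hypothetical config shape mirroring the table above.
interface ServerConfig {
  host: string;
  port: number;
  dbPath: string;
  logLevel: "debug" | "info" | "warn" | "error";
  cacheTtlMs: number;
  cacheMaxEntries: number;
  authToken: string;
  allowedIps: string[];
  ollamaBaseUrl: string;
  primaryModel: string;
  fallbackModel: string;
  embeddingModel: string;
  proxyServers: Record<string, unknown>;
}

// Reads the environment map, applying the defaults from the table.
// Throws if the one required variable, MCP_AUTH_TOKEN, is absent.
function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  const authToken = env.MCP_AUTH_TOKEN;
  if (!authToken) {
    throw new Error("MCP_AUTH_TOKEN is required");
  }
  return {
    host: env.HOST ?? "0.0.0.0",
    port: Number(env.PORT ?? "3100"),
    dbPath: env.DB_PATH ?? "./data/context-hub.db",
    logLevel: (env.LOG_LEVEL ?? "info") as ServerConfig["logLevel"],
    cacheTtlMs: Number(env.CACHE_TTL_MS ?? "300000"),
    cacheMaxEntries: Number(env.CACHE_MAX_ENTRIES ?? "100"),
    authToken,
    allowedIps: (env.MCP_ALLOWED_IPS ?? "127.0.0.1,::1").split(","),
    ollamaBaseUrl: env.OLLAMA_BASE_URL ?? "http://localhost:11434",
    primaryModel: env.PRIMARY_MODEL ?? "llama3.1:8b-instruct-q4_K_M",
    fallbackModel: env.FALLBACK_MODEL ?? "qwen2.5:7b-instruct-q4_K_M",
    embeddingModel: env.EMBEDDING_MODEL ?? "nomic-embed-text:v1.5",
    proxyServers: JSON.parse(env.PROXY_SERVERS ?? "{}"),
  };
}
```

For example, `loadConfig({ MCP_AUTH_TOKEN: "secret" })` yields a config with every default applied, while `loadConfig({})` throws.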
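Together, `MCP_AUTH_TOKEN` and `MCP_ALLOWED_IPS` imply a request gate: a request passes only if it originates from an allowed IP and carries the expected bearer token. The helper below sketches that check under those assumptions; `isAuthorized` is a hypothetical name, not part of the server's API.

```typescript
// Hypothetical request gate: both the IP allowlist and the bearer token
// must match for a request to be accepted.
function isAuthorized(
  authorizationHeader: string | undefined,
  clientIp: string,
  expectedToken: string,
  allowedIps: string[],
): boolean {
  // Reject clients outside the allowlist before checking credentials.
  if (!allowedIps.includes(clientIp)) return false;
  // Expect the standard "Authorization: Bearer <token>" header value.
  return authorizationHeader === `Bearer ${expectedToken}`;
}
```

For example, `isAuthorized("Bearer s3cr3t", "127.0.0.1", "s3cr3t", ["127.0.0.1", "::1"])` returns `true`, while a wrong token or an unlisted IP returns `false`.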
# Capabilities

Server capabilities have not been inspected yet.
## Tools

Functions the LLM can call to take actions.

| Name | Description |
|---|---|
| *No tools reported* | |
## Prompts

Interactive templates invoked by user choice.

| Name | Description |
|---|---|
| *No prompts reported* | |
## Resources

Contextual data attached to and managed by the client.

| Name | Description |
|---|---|
| *No resources reported* | |