Server Configuration
The following environment variables configure the server; only QDRANT_URL is marked as required.
Name | Required | Description | Default |
---|---|---|---|
QDRANT_URL | Yes | URL of your Qdrant server | http://localhost:6333 |
OLLAMA_MODEL | No | Ollama model to use for embeddings | nomic-embed-text |
OPENAI_API_KEY | No | Your OpenAI API key for embedding generation | |
QDRANT_API_KEY | No | API key for your Qdrant server if authentication is enabled | |
OLLAMA_ENDPOINT | No | Ollama server endpoint for local embedding models | http://localhost:11434 |
OPENAI_ENDPOINT | No | OpenAI API endpoint | https://api.openai.com/v1 |
OPENROUTER_API_KEY | No | Your OpenRouter API key for embedding generation | |
OPENROUTER_ENDPOINT | No | OpenRouter API endpoint | https://api.openrouter.com/v1 |
DEFAULT_EMBEDDING_SERVICE | No | Default embedding service to use (ollama, openai, openrouter, fastembed) | |
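As an illustration only, the sketch below (not taken from the server's codebase) shows how these variables could be read in Python, applying the defaults listed in the table. The `load_server_config` helper and its dictionary layout are hypothetical.

```python
import os


def load_server_config() -> dict:
    """Hypothetical helper: collect the documented settings and apply the listed defaults."""
    return {
        # Qdrant connection (QDRANT_URL is the only required variable)
        "qdrant_url": os.environ.get("QDRANT_URL", "http://localhost:6333"),
        "qdrant_api_key": os.environ.get("QDRANT_API_KEY"),  # optional, only if auth is enabled
        # Ollama (local embeddings)
        "ollama_endpoint": os.environ.get("OLLAMA_ENDPOINT", "http://localhost:11434"),
        "ollama_model": os.environ.get("OLLAMA_MODEL", "nomic-embed-text"),
        # OpenAI
        "openai_api_key": os.environ.get("OPENAI_API_KEY"),  # optional
        "openai_endpoint": os.environ.get("OPENAI_ENDPOINT", "https://api.openai.com/v1"),
        # OpenRouter
        "openrouter_api_key": os.environ.get("OPENROUTER_API_KEY"),  # optional
        "openrouter_endpoint": os.environ.get("OPENROUTER_ENDPOINT", "https://api.openrouter.com/v1"),
        # One of: ollama, openai, openrouter, fastembed (no default documented)
        "default_embedding_service": os.environ.get("DEFAULT_EMBEDDING_SERVICE"),
    }


if __name__ == "__main__":
    print(load_server_config())
```

Reading the endpoints and API keys up front like this makes it straightforward to select among the ollama, openai, openrouter, and fastembed embedding backends at startup based on DEFAULT_EMBEDDING_SERVICE.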
Schema
Prompts
Interactive templates invoked by user choice
Name | Description |
---|---|
No prompts |
Resources
Contextual data attached and managed by the client
Name | Description |
---|---|
No resources |
Tools
Functions exposed to the LLM to take actions
Name | Description |
---|---|
No tools |