## Server Configuration

Describes the environment variables used to configure the server. Each API key is required only when its corresponding model is used.
| Name | Required | Description | Default |
|---|---|---|---|
| GEMINI_MODE | No | Selects API or CLI mode for Gemini models. Options: api, cli | api |
| GEMINI_API_KEY | No | Your Google AI API key (required for Gemini models in API mode) | |
| OPENAI_API_KEY | No | Your OpenAI API key (required for o3) | |
| DEEPSEEK_API_KEY | No | Your DeepSeek API key (required for DeepSeek models) | |
| CONSULT_LLM_DEFAULT_MODEL | No | Overrides the default model. Options: o3, gemini-2.5-pro, deepseek-reasoner | o3 |
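For example, when launching the server from a shell, the variables might be exported like this (a sketch; the key values are placeholders, not real credentials):

```shell
export GEMINI_MODE=api                        # or "cli" to use the Gemini CLI
export GEMINI_API_KEY="your-google-ai-key"    # needed only for Gemini models in API mode
export OPENAI_API_KEY="your-openai-key"       # needed only for o3
export DEEPSEEK_API_KEY="your-deepseek-key"   # needed only for DeepSeek models
export CONSULT_LLM_DEFAULT_MODEL=o3           # o3 | gemini-2.5-pro | deepseek-reasoner
```

Only the keys for the models you actually call need to be set.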
## Schema
### Prompts

Interactive templates invoked by user choice.
_No prompts._
### Resources

Contextual data attached and managed by the client.
_No resources._
### Tools

Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| consult_llm | Ask a more powerful AI for help with complex problems. Provide your question in the prompt field and optionally include relevant code files as context. Be specific about what you want: code implementation, code review, bug analysis, architecture advice, etc. IMPORTANT: Ask neutral, open-ended questions. Avoid suggesting specific solutions or alternatives in your prompt as this can bias the analysis. Instead of "Should I use X or Y approach?", ask "What's the best approach for this problem?" Let the consultant LLM provide unbiased recommendations. |
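A call to consult_llm could look like the following MCP tools/call request. This is a sketch: the "prompt" field is named in the tool description above, but the "files" argument name for attaching code context is an assumption, not confirmed by this document.

```shell
# Hypothetical tools/call payload; "files" is an assumed argument name.
request='{
  "method": "tools/call",
  "params": {
    "name": "consult_llm",
    "arguments": {
      "prompt": "What is the best approach for caching these API responses?",
      "files": ["src/cache.ts"]
    }
  }
}'
printf '%s\n' "$request"
```

Note the neutral, open-ended phrasing of the prompt, as the tool description recommends.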