## Server Configuration

Environment variables used to configure the server.
| Name | Required | Description | Default |
|---|---|---|---|
| OXIDE_AUTO_START_WEB | No | When set to `true`, the Web UI starts automatically when the MCP server launches. | false |
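For example, to enable the Web UI for the current shell session before launching the server (the launch command itself is not shown here; substitute your actual entry point):

```shell
# Enable auto-start of the Web UI; the MCP server reads this at launch.
export OXIDE_AUTO_START_WEB=true
# Confirm the variable is exported and visible to child processes.
echo "OXIDE_AUTO_START_WEB=$OXIDE_AUTO_START_WEB"
```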
## Tools

Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| `route_task` | Intelligently route a task to the best LLM. Analyzes the task's characteristics and automatically selects the most appropriate LLM service (e.g. Gemini for large codebases, Qwen for code review).<br>**Args:**<br>`prompt`: task description or query<br>`files`: optional list of file paths to include as context<br>`preferences`: optional routing preferences |
| `analyze_parallel` | Analyze a large codebase in parallel across multiple LLMs. Distributes files across multiple LLM services for faster analysis; ideal for codebases with 20+ files.<br>**Args:**<br>`directory`: directory path to analyze<br>`prompt`: analysis prompt/query<br>`num_workers`: number of parallel workers (default: 3) |
| `list_services` | Check the health and availability of all configured LLM services. Returns status information for each service, including health (available/unavailable), service type (CLI/HTTP), and routing-rules configuration. |
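As a sketch of how a client invokes one of these tools, the following builds an MCP `tools/call` JSON-RPC request for `route_task`. The argument values are illustrative, and the shape of `preferences` is an assumption not specified by the table above:

```shell
# Hypothetical tools/call request for route_task (JSON-RPC 2.0, as sent over MCP stdio).
# The "arguments" keys mirror the Args in the table; values are examples only,
# and the preferences object's shape is assumed.
payload='{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "route_task",
    "arguments": {
      "prompt": "Review the error handling in this module",
      "files": ["src/server.py"],
      "preferences": {"service": "qwen"}
    }
  }
}'
echo "$payload"
```

The server would respond with a standard JSON-RPC result containing the routed model's output.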
## Prompts

Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| No prompts | |
## Resources

Contextual data attached and managed by the client.
| Name | Description |
|---|---|
| No resources | |