Integrates the 30B-parameter Qwen3-Coder model with Claude Code through five specialized tools for code review, explanation, generation, bug fixing, and optimization. Tuned for 64GB RAM systems with performance settings such as flash attention and parallel processing.
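A minimal sketch of how such a tool might call the model, assuming it is served locally through Ollama; the model tag `qwen3-coder:30b`, the prompt wording, and the context-size option are illustrative assumptions, not details confirmed by the entry. Flash attention and parallel request handling are server-side Ollama settings (`OLLAMA_FLASH_ATTENTION`, `OLLAMA_NUM_PARALLEL`), so the client only issues prompts:

```python
# Sketch: a "code review" style call against a locally served Qwen3-Coder model.
# Assumes Ollama is already running, e.g. started with:
#   OLLAMA_FLASH_ATTENTION=1 OLLAMA_NUM_PARALLEL=2 ollama serve
# The model tag "qwen3-coder:30b" is an assumption for illustration.
import requests

def review_code(snippet: str) -> str:
    """Ask the local model for a short review of `snippet`."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "qwen3-coder:30b",
            "prompt": f"Review this code and list concrete issues:\n\n{snippet}",
            "stream": False,
            "options": {"num_ctx": 8192},  # context window; tune for available RAM
        },
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(review_code("def add(a, b): return a - b"))
```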
Routes AI tasks to the appropriate local LLM (quick, coder, MoE, or thinking) with automatic model selection, multi-backend support (Ollama, llama.cpp, Gemini), and parallel processing.
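As a rough illustration of this kind of routing, the sketch below picks a model category from the task text and dispatches to a single Ollama backend. The model tags, the keyword heuristic, and the choice of one backend are assumptions for illustration only, not the server's actual selection logic:

```python
# Sketch of task-to-model routing against an Ollama backend.
# Model tags and the keyword heuristic are illustrative assumptions.
import requests

MODELS = {
    "quick": "llama3.2:3b",        # small, low-latency model
    "coder": "qwen2.5-coder:7b",   # code-oriented model
    "moe": "qwen3:30b-a3b",        # mixture-of-experts model
    "thinking": "deepseek-r1:8b",  # long-form reasoning model
}

def pick_model(task: str) -> str:
    """Very rough heuristic: map the task text to a model category."""
    t = task.lower()
    if any(k in t for k in ("code", "refactor", "bug", "test")):
        return MODELS["coder"]
    if any(k in t for k in ("plan", "prove", "why", "reason")):
        return MODELS["thinking"]
    if len(task) > 500:
        return MODELS["moe"]
    return MODELS["quick"]

def run(task: str) -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": pick_model(task), "prompt": task, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```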
An MCP server that equips AI assistants with software-development tools, enabling research, planning, code generation, and project scaffolding through natural-language interaction.
A utility toolkit that extends Claude's code-interaction capabilities with tools for Java code analysis, manipulation, and testing workflows.
Provides access to multiple AI models (Grok, Gemini 2.5 Pro, Kimi, Qwen3 Coder, GLM-4.5) through OpenRouter, letting users query specialized models for reasoning, coding, translation, and general tasks, with automatic fallback to free variants.
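A hedged sketch of the fallback pattern this describes: query OpenRouter's OpenAI-compatible chat endpoint with a primary model ID and, if the request fails (for example due to rate limits or exhausted credits), retry the `:free` variant. The specific model ID and the retry condition are assumptions for illustration:

```python
# Sketch: query a model via OpenRouter, falling back to its ":free" variant.
# Model IDs and the fallback trigger are illustrative assumptions.
import os
import requests

API_URL = "https://openrouter.ai/api/v1/chat/completions"
HEADERS = {"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"}

def ask(model: str, prompt: str) -> str:
    """Send one chat completion request to OpenRouter."""
    resp = requests.post(
        API_URL,
        headers=HEADERS,
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

def ask_with_free_fallback(model: str, prompt: str) -> str:
    """Try the standard model first; retry the ':free' variant on an HTTP error."""
    try:
        return ask(model, prompt)
    except requests.HTTPError:
        return ask(f"{model}:free", prompt)

if __name__ == "__main__":
    print(ask_with_free_fallback("qwen/qwen3-coder", "Write a binary search in Go."))
```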