View available AI models and their specializations, or manually override model selection to match specific query types like research, reasoning, or general search.
List all Ollama models available for querying through the Multi-Model Advisor, so users can choose the models best suited to the insights they need.
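As a rough sketch of how such a model-listing tool might be wired up, the snippet below assumes the TypeScript MCP SDK (`@modelcontextprotocol/sdk`), Ollama's standard `/api/tags` endpoint on its default local port, and a hypothetical tool name `list_models`; none of these specifics are confirmed by the description above.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

const server = new McpServer({ name: "multi-model-advisor", version: "0.1.0" });

// Hypothetical tool: list the models installed in a local Ollama instance.
server.tool(
  "list_models",
  "List all Ollama models available for querying",
  async () => {
    // Ollama exposes installed models at /api/tags by default.
    const res = await fetch("http://localhost:11434/api/tags");
    const data = (await res.json()) as { models: { name: string }[] };
    return {
      content: [
        { type: "text", text: data.models.map((m) => m.name).join("\n") },
      ],
    };
  }
);

const transport = new StdioServerTransport();
await server.connect(transport);
```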
A minimal MCP server that provides Claude models with a "think" tool, improving performance on complex reasoning tasks by allowing the model to pause during response generation for additional thinking steps.
An MCP server that provides a "think" tool enabling structured reasoning for AI agents, allowing them to pause and record explicit thoughts during complex tasks or multi-step tool use.
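A minimal sketch of what a "think" tool server could look like, assuming the TypeScript MCP SDK with zod schemas; the parameter name `thought` and the in-memory log are illustrative assumptions, not details taken from these servers.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "think-tool", version: "0.1.0" });

// The "think" tool does no external work: it records a thought and echoes it
// back, giving the model a dedicated place to reason explicitly mid-task.
const thoughts: string[] = [];

server.tool(
  "think",
  "Pause and record a thought while working through a complex task",
  { thought: z.string().describe("The thought to record") },
  async ({ thought }) => {
    thoughts.push(thought);
    return { content: [{ type: "text", text: `Thought recorded: ${thought}` }] };
  }
);

const transport = new StdioServerTransport();
await server.connect(transport);
```

The handler deliberately has no side effects beyond logging; the benefit comes from the structured pause itself rather than from anything the tool computes.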
Enables Claude to use Google Gemini as a secondary AI through MCP for large-scale codebase analysis and complex reasoning tasks. Supports both Gemini Flash and Pro models with specialized functions for general queries and comprehensive code analysis.
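A sketch of how a Gemini-bridging tool might look, assuming the TypeScript MCP SDK and the `@google/generative-ai` client; the tool name `ask_gemini`, the `GEMINI_API_KEY` environment variable, and the specific model identifiers are assumptions for illustration.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { GoogleGenerativeAI } from "@google/generative-ai";
import { z } from "zod";

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY ?? "");
const server = new McpServer({ name: "gemini-bridge", version: "0.1.0" });

// Hypothetical tool that forwards a prompt to Gemini and returns its answer,
// letting the caller choose between a Flash and a Pro model.
server.tool(
  "ask_gemini",
  "Send a general query or code-analysis prompt to Google Gemini",
  {
    prompt: z.string().describe("The question or code to analyze"),
    model: z
      .enum(["gemini-1.5-flash", "gemini-1.5-pro"])
      .default("gemini-1.5-flash"),
  },
  async ({ prompt, model }) => {
    const gemini = genAI.getGenerativeModel({ model });
    const result = await gemini.generateContent(prompt);
    return { content: [{ type: "text", text: result.response.text() }] };
  }
);

const transport = new StdioServerTransport();
await server.connect(transport);
```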