Deep Thinking Assistant - Gemini MCP Server
A Gemini API-based MCP server that provides deep thinking and analysis. Works with AI Editor models to provide deeper analysis and insights.
Features
Problem analysis from multiple perspectives
Integrating critical and creative thinking
Practical and concrete proposals
Integrating existing knowledge and providing new perspectives
Context-sensitive and accurate granularity
Critical analysis of the proposed solution and suggestions for improvement
Project Structure
Setup
Install dependencies:
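For example, assuming a standard Python project with a requirements.txt (the file name is an assumption, not stated in this README):

```
pip install -r requirements.txt
```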
Set environment variables: Create a .env file with the following content:
How to use
Start the server:
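For example, assuming the entry point is a server.py at the project root (the file name is an assumption):

```
python server.py
```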
Available Tools
deep_thinking_agent
Deepens the thought process for problem solving and offers useful perspectives. This tool provides a deeper understanding and multi-faceted analysis of the problem, along with guidelines for arriving at better solutions.
Parameters:
instructions: Instructions from the user (required)
context: The context of your thought process (required)
model: The model name to use (default: "gemini-2.0-flash")
enhancement_agent
Analyzes your code and provides practical suggestions for improvement. This tool performs a comprehensive analysis of your code in terms of quality, performance, maintainability, and more, and provides actionable improvement suggestions.
Parameters:
instructions: Instructions for the code to be reviewed (required)
code: A list of code snippets (required)
model: The model name to use (default: "gemini-2.0-flash")
temperature: Temperature for generation (default: 0.7)
final_review_agent
Performs a final code review and suggests improvements. The tool critically analyzes the proposed changes and improvements to identify potential issues and opportunities for further optimization.
Parameters:
instructions: Instructions for the code to be reviewed (required)
code: A list of code snippets (required)
model: The model name to use (default: "gemini-2.0-flash")
temperature: Temperature for generation (default: 0.7)
Usage Example
Deepening the thought process:
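A minimal sketch of calling deep_thinking_agent from the MCP Python client; the server entry point (server.py) and all argument values are illustrative assumptions:

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server over stdio (entry point name is an assumption).
    server = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "deep_thinking_agent",
                arguments={
                    "instructions": "Design a caching strategy for our API",
                    "context": "Read-heavy REST API, p95 latency spikes under load",
                    "model": "gemini-2.0-flash",
                },
            )
            print(result)

asyncio.run(main())
```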
Code improvement suggestions:
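Following the same call pattern, hypothetical arguments for enhancement_agent (values are illustrative):

```python
# Illustrative arguments for an enhancement_agent call.
enhancement_args = {
    "instructions": "Review this function for readability and performance",
    "code": ["def fib(n):\n    return n if n < 2 else fib(n - 1) + fib(n - 2)"],
    "model": "gemini-2.0-flash",
    "temperature": 0.7,
}
```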
Final Review:
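And for final_review_agent, again with illustrative values:

```python
# Illustrative arguments for a final_review_agent call on the revised code.
final_review_args = {
    "instructions": "Check the memoized version for remaining issues",
    "code": [
        "from functools import lru_cache\n\n"
        "@lru_cache(maxsize=None)\n"
        "def fib(n):\n"
        "    return n if n < 2 else fib(n - 1) + fib(n - 2)"
    ],
    "model": "gemini-2.0-flash",
    "temperature": 0.7,
}
```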
Default System Prompt
Thought-Support Prompts
The server helps you think along these lines:
Problem understanding and structured thinking
Understanding the big picture through systems thinking
Decomposing a problem using MECE
Causal analysis (why-why analysis, fishbone diagram)
Stakeholder analysis and requirements organization
Designing and Evaluating Solutions
Applying design patterns and architectural principles
Quantitative evaluation of trade-offs (cost vs. benefit)
Risk analysis and countermeasures (FMEA method)
Verification of feasibility (PoC strategy)
Pursuit of technical excellence
Clean Architecture principles: loose coupling and high cohesion, proper direction of dependencies, interface abstraction
Optimizing code quality: readability and maintainability, performance and scalability, security and robustness
Designing a test strategy: the test pyramid, boundary values and edge cases, automation and continuous verification
Innovation and Creative Thinking
Use Lateral Thinking
Idea development using the SCAMPER method
Creative problem solving using constraints
Integrating new technologies with legacy systems
Optimizing implementation and deployment
Phased Implementation Strategy
Technical Debt Management and Repayment Plans
Change impact analysis
Minimizing deployment risks
Continuous improvement and learning
Setting KPIs and metrics
Establishing a feedback loop
Systematizing and sharing knowledge
PDCA Cycle
Communication and collaboration
Technical clarification
Structuring the document
Knowledge sharing across teams
Facilitating reviews and feedback
Answer Analysis Prompt
Your responses will be analysed based on the following criteria:
Logical consistency and completeness
Validity of assumptions and constraints
Consistency of logical development
The process of drawing conclusions
Identifying overlooked elements
Falsifiability Test
Technical feasibility and optimality
Appropriateness of algorithms and data structures
Robustness of the system architecture
Performance and Scalability
Security and Reliability
Maintainability and Extensibility
Implementation and operation
Development Efficiency and Productivity
Operational burden and costs
Monitoring and troubleshooting
Versioning and Deployment
Effective team collaboration
Risks and challenges
Technical constraints and limitations
Security Vulnerabilities
Performance Bottlenecks
Dependency Complexity
Potential technical debt
Business Value and Impact
Development and operation costs
Time to market
Impact on user experience
Alignment with business requirements
Contributing to competitive advantage
The analysis results consist of:
Strengths of the proposal
Technical Advantages
Efficiency of implementation
Business Value
Innovative elements
Areas for improvement
Technical challenges
Implementation Risks
Operational concerns
Scalability Limitations
Specific improvement proposals
Short-term improvements
Mid- to long-term optimization
Alternative Approach
Applying best practices
Additional Considerations
Edge cases and exception handling
Future Scalability
Security Considerations
Performance Optimization
Implementation Roadmap
Task Prioritization
Setting Milestones
Define success metrics (KPIs)
Risk Mitigation Strategies