taylorleese / mcp-toolz

ask_gemini

Ask Google Gemini a question about a specific context entry, or request a general second opinion to aid understanding and decision-making.

Instructions

Ask Google Gemini a question about a context entry, or get a general second opinion

Input Schema

Name         Required   Description                                                                                           Default
context_id   Yes        Context ID to ask about                                                                               —
question     No         Optional specific question to ask about the context. If not provided, gets a general second opinion.  —
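
For illustration, calling this tool from the MCP Python SDK might look like the sketch below. The example is hypothetical: `session` stands for an already-initialized mcp.ClientSession, and the context ID and question text are invented.

    # Hypothetical client calls; `session` is an initialized mcp.ClientSession
    # and "ctx-123" is an invented context ID.

    # General second opinion (no question): the server also persists the reply.
    result = await session.call_tool("ask_gemini", {"context_id": "ctx-123"})

    # Targeted question: the reply is returned but not persisted.
    result = await session.call_tool(
        "ask_gemini",
        {"context_id": "ctx-123", "question": "Does this plan handle concurrent writes?"},
    )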

Implementation Reference

  • Main MCP tool handler for 'ask_gemini': retrieves the context by ID, calls GeminiClient.get_second_opinion, persists the response only when no specific question was asked, and returns the formatted result. (A sketch of the storage interface it assumes appears after this list.)
    if name == "ask_gemini":
        context_id = arguments["context_id"]
        question = arguments.get("question")
        context = self.storage.get_context(context_id)
        if not context:
            return [TextContent(type="text", text=f"Context {context_id} not found")]
    
        try:
            gemini_client = GeminiClient()
            response = gemini_client.get_second_opinion(context, question)
    
            # Only save to database if it's a generic second opinion (no custom question)
            if not question:
                self.storage.update_gemini_response(context_id, response)
    
            header = "Gemini's Answer:" if question else "Gemini's Opinion:"
            return [TextContent(type="text", text=f"{header}\n\n{response}")]
        except ValueError as e:
            return [TextContent(type="text", text=f"Error: {e}")]
  • Input schema definition for the 'ask_gemini' tool in list_tools(), specifying a required context_id and an optional question.
    Tool(
        name="ask_gemini",
        description="Ask Google Gemini a question about a context entry, or get a general second opinion",
        inputSchema={
            "type": "object",
            "properties": {
                "context_id": {"type": "string", "description": "Context ID to ask about"},
                "question": {
                    "type": "string",
                    "description": (
                        "Optional specific question to ask about the context. If not provided, gets a general second opinion."
                    ),
                },
            },
            "required": ["context_id"],
        },
    ),
  • Core helper method on GeminiClient that formats the context, selects a system instruction based on whether a question was asked, generates content via the Gemini API, and returns the response text. (A sketch of the assumed constructor appears after this list.)
    def get_second_opinion(self, context: ContextEntry, question: str | None = None) -> str:
        """Get Gemini's second opinion on a context, or answer a specific question.

        Args:
            context: The context entry to analyze
            question: Optional specific question to ask. If None, provides general second opinion.
        """
        if question:
            # Custom question mode
            system_instruction = """You are a senior software engineering consultant answering questions about code, \
architecture decisions, and implementation plans.

Provide clear, actionable answers based on the context provided."""
            user_content = self._format_context_for_gemini(context, question)
        else:
            # Generic second opinion mode
            system_instruction = """You are a senior software engineering consultant providing second opinions on code, \
architecture decisions, and implementation plans.

Your role is to:
- Provide constructive, balanced feedback
- Highlight both strengths and potential issues
- Suggest alternatives when appropriate
- Point out edge cases or security concerns
- Be concise but thorough

Format your response clearly with sections as needed."""
            user_content = self._format_context_for_gemini(context)

        # Configure model with system instruction
        model_with_instruction = genai.GenerativeModel(self.model_name, system_instruction=system_instruction)

        # Use request_options to set timeout
        response = model_with_instruction.generate_content(user_content, request_options={"timeout": self.timeout})

        return str(response.text)
  • Helper method that formats the ContextEntry into a prompt string for Gemini, including the title, type, content sections, and either the question or a default second-opinion request. (A sketch of the assumed ContextEntry model appears after this list.)
    def _format_context_for_gemini(self, context: ContextEntry, question: str | None = None) -> str:
        """Format a context entry for Gemini consumption.
    
        Args:
            context: The context entry to format
            question: Optional specific question to append
        """
        parts = [
            f"# Context: {context.title}",
            f"\n**Type:** {context.type}",
            f"**Timestamp:** {context.timestamp.isoformat()}",
        ]
    
        if context.tags:
            parts.append(f"**Tags:** {', '.join(context.tags)}")
    
        parts.append("\n## Content\n")
    
        # Add specific content based on type
        if context.content.messages:
            parts.append("### Conversation\n")
            for msg in context.content.messages:
                parts.append(msg)
    
        if context.content.code:
            parts.append("### Code\n")
            for file_path, code in context.content.code.items():
                parts.append(f"**File:** `{file_path}`\n```\n{code}\n```\n")
    
        if context.content.suggestions:
            parts.append(f"### Suggestion\n{context.content.suggestions}\n")
    
        if context.content.errors:
            parts.append(f"### Error/Debug Info\n```\n{context.content.errors}\n```\n")
    
        # Add question or default request
        if question:
            parts.append(f"\n---\n**Question:** {question}")
        else:
            parts.append("\n---\nPlease provide a second opinion on the above context.")
    
        return "\n".join(parts)
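
The storage layer referenced by the tool handler is not included in this excerpt. The sketch below is an inferred interface, not the actual implementation: the method names come from the handler code above, while the signatures are assumptions.

    from typing import Protocol

    class ContextStorage(Protocol):
        """Assumed shape of self.storage as used by the ask_gemini handler."""

        def get_context(self, context_id: str) -> "ContextEntry | None":
            """Return the stored entry, or None when the ID is unknown."""
            ...

        def update_gemini_response(self, context_id: str, response: str) -> None:
            """Persist Gemini's generic second opinion for this entry."""
            ...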
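
GeminiClient's constructor is likewise not shown; get_second_opinion only assumes that self.model_name and self.timeout are set and that the API key has been configured. A minimal sketch under those assumptions (the GEMINI_API_KEY variable name and both defaults are illustrative, not confirmed by this page):

    import os

    import google.generativeai as genai

    class GeminiClient:
        def __init__(self, model_name: str = "gemini-1.5-pro", timeout: int = 60):
            # A missing key raises ValueError, which the tool handler above catches.
            api_key = os.environ.get("GEMINI_API_KEY")
            if not api_key:
                raise ValueError("GEMINI_API_KEY environment variable is not set")
            genai.configure(api_key=api_key)
            self.model_name = model_name  # model identifier (assumed default)
            self.timeout = timeout        # seconds, used via request_options above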
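
Finally, the ContextEntry model is not part of this excerpt either. The dataclass sketch below is inferred purely from the fields that _format_context_for_gemini reads: the field names match the code above, while the types and class layout are assumptions.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class ContextContent:
        messages: list[str] = field(default_factory=list)   # conversation turns
        code: dict[str, str] = field(default_factory=dict)  # file path -> source
        suggestions: str = ""
        errors: str = ""

    @dataclass
    class ContextEntry:
        title: str
        type: str             # entry category; concrete values are not documented here
        timestamp: datetime
        content: ContextContent
        tags: list[str] = field(default_factory=list)

    # Invented example entry, matching the sections the formatter emits:
    entry = ContextEntry(
        title="Retry logic refactor",
        type="code",
        timestamp=datetime(2024, 1, 15, 10, 30),
        content=ContextContent(code={"client.py": "def fetch(url): ..."}),
        tags=["retries", "http"],
    )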
