ask_chatgpt

Get ChatGPT's perspective on saved context entries to verify information or obtain alternative insights for decision-making.

Instructions

Ask ChatGPT a question about a context entry, or get a general second opinion

Input Schema

Name       | Required | Description                                                                                           | Default
context_id | Yes      | Context ID to ask about                                                                               | –
question   | No       | Optional specific question to ask about the context. If not provided, gets a general second opinion. | –

Input Schema (JSON Schema)

{ "properties": { "context_id": { "description": "Context ID to ask about", "type": "string" }, "question": { "description": "Optional specific question to ask about the context. If not provided, gets a general second opinion.", "type": "string" } }, "required": [ "context_id" ], "type": "object" }

Implementation Reference

  • The main handler for the 'ask_chatgpt' tool within the call_tool method. It retrieves the context by ID, instantiates ChatGPTClient, calls get_second_opinion, persists the response to storage only when no custom question was asked (a generic second opinion), and returns the formatted response.
    if name == "ask_chatgpt":
        context_id = arguments["context_id"]
        question = arguments.get("question")

        context = self.storage.get_context(context_id)
        if not context:
            return [TextContent(type="text", text=f"Context {context_id} not found")]

        try:
            chatgpt_client = ChatGPTClient()
            response = chatgpt_client.get_second_opinion(context, question)

            # Only save to database if it's a generic second opinion (no custom question)
            if not question:
                self.storage.update_chatgpt_response(context_id, response)

            header = "ChatGPT's Answer:" if question else "ChatGPT's Opinion:"
            return [TextContent(type="text", text=f"{header}\n\n{response}")]
        except ValueError as e:
            return [TextContent(type="text", text=f"Error: {e}")]
  • Registration of the 'ask_chatgpt' tool in the list_tools method, including name, description, and input schema.
    Tool(
        name="ask_chatgpt",
        description="Ask ChatGPT a question about a context entry, or get a general second opinion",
        inputSchema={
            "type": "object",
            "properties": {
                "context_id": {"type": "string", "description": "Context ID to ask about"},
                "question": {
                    "type": "string",
                    "description": (
                        "Optional specific question to ask about the context. "
                        "If not provided, gets a general second opinion."
                    ),
                },
            },
            "required": ["context_id"],
        },
    ),
  • Core helper method in ChatGPTClient that constructs the appropriate system and user prompts depending on whether a specific question is provided, calls the OpenAI Chat Completions API, and returns the response text.
    def get_second_opinion(self, context: ContextEntry, question: str | None = None) -> str:
        """Get ChatGPT's second opinion on a context, or answer a specific question.

        Args:
            context: The context entry to analyze
            question: Optional specific question to ask. If None, provides general second opinion.
        """
        if question:
            # Custom question mode
            system_prompt = """You are a senior software engineering consultant answering questions about code, \
architecture decisions, and implementation plans. Provide clear, actionable answers based on the context provided."""
            user_content = self._format_context_for_chatgpt(context, question)
        else:
            # Generic second opinion mode
            system_prompt = """You are a senior software engineering consultant providing second opinions on code, \
architecture decisions, and implementation plans from Claude Code. Your role is to:
- Provide constructive, balanced feedback
- Highlight both strengths and potential issues
- Suggest alternatives when appropriate
- Point out edge cases or security concerns
- Be concise but thorough

Format your response clearly with sections as needed."""
            user_content = self._format_context_for_chatgpt(context)

        response = self.client.chat.completions.create(
            model=self.model,
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_content},
            ],
        )
        return response.choices[0].message.content or ""
  • Helper method that formats the ContextEntry into a markdown-structured prompt suitable for ChatGPT, including title, type, timestamp, tags, type-specific content (conversation, code, suggestions, errors), and either the specific question or the default second-opinion request; an example of the resulting prompt is sketched after this list.
    def _format_context_for_chatgpt(self, context: ContextEntry, question: str | None = None) -> str:
        """Format a context entry for ChatGPT consumption.

        Args:
            context: The context entry to format
            question: Optional specific question to append
        """
        parts = [
            f"# Context from Claude Code: {context.title}",
            f"\n**Type:** {context.type}",
            f"**Timestamp:** {context.timestamp.isoformat()}",
        ]

        if context.tags:
            parts.append(f"**Tags:** {', '.join(context.tags)}")

        parts.append("\n## Content\n")

        # Add specific content based on type
        if context.content.messages:
            parts.append("### Conversation\n")
            for msg in context.content.messages:
                parts.append(msg)

        if context.content.code:
            parts.append("### Code\n")
            for file_path, code in context.content.code.items():
                parts.append(f"**File:** `{file_path}`\n```\n{code}\n```\n")

        if context.content.suggestions:
            parts.append(f"### Suggestion\n{context.content.suggestions}\n")

        if context.content.errors:
            parts.append(f"### Error/Debug Info\n```\n{context.content.errors}\n```\n")

        # Add question or default request
        if question:
            parts.append(f"\n---\n**Question:** {question}")
        else:
            parts.append("\n---\nPlease provide a second opinion on the above context from Claude Code.")

        return "\n".join(parts)
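
To make the formatter's output concrete, here is a small self-contained sketch. The FakeContent and FakeContext dataclasses are hypothetical stand-ins inferred only from the attributes the formatter reads; they are not the project's real models, and the prompt shown in the trailing comment is approximate.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone


    # Hypothetical stand-ins mirroring only the attributes _format_context_for_chatgpt reads.
    @dataclass
    class FakeContent:
        messages: list[str] = field(default_factory=list)
        code: dict[str, str] = field(default_factory=dict)
        suggestions: str | None = None
        errors: str | None = None


    @dataclass
    class FakeContext:
        title: str
        type: str
        timestamp: datetime
        tags: list[str]
        content: FakeContent


    ctx = FakeContext(
        title="Retry logic for the payments worker",
        type="code",
        timestamp=datetime(2025, 1, 1, tzinfo=timezone.utc),
        tags=["payments", "reliability"],
        content=FakeContent(code={"worker.py": "def retry():\n    ..."}),
    )

    # Passed through the formatter with no question, the prompt would look roughly like:
    #
    #   # Context from Claude Code: Retry logic for the payments worker
    #
    #   **Type:** code
    #   **Timestamp:** 2025-01-01T00:00:00+00:00
    #   **Tags:** payments, reliability
    #
    #   ## Content
    #
    #   ### Code
    #
    #   **File:** `worker.py`
    #   ```
    #   def retry():
    #       ...
    #   ```
    #
    #   ---
    #   Please provide a second opinion on the above context from Claude Code.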
