query_repository

Ask questions about GitHub repositories to get AI-powered insights about code, architecture, and tech stack after indexing.

Instructions

Ask questions about a GitHub repository and receive detailed AI responses. The repository must be indexed first.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| repo_url | Yes | The GitHub repository URL to query (format: https://github.com/username/repo). | — |
| question | Yes | The question to ask about the repository. | — |
| conversation_history | No | Previous conversation history for multi-turn conversations. | None |
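For illustration, a multi-turn call might pass arguments shaped like the following. This is a sketch only: the repository URL, question text, and history entries are made up, and the exact wire format depends on the MCP client in use.

```python
# Hypothetical arguments for a second-turn query_repository call.
# All values here are illustrative, not taken from a real session.
arguments = {
    "repo_url": "https://github.com/AsyncFuncAI/github-chat-mcp",
    "question": "How does the server format chat responses?",
    # Prior turns, oldest first, each a {"role", "content"} pair
    "conversation_history": [
        {"role": "user", "content": "What does this repository do?"},
        {"role": "assistant", "content": "It exposes GitHub Chat as MCP tools."},
    ],
}
```

Omitting `conversation_history` (or passing `None`) starts a fresh, single-turn conversation.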

Implementation Reference

  • The core handler for the 'query_repository' MCP tool. Registered via the @mcp.tool() decorator, it validates the inputs, sends the question (plus any prior conversation history) to the GitHub Chat API, and formats the response.
    @mcp.tool()
    def query_repository(
        repo_url: str = Field(
            description="The GitHub repository URL to query (format: https://github.com/username/repo)."
        ),
        question: str = Field(
            description="The question to ask about the repository."
        ),
        conversation_history: Optional[List[Dict[str, str]]] = Field(
            description="Previous conversation history for multi-turn conversations.", default=None
        ),
    ) -> str:
        """Ask questions about a GitHub repository and receive detailed AI responses. The repository must be indexed first."""
        try:
            if not repo_url or not question:
                raise ValueError("Repository URL and question cannot be empty.")
            
            if not repo_url.startswith("https://github.com/"):
                raise ValueError("Repository URL must be in the format: https://github.com/username/repo")
            
            # Copy any prior history so the caller's list is not mutated,
            # then append the new question as the latest user turn
            messages = list(conversation_history or [])
            messages.append({"role": "user", "content": question})
            
            # Call the chat completions API endpoint; the timeout keeps a
            # stalled request from hanging the tool indefinitely
            response = requests.post(
                f"{GITHUB_CHAT_API_BASE}/chat/completions/sync",
                headers={"Content-Type": "application/json"},
                json={
                    "repo_url": repo_url,
                    "messages": messages,
                },
                timeout=60,
            )
            
            if response.status_code != 200:
                return f"Error querying repository: {response.text}"
            
            # Format the response
            result = response.json()
            formatted_response = format_chat_response(result)
            
            return formatted_response
        
        except Exception as e:
            return f"Error: {str(e) or repr(e)}"
  • Supporting helper called by query_repository to format the API response: it emits the answer followed by a numbered list of source file paths.
    def format_chat_response(response: Dict[str, Any]) -> str:
        """Format the chat response in a readable way."""
        formatted = ""
        
        if "answer" in response:
            formatted += response["answer"] + "\n\n"
        
        if "contexts" in response and response["contexts"]:
            formatted += "Sources:\n"
            for i, context in enumerate(response["contexts"], 1):
                if "meta_data" in context and "file_path" in context["meta_data"]:
                    formatted += f"{i}. {context['meta_data']['file_path']}\n"
        
        return formatted.strip()


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/AsyncFuncAI/github-chat-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.