query_repository

Query a GitHub repository and receive AI-generated responses about its code, architecture, and tech stack by providing the repository URL and specific questions.

Instructions

Ask questions about a GitHub repository and receive detailed AI responses. The repository must be indexed first.

Input Schema

  • conversation_history (optional): Previous conversation history for multi-turn conversations. Default: none.
  • question (required): The question to ask about the repository.
  • repo_url (required): The GitHub repository URL to query (format: https://github.com/username/repo).
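
For example, a client invoking this tool might pass arguments like the following Python dict. This is a minimal sketch: the repository URL and questions are illustrative, and the role/content message shape for conversation_history is inferred from the handler code shown under Implementation Reference below.

    # Illustrative arguments for a query_repository call; values are examples only.
    arguments = {
        "repo_url": "https://github.com/AsyncFuncAI/github-chat-mcp",
        "question": "How does the server format answers returned by the GitHub Chat API?",
        # Optional: prior turns, using the role/content message shape the handler appends to.
        "conversation_history": [
            {"role": "user", "content": "What tools does this server expose?"},
            {"role": "assistant", "content": "It exposes a query_repository tool."},
        ],
    }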

Implementation Reference

  • The main handler function for the 'query_repository' tool. It defines the input schema using Pydantic Fields, validates the repository URL and question, prepares the conversation messages, calls the GitHub Chat API, handles the response, formats it using a helper function, and returns the formatted answer with sources.
    @mcp.tool()
    def query_repository(
        repo_url: str = Field(
            description="The GitHub repository URL to query (format: https://github.com/username/repo)."
        ),
        question: str = Field(
            description="The question to ask about the repository."
        ),
        conversation_history: Optional[List[Dict[str, str]]] = Field(
            description="Previous conversation history for multi-turn conversations.",
            default=None
        ),
    ) -> str:
        """Ask questions about a GitHub repository and receive detailed AI responses. The repository must be indexed first."""
        try:
            if not repo_url or not question:
                raise ValueError("Repository URL and question cannot be empty.")
            if not repo_url.startswith("https://github.com/"):
                raise ValueError("Repository URL must be in the format: https://github.com/username/repo")

            # Prepare messages array
            messages = conversation_history or []
            messages.append({"role": "user", "content": question})

            # Call the chat completions API endpoint
            response = requests.post(
                f"{GITHUB_CHAT_API_BASE}/chat/completions/sync",
                headers={"Content-Type": "application/json"},
                json={
                    "repo_url": repo_url,
                    "messages": messages
                }
            )

            if response.status_code != 200:
                return f"Error querying repository: {response.text}"

            # Format the response
            result = response.json()
            formatted_response = format_chat_response(result)

            return formatted_response
        except Exception as e:
            return f"Error: {str(e) or repr(e)}"
  • Helper function used by the query_repository handler to format the API response, appending the answer and listing sources with file paths drawn from the contexts. (A sample response and the formatted output it produces are sketched after this list.)
    def format_chat_response(response: Dict[str, Any]) -> str:
        """Format the chat response in a readable way."""
        formatted = ""

        if "answer" in response:
            formatted += response["answer"] + "\n\n"

        if "contexts" in response and response["contexts"]:
            formatted += "Sources:\n"
            for i, context in enumerate(response["contexts"], 1):
                if "meta_data" in context and "file_path" in context["meta_data"]:
                    formatted += f"{i}. {context['meta_data']['file_path']}\n"

        return formatted.strip()
  • The @mcp.tool() decorator registers the query_repository function as an MCP tool.
    @mcp.tool()
  • Pydantic Field definitions providing the input schema for the query_repository tool, including descriptions for repo_url, question, and conversation_history.
    repo_url: str = Field(
        description="The GitHub repository URL to query (format: https://github.com/username/repo)."
    ),
    question: str = Field(
        description="The question to ask about the repository."
    ),
    conversation_history: Optional[List[Dict[str, str]]] = Field(
        description="Previous conversation history for multi-turn conversations.",
        default=None
    ),
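
As a rough illustration of the pieces above, the sketch below feeds format_chat_response a response dict shaped like the fields it reads (answer, contexts, meta_data.file_path). The answer text and file paths are placeholders, not actual GitHub Chat API output.

    # Hypothetical response dict; field names match what format_chat_response reads,
    # but the values are placeholders rather than real API output.
    sample_response = {
        "answer": "The tool posts the question to the GitHub Chat API and returns the answer with sources.",
        "contexts": [
            {"meta_data": {"file_path": "src/server.py"}},  # placeholder path
            {"meta_data": {"file_path": "README.md"}},      # placeholder path
        ],
    }

    print(format_chat_response(sample_response))
    # Prints the answer, a blank line, then:
    # Sources:
    # 1. src/server.py
    # 2. README.md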

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/AsyncFuncAI/github-chat-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.