- Allows AI assistants to browse and read files from specified GitHub repositories, providing access to repository contents via the Model Context Protocol.
- Facilitates authentication with GitHub using the OAuth protocol, allowing secure access to and interaction with GitHub repositories and services.
- A Model Context Protocol server that enables Large Language Models to interact with Git repositories through a robust API, supporting operations like repository initialization, cloning, file staging, committing, and branch management.
- Enables interaction with GitHub through the GitHub API, supporting file operations, repository management, advanced search, and issue tracking with comprehensive error handling and automatic branch creation.
- A modular command processor server that enables interaction with GitHub's REST API to fetch user details, repository information, and authenticated user data through natural language commands in Claude.
- Provides GitHub data to Claude through the Model Context Protocol for repository management tasks.
- An integration tool that enables AI assistants like Claude to directly access and interact with Bitbucket repositories, pull requests, and code without requiring copy/paste operations.
- A Model Context Protocol server that provides Claude and other LLMs with read-only access to Hugging Face Hub APIs, enabling interaction with models, datasets, spaces, papers, and collections through natural language.
- Enables Cursor and Windsurf to safely interact with Supabase databases by providing tools for database management, SQL query execution, and Supabase Management API access with built-in safety controls.
- Enables search, exploration, and analysis of all QAnon posts for sociological study.
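
Whatever backend a server in this list wraps, a client talks to it through the same Model Context Protocol handshake. The sketch below is a minimal illustration, assuming the official TypeScript SDK (`@modelcontextprotocol/sdk`), the reference GitHub server package (`@modelcontextprotocol/server-github`), and its `GITHUB_PERSONAL_ACCESS_TOKEN` environment variable; swap in whichever server and credentials you actually use.

```typescript
// Minimal MCP client sketch: spawn a server over stdio and list the tools it exposes.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server as a child process; credentials go in via environment variables.
  // (Package name and env var are assumptions based on the reference GitHub server.)
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-github"],
    env: { GITHUB_PERSONAL_ACCESS_TOKEN: process.env.GITHUB_PERSONAL_ACCESS_TOKEN ?? "" },
  });

  const client = new Client(
    { name: "example-client", version: "0.1.0" }, // client identity reported to the server
    { capabilities: {} }                          // no optional client capabilities needed here
  );

  await client.connect(transport);

  // Every MCP server advertises its tools the same way, regardless of whether it
  // fronts GitHub, Bitbucket, Hugging Face, Supabase, or another data source.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);
```

The same pattern applies to any of the servers above: only the launch command, arguments, and environment variables change, while tool discovery and invocation stay identical on the client side.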