Why this server?
Allows users to access repository information and manage issues, pull requests, workflows, and other GitHub features through Cursor.
Why this server?
This server facilitates the invocation of Azure DevOps services.
Why this server?
An integration tool that enables AI assistants like Claude to directly access and interact with Bitbucket repositories, pull requests, and code without requiring copy/paste operations.
Why this server?
A Model Context Protocol server for Git repository interaction and automation. This server provides tools to read, search, and manipulate Git repositories via Large Language Models.
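To illustrate how a client talks to a server like this one, here is a minimal sketch using the official MCP Python SDK: it launches the server over stdio and lists the Git tools it exposes. The `uvx mcp-server-git` launch command and the repository path are assumptions about a typical setup; substitute whatever the server's own documentation specifies.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed launch command and repository path; adjust to match how the
# Git MCP server is actually installed on your machine.
server_params = StdioServerParameters(
    command="uvx",
    args=["mcp-server-git", "--repository", "/path/to/your/repo"],
)

async def main() -> None:
    # Spawn the server as a subprocess and open an MCP client session over stdio.
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Enumerate the tools the server advertises (read, search, commit, ...).
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

asyncio.run(main())
```

The same connection pattern applies to any stdio-based server in this list; only the launch command and the advertised tools change.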
Why this server?
A Model Context Protocol server that provides Claude and other LLMs with read-only access to Hugging Face Hub APIs, enabling interaction with models, datasets, spaces, papers, and collections through natural language.
Why this server?
Facilitates authentication with GitHub using the OAuth protocol, allowing secure access to and interaction with GitHub repositories and services.
Why this server?
Bridges Large Language Models with Language Server Protocol interfaces, allowing LLMs to access LSP's hover information, completions, diagnostics, and code actions for improved code suggestions.
Why this server?
A stdio MCP server wrapping the custom Python runtime (LocalPythonExecutor) from Hugging Face's `smolagents` framework.
Why this server?
An MCP server that wraps the Azure CLI, allowing your LLM to list, create, update, and delete resources, fix errors, remediate security issues, and more.
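From the model's side, using a wrapper like this comes down to a tool call. The sketch below shows the shape of that call with the MCP Python SDK; the launch command (`azure-cli-mcp`), the tool name (`run_azure_cli_command`), and its argument schema are hypothetical placeholders for illustration, not the server's documented interface.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hypothetical launch command; check the server's README for the real one.
server_params = StdioServerParameters(command="azure-cli-mcp", args=[])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Hypothetical tool name and argument schema, shown only to
            # illustrate how an LLM-driven client would invoke the wrapper.
            result = await session.call_tool(
                "run_azure_cli_command",
                {"command": "az group list --output table"},
            )
            for block in result.content:
                if block.type == "text":
                    print(block.text)

asyncio.run(main())
```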
Why this server?
Connects AI models to the Terraform Registry via MCP, enabling provider lookups, resource usage examples, and module recommendations for streamlined Terraform workflows.