Why this server?
Helps large language models process code repositories by providing file tree generation, code merging, and code analysis capabilities.
Why this server?
Enables language models to access code intelligence features like completions, definitions, and references across multiple programming languages through the Language Server Protocol.
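As a rough illustration, an MCP client requests code intelligence from a server like this through a JSON-RPC "tools/call" message; the tool name ("get_definition") and argument names in the sketch below are hypothetical placeholders, not this server's actual schema.

```python
import json

# Hypothetical MCP "tools/call" request asking a code-intelligence server for
# the definition of the symbol at a cursor position. The tool name
# ("get_definition") and argument names are illustrative only; a client would
# discover the real schema via a "tools/list" request.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_definition",
        "arguments": {
            "file": "src/parser.py",
            "line": 42,
            "column": 17,
        },
    },
}

print(json.dumps(request, indent=2))
```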
Why this server?
A natural language interface for MLflow that allows users to query and manage their machine learning experiments and models using plain English.
Why this server?
Provides pre-defined prompt templates for AI assistants to generate comprehensive plans for TypeScript projects, API architectures, and GitHub workflows.
Why this server?
Enables LLMs to efficiently access and fetch structured documentation for packages in Go, Python, and NPM, enhancing software development with multi-language support and performance optimization.

Why this server?
ATLAS (Adaptive Task & Logic Automation System) provides hierarchical task management capabilities to Large Language Models for complex tasks with dependencies.
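As a sketch, a hierarchical task with dependencies might be created through arguments shaped like the ones below; the "create_task" tool and its field names are assumptions made for illustration, not ATLAS's documented interface.

```python
# Hypothetical arguments for a "create_task" tool on a task-management server
# such as ATLAS; field names ("parent_id", "depends_on") are assumptions and
# may not match the server's real schema.
create_task_args = {
    "title": "Implement token refresh",
    "parent_id": "task-auth-rework",          # hierarchical: child of a parent task
    "depends_on": ["task-add-oauth-client"],  # blocked until this dependency completes
    "priority": "high",
}
```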
Why this server?
Enables browsing Git repositories through the Model Context Protocol, providing features like displaying directory structures, reading files, searching code, comparing branches, and viewing commit history.
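A sketch of how two of those operations could look as MCP tool calls, assuming hypothetical tool names ("read_file", "compare_branches") and argument shapes rather than this server's actual tool list:

```python
import json

def tool_call(request_id, name, arguments):
    """Build an MCP JSON-RPC "tools/call" request for a named tool."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Hypothetical tool names and arguments; the server's real schema is
# discoverable through a "tools/list" request.
read_readme = tool_call(1, "read_file",
                        {"repo": "/srv/repos/app.git", "path": "README.md"})
diff_branches = tool_call(2, "compare_branches",
                          {"repo": "/srv/repos/app.git",
                           "base": "main", "head": "feature/login"})

print(json.dumps([read_readme, diff_branches], indent=2))
```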
Why this server?
Serves as a guardian of development knowledge, providing AI assistants with curated access to the latest documentation and best practices.
Why this server?
Allows Claude to install other MCP servers from npm or PyPI, enabling easy expansion of Claude's capabilities with external tools.
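For context, MCP clients such as Claude Desktop register servers in a JSON config file; the sketch below shows the kind of "mcpServers" entry an installer would typically add, using the real @modelcontextprotocol/server-filesystem npm package as the example (which file the installer edits depends on the client).

```python
import json

# Sketch of the kind of entry an installer adds to an MCP client's config
# (Claude Desktop uses this "mcpServers" shape in claude_desktop_config.json).
# The package is a real npm package used here only as an example; the allowed
# directory path is a placeholder.
config_entry = {
    "mcpServers": {
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/projects"],
        }
    }
}

print(json.dumps(config_entry, indent=2))
```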
Why this server?
Leverages Vim's native text editing commands and workflows, which Claude already understands, to create a lightweight code assistance layer.