Why this server?
Provides intelligent summarization capabilities through a clean, extensible architecture. Designed for large repositories where individual files can consume much of the context window.
Why this server?
Provides tools for collecting and documenting code from directories.
Why this server?
A knowledge management system that lets you build a persistent semantic graph from conversations with AI assistants. All knowledge is stored in standard Markdown files on your computer, giving you full control and ownership of your data. Integrates directly with Obsidian.md.
Why this server?
This is a connector to allow Claude Desktop (or any MCP client) to read and search any directory containing Markdown notes (such as an Obsidian vault).
Why this server?
A server implementation that allows AI assistants to read, create, and manipulate notes in Obsidian vaults through the Model Context Protocol.
Why this server?
Enables AI assistants to interact with Obsidian vaults, providing tools for reading, creating, editing and managing notes and tags.
Why this server?
Leverages Vim's native text editing commands and workflows, which Claude already understands, to create a lightweight code assistance layer.
Why this server?
The server proxies requests from the client to a JetBrains IDE.
Why this server?
Serves as a guardian of development knowledge, providing AI assistants with curated access to the latest documentation and best practices.
Why this server?
A Model Context Protocol server that connects GitHub code to Claude.ai. It uses the Pera1 service to extract code from GitHub repositories and provide Claude with better context.
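Every server above is wired into a client the same way: the MCP client launches the server and communicates with it over the Model Context Protocol. As a minimal sketch, a Claude Desktop entry in claude_desktop_config.json might look like the following, where the entry name "example-server" and the package name "example-mcp-server" are placeholders rather than the actual package names of the servers listed here:

    {
      "mcpServers": {
        "example-server": {
          "command": "npx",
          "args": ["-y", "example-mcp-server"]
        }
      }
    }

The exact command, arguments, and any required environment variables differ from server to server, so check each project's README for its specific configuration.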