Why this server?
This server provides basic file system operations like navigation, reading, writing, and file analysis.
Why this server?
Enables LLMs to read, search, and analyze code files with advanced caching and real-time file watching.
Why this server?
Enables AI models to perform file system operations (reading, creating, and listing files) on a local file system.
Why this server?
A custom server that gives LLMs access to file system operations and command execution capabilities through standardized tool interfaces.
Why this server?
Provides secure, read-only access and file search capabilities within a specified directory, while respecting .gitignore patterns.
Why this server?
Allows AI assistants to browse and read files from specified GitHub repositories, giving them direct access to repository contents.
Why this server?
Allows AI coding agents to directly access and interact with Figma files and prototypes.
Why this server?
Browse, list, and edit the filesystem. Implemented in Java/Quarkus, with a fast native-image build available.
Why this server?
A line-oriented text file editor. Optimized for LLM tools with efficient partial file access to minimize token usage.
Why this server?
An MCP server that enables enhanced file system operations, including reading, writing, copying, and moving files with streaming support, plus directory management, file watching, and change tracking.
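The common thread across these servers is the Model Context Protocol's standardized tool interface: a client lists the tools a server exposes, then invokes them by name with JSON arguments. The sketch below illustrates that flow, assuming the MCP Python SDK (the "mcp" package) and the reference @modelcontextprotocol/server-filesystem npm package; the read_file tool name and path argument belong to that reference server, and the servers listed above may expose differently named tools, so always check list_tools() first.

```python
# Minimal sketch: call a filesystem MCP server over stdio.
# Assumes the MCP Python SDK and the reference filesystem server;
# tool names and arguments vary between servers.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server as a subprocess, restricted to one directory.
    params = StdioServerParameters(
        command="npx",
        args=["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover which tools this particular server exposes.
            tools = await session.list_tools()
            print("tools:", [t.name for t in tools.tools])

            # Invoke a tool by name with JSON arguments
            # (hypothetical example file path).
            result = await session.call_tool(
                "read_file", arguments={"path": "/tmp/example.txt"}
            )
            for block in result.content:
                print(getattr(block, "text", block))


if __name__ == "__main__":
    asyncio.run(main())
```

The same list_tools/call_tool pattern applies whether a server offers read-only search, GitHub browsing, or full read-write streaming; only the advertised tool names and argument schemas differ.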