- Enables indexing and searching of public Git repositories to build snippet libraries for LLM ingestion
- Provides access to repositories on GitHub for code indexing and snippet generation
- Enables secure, private LLM processing through Helix.ML's platform for enterprise-grade code assistance
- Supports integration with OpenAI's models for semantic search and code assistance capabilities
- Uses SQLite as a database backend for storing and retrieving code snippets and indices
Helix Kodit is an MCP server that connects your AI coding assistant to external codebases. It can:
- Improve your AI-assisted code by providing canonical examples direct from the source
- Index local and public codebases
- Integrate with any AI coding assistant via MCP
- Search using keyword and semantic search
- Integrate with any OpenAI-compatible or custom API/model
If you're an engineer working with AI-powered coding assistants, Kodit helps by providing relevant, up-to-date examples for your task so that LLMs make fewer mistakes and produce fewer hallucinations.
✨ Features
Codebase Indexing
Kodit connects to a variety of local and remote codebases to build an index of your code. This index is used to build a snippet library, ready for ingestion into an LLM.
- Index local directories and public Git repositories
- Build comprehensive snippet libraries for LLM ingestion
- Support for multiple codebase types and languages
- Efficient indexing and search capabilities
- Privacy first: respects .gitignore and .noindex files (see the sketch below)
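To make the privacy-first point concrete, here is a minimal, illustrative sketch of .gitignore/.noindex-aware file selection. It is not Kodit's actual implementation; it assumes the third-party `pathspec` package and only reads ignore files at the repository root:

```python
# Illustrative sketch only -- not Kodit's implementation.
# Requires: pip install pathspec
from pathlib import Path

import pathspec


def load_ignore_spec(root: Path) -> pathspec.PathSpec:
    """Combine root-level .gitignore and .noindex patterns into one matcher."""
    patterns: list[str] = []
    for name in (".gitignore", ".noindex"):
        ignore_file = root / name
        if ignore_file.exists():
            patterns += ignore_file.read_text().splitlines()
    return pathspec.PathSpec.from_lines("gitwildmatch", patterns)


def indexable_files(root: Path) -> list[Path]:
    """Return files under `root` that are not excluded by the ignore rules."""
    spec = load_ignore_spec(root)
    return [
        path
        for path in root.rglob("*")
        if path.is_file() and not spec.match_file(str(path.relative_to(root)))
    ]


if __name__ == "__main__":
    for path in indexable_files(Path(".")):
        print(path)
```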
MCP Server
Relevant snippets are exposed to your AI coding assistant via an MCP server, which lets the assistant request snippets by providing keywords, code, and semantic intent (a client sketch follows the list below).
- Seamless integration with popular AI coding assistants
- Tested and verified with a range of popular assistants; any other MCP-capable assistant is likely to work
- Please contribute setup instructions for more assistants!
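To see what an assistant does over MCP, here is a hedged client sketch using the official `mcp` Python SDK. The endpoint URL, the `search` tool name, and the argument shape are assumptions made for illustration; check Kodit's documentation for the real interface:

```python
# Hedged sketch: querying an MCP server for snippets with the official
# `mcp` Python SDK. The URL, tool name, and arguments are assumptions,
# not Kodit's documented interface.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

KODIT_URL = "http://localhost:8080/sse"  # hypothetical endpoint


async def main() -> None:
    async with sse_client(KODIT_URL) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server actually exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Ask for snippets by describing the task; "search" and its
            # arguments are placeholders for whatever Kodit exposes.
            result = await session.call_tool(
                "search",
                arguments={"user_intent": "parse a TOML config file in Python"},
            )
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```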
Enterprise Ready
Out of the box, Kodit works with a local SQLite database and very small, local models. But enterprises can scale out with performant databases and dedicated models. Everything can even run securely and privately with on-premise LLM platforms like Helix.
Supported databases:
- SQLite
- VectorChord
Supported providers:
- Local (which uses tiny CPU-only open-source models)
- OpenAI
- Secure, private LLM enclave with Helix.
- Any other OpenAI-compatible API (see the sketch below)
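Because providers only need to speak the OpenAI API, switching to a self-hosted or enclave endpoint is mostly a matter of changing the base URL. Here is a hedged sketch using the `openai` Python client; the endpoint, key, and model name are placeholders, not Kodit settings:

```python
# Hedged sketch of what "any OpenAI-compatible API" means in practice:
# the same client works against OpenAI, a local server, or an enclave
# like Helix by swapping base_url. Endpoint, key, and model name below
# are placeholders, not Kodit configuration.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical self-hosted endpoint
    api_key="not-needed-for-local",       # placeholder credential
)

# Embeddings back the semantic side of hybrid keyword + semantic search.
embedding = client.embeddings.create(
    model="nomic-embed-text",  # any embedding model the endpoint serves
    input="def read_config(path): ...",
)
print(len(embedding.data[0].embedding))
```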
🚀 Quick Start
Documentation
Roadmap
The roadmap is currently maintained as a GitHub Project.
💬 Support
For commercial support, please contact Helix.ML. To ask a question, please open a discussion.
License