MCP Generix — Shared Documentation with Semantic Search

Custom MCP server that provides semantic search over documents in the docs/ folder. Uses ChromaDB for vector storage and OpenAI embeddings.
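At query time the flow is essentially: embed the question with the OpenAI embeddings API, then ask the local ChromaDB collection for the nearest stored chunks. The sketch below only illustrates that idea; the collection name, embedding model, and function name are assumptions rather than code taken from server.py.

    # Illustrative query path; names and model choice are assumptions, not server.py's code.
    import chromadb
    from openai import OpenAI

    oai = OpenAI()  # reads OPENAI_API_KEY from the environment
    chroma = chromadb.PersistentClient(path=".chroma")
    collection = chroma.get_or_create_collection("docs")

    def semantic_search(query: str, n_results: int = 5) -> list[str]:
        # Embed the query text, then retrieve the closest document chunks from the vector store.
        embedding = oai.embeddings.create(
            model="text-embedding-3-small",  # assumed embedding model
            input=[query],
        ).data[0].embedding
        hits = collection.query(query_embeddings=[embedding], n_results=n_results)
        return hits["documents"][0]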

Setup

  1. Clone this repo

  2. Create a virtual environment and install dependencies:

    cd mcp_generix
    python3 -m venv .venv
    source .venv/bin/activate
    pip install "mcp[cli]" chromadb openai
  3. Set your OpenAI API key:

    export OPENAI_API_KEY="your-key-here"
  4. Add the MCP server to Claude Code:

    claude mcp add generix-docs -- /path/to/mcp_generix/.venv/bin/python /path/to/mcp_generix/server.py
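If the command succeeds, the server should appear when you list configured servers (subcommand availability may vary with your Claude Code version):

    claude mcp list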

Adding / Removing Documents

  1. Add markdown (.md) or text files to the docs/ folder

  2. Commit and push

  3. Other team members pull to get the latest documents

  4. The server re-indexes documents automatically on startup; to pick up changes without restarting, run the reindex_docs tool
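Conceptually, re-indexing just walks docs/, embeds each markdown or text file, and upserts the vectors into the local ChromaDB store. Below is a minimal sketch of that loop, assuming one embedding per file and illustrative names; the real server.py may chunk files and differ in detail.

    # Illustrative re-index loop; chunking is omitted and all names are assumptions.
    from pathlib import Path

    import chromadb
    from openai import OpenAI

    oai = OpenAI()
    collection = chromadb.PersistentClient(path=".chroma").get_or_create_collection("docs")

    def reindex(docs_dir: str = "docs") -> int:
        paths = [p for p in Path(docs_dir).iterdir() if p.suffix in {".md", ".txt"}]
        for path in paths:
            text = path.read_text(encoding="utf-8")
            embedding = oai.embeddings.create(
                model="text-embedding-3-small",  # assumed embedding model
                input=[text],
            ).data[0].embedding
            # Use the filename as a stable ID so re-runs overwrite instead of duplicating entries.
            collection.upsert(ids=[path.name], documents=[text], embeddings=[embedding])
        return len(paths)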

Available Tools

Tool           Description
search_docs    Semantic search: find relevant passages by meaning, not just keywords
list_docs      List all documents in the docs/ folder
read_doc       Read the full contents of a specific document
reindex_docs   Re-index documents after adding/removing files
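For reference, tools like these are usually exposed through the mcp Python SDK's FastMCP decorators. The snippet below shows only the general shape of two of the simpler tools; the docstrings and signatures are assumptions, not copied from this project's server.py.

    # Illustrative tool registration with the mcp SDK; not the project's actual implementation.
    from pathlib import Path

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("generix-docs")

    @mcp.tool()
    def list_docs() -> list[str]:
        """List all documents in the docs folder."""
        return sorted(p.name for p in Path("docs").iterdir() if p.suffix in {".md", ".txt"})

    @mcp.tool()
    def read_doc(filename: str) -> str:
        """Read the full contents of a specific document."""
        return (Path("docs") / filename).read_text(encoding="utf-8")

    if __name__ == "__main__":
        mcp.run()  # defaults to stdio transport, which is what Claude Code expects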

Folder Structure

mcp_generix/
├── server.py          ← MCP server with semantic search
├── pyproject.toml     ← Python dependencies
├── docs/              ← Shared documentation (managed via git)
│   └── (your documents here)
├── .chroma/           ← ChromaDB vector store (gitignored, local)
└── .venv/             ← Python virtual environment (gitignored, local)
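The .chroma/ and .venv/ directories hold machine-local state and should stay out of version control. If they are not already covered by the repo's .gitignore, entries like these would do it:

    .chroma/
    .venv/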