
MCP Document Indexer

by yairwein
.env.example (644 B)

# Example environment configuration for MCP Document Indexer

# Folders to monitor (comma-separated)
WATCH_FOLDERS="/Users/me/Documents,/Users/me/Research"

# LanceDB storage path
LANCEDB_PATH="./vector_index"

# Ollama model for summarization
LLM_MODEL="llama3.2:3b"

# Text chunking settings
CHUNK_SIZE=1000
CHUNK_OVERLAP=200

# Embedding model (sentence-transformers)
EMBEDDING_MODEL="all-MiniLM-L6-v2"

# File types to index (comma-separated)
FILE_EXTENSIONS=".pdf,.docx,.doc,.txt,.md,.rtf"

# Maximum file size in MB
MAX_FILE_SIZE_MB=100

# Ollama API URL
OLLAMA_BASE_URL="http://localhost:11434"

# Batch size for processing
BATCH_SIZE=10
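The settings above can be loaded in Python with `os.getenv`, using the same defaults shown in the file. This is a minimal sketch, not the indexer's actual code: the `load_config` helper and the dictionary keys are hypothetical; only the environment variable names and defaults come from `.env.example`.

```python
import os


def _split_csv(value: str) -> list[str]:
    """Parse a comma-separated setting like WATCH_FOLDERS into a list."""
    return [item.strip() for item in value.split(",") if item.strip()]


def load_config() -> dict:
    """Read the indexer settings from the environment, falling back to
    the defaults from .env.example. Helper and keys are illustrative."""
    return {
        "watch_folders": _split_csv(os.getenv("WATCH_FOLDERS", "")),
        "lancedb_path": os.getenv("LANCEDB_PATH", "./vector_index"),
        "llm_model": os.getenv("LLM_MODEL", "llama3.2:3b"),
        "chunk_size": int(os.getenv("CHUNK_SIZE", "1000")),
        "chunk_overlap": int(os.getenv("CHUNK_OVERLAP", "200")),
        "embedding_model": os.getenv("EMBEDDING_MODEL", "all-MiniLM-L6-v2"),
        "file_extensions": _split_csv(
            os.getenv("FILE_EXTENSIONS", ".pdf,.docx,.doc,.txt,.md,.rtf")
        ),
        "max_file_size_mb": int(os.getenv("MAX_FILE_SIZE_MB", "100")),
        "ollama_base_url": os.getenv("OLLAMA_BASE_URL", "http://localhost:11434"),
        "batch_size": int(os.getenv("BATCH_SIZE", "10")),
    }
```

Note that `CHUNK_OVERLAP` should stay well below `CHUNK_SIZE`, so adjacent chunks share context without duplicating most of their text.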

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/yairwein/document-mcp'
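The same request can be made from Python with the standard library. The URL is the one from the curl example; the response schema is not documented on this page, so this sketch just parses whatever JSON comes back, and `fetch_server_info` is a hypothetical helper name.

```python
import json
import urllib.request

# Endpoint from the curl example above.
URL = "https://glama.ai/api/mcp/v1/servers/yairwein/document-mcp"


def fetch_server_info(url: str = URL) -> dict:
    """GET the server's directory entry and parse it as JSON.
    (Hypothetical helper; the response fields are not documented here.)"""
    req = urllib.request.Request(url, headers={"Accept": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(json.dumps(fetch_server_info(), indent=2))
```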

If you have feedback or need assistance with the MCP directory API, please join our Discord server.