DocuMCP

by YannickTM

Server Configuration

Describes the environment variables used to configure the server. All are optional; each falls back to the default shown.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| CHROMA_URL | No | The URL of the ChromaDB server | http://localhost:8000 |
| LANCE_PATH | No | The path to store LanceDB data | ~/lanceDB |
| OLLAMA_URL | No | The URL of the Ollama server | http://localhost:11434 |
| QDRANT_URL | No | The URL of the Qdrant server | http://localhost:6333 |
| EMBEDDING_MODEL | No | The embedding model to use | all-MiniLM-L6-v2 |
| EMBEDDING_PROVIDER | No | The embedding provider to use (buildin or ollama) | buildin |
| VECTOR_DB_PROVIDER | No | The vector database provider to use (lance, chroma, or qdrant) | lance |
| EMBEDDING_DIMENSION | No | The dimension of embeddings generated by the model | 384 |
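A minimal sketch of setting these variables before launching the server, assuming a POSIX shell. The values below are illustrative overrides (switching to Qdrant and Ollama); with no overrides, the defaults in the table apply.

```shell
# Illustrative configuration: Qdrant as the vector store, Ollama for embeddings.
# Only variables that differ from the defaults need to be exported.
export VECTOR_DB_PROVIDER=qdrant          # lance | chroma | qdrant
export QDRANT_URL=http://localhost:6333
export EMBEDDING_PROVIDER=ollama          # buildin | ollama
export OLLAMA_URL=http://localhost:11434
export EMBEDDING_MODEL=all-MiniLM-L6-v2
export EMBEDDING_DIMENSION=384            # must match the chosen model's output size
```

Note that EMBEDDING_DIMENSION should agree with the model: all-MiniLM-L6-v2 produces 384-dimensional vectors, so a different model may require a different value.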

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

No tools

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/YannickTM/docu-mcp'
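The endpoint path is composed of the directory's base URL plus the server's owner/name slug. A small sketch of building that URL, assuming the response is JSON (a tool like jq could pretty-print it):

```shell
# Compose the directory API URL for this server from its parts.
BASE='https://glama.ai/api/mcp/v1/servers'
SERVER='YannickTM/docu-mcp'   # owner/name slug of the server
URL="$BASE/$SERVER"

# To fetch and pretty-print the metadata (assumes jq is installed):
#   curl -s "$URL" | jq .
echo "$URL"
```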

If you have feedback or need assistance with the MCP directory API, please join our Discord server.