Istedlal MCP Server
The Istedlal MCP Server enables AI agents to manage file metadata, perform filtered searches, and run semantic searches over file embeddings.
- Get File Metadata (get_file_metadata): Retrieve detailed metadata for a specific file by its ID, including filename, MIME type, file size, processing status, and storage information, scoped to a specific tenant and project.
- Search Files by Metadata Filters (search_files): Query files using metadata-based filters such as filename, MIME type, processing status, and upload date range, with pagination support, scoped to a specific tenant and project.
- Semantic Search Over Files (semantic_search_files): Perform natural-language semantic search across file embeddings to find relevant content, with options to limit results (top_k), filter by specific file IDs, set a similarity score threshold, and scope the search to a specific tenant and project.
All operations support multi-tenant isolation via tenant and project identifiers. The server supports Docker/Kubernetes deployment, PostgreSQL + pgvector for vector storage, Ollama for embedding generation, Bearer token authentication, and multiple transport options (HTTP or stdio for IDE/Cursor integration).
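Over the HTTP transport, a call to one of these tools is a standard MCP `tools/call` JSON-RPC message sent to `/mcp` with the Bearer token. The sketch below builds such a request for `semantic_search_files`; the argument names (`query`, `top_k`, `score_threshold`, `tenant_id`, `project_id`) are inferred from the tool descriptions above, not taken from the server's actual schema.

```python
import json

# Hypothetical JSON-RPC 2.0 "tools/call" request for semantic_search_files.
# Argument names are assumptions based on the tool description, not the
# server's published schema.
def build_semantic_search_request(query, tenant_id, project_id,
                                  top_k=5, score_threshold=0.7, request_id=1):
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "semantic_search_files",
            "arguments": {
                "query": query,
                "top_k": top_k,
                "score_threshold": score_threshold,
                "tenant_id": tenant_id,    # multi-tenant isolation
                "project_id": project_id,  # project scoping
            },
        },
    }

# Bearer token auth headers for the /mcp endpoint.
def auth_headers(token):
    return {"Authorization": f"Bearer {token}",
            "Content-Type": "application/json"}

payload = build_semantic_search_request("project budget", "acme", "q3-planning")
body = json.dumps(payload)
```

An HTTP client would POST `body` to `http://your-server:8000/mcp` with `auth_headers(MCP_BEARER_TOKEN)`.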
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Istedlal MCP Server search for files related to project budget using semantic search".
That's it! The server will respond to your query, and you can continue using it as needed.
Istedlal MCP Server
MCP Server for Istedlal AI Agents - file metadata, vector search, workflow metrics access.
Requirements
Python 3.10+
See requirements.txt for dependencies.
Setup
# Create virtual environment
python -m venv venv
venv\Scripts\activate       # Windows
# source venv/bin/activate  # Linux/macOS
# Install dependencies
pip install -r requirements.txt
# Create .env with required variables (see docs/ENV_SETUP.md)

Run
Terminal testing (use streamable-http to avoid "Invalid JSON: EOF" errors):
# .env: MCP_TRANSPORT=streamable-http
python -m src.main
# Server at http://localhost:8000/mcp

Cursor/IDE integration (stdio - Cursor spawns the process, don't run manually):
# .env: MCP_TRANSPORT=stdio
# Add server to Cursor MCP settings; Cursor will start it automatically

Tools
- get_file_metadata - Fetch metadata for a file by ID (real DB when VECTOR_PROVIDER=pgvector)
- search_files - Search files by metadata filters (real DB when pgvector)
- semantic_search_files - Semantic search over file embeddings (Ollama + pgvector)
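To make the search_files filter semantics concrete, here is an illustrative, self-contained sketch of filename/MIME/status/date-range matching with pagination. The field names and matching rules (case-insensitive substring match on filename, exact match elsewhere) are assumptions for illustration, not the server's actual schema or behavior.

```python
from datetime import date

# Illustrative filter + pagination logic mirroring what search_files describes.
# Field names and matching rules are assumptions, not the server's schema.
def filter_files(files, filename=None, mime_type=None, status=None,
                 uploaded_after=None, uploaded_before=None,
                 page=1, page_size=10):
    def matches(f):
        if filename and filename.lower() not in f["filename"].lower():
            return False
        if mime_type and f["mime_type"] != mime_type:
            return False
        if status and f["status"] != status:
            return False
        if uploaded_after and f["uploaded_at"] < uploaded_after:
            return False
        if uploaded_before and f["uploaded_at"] > uploaded_before:
            return False
        return True

    hits = [f for f in files if matches(f)]
    start = (page - 1) * page_size
    return {"total": len(hits), "page": page,
            "results": hits[start:start + page_size]}

files = [
    {"filename": "budget_2024.xlsx", "mime_type": "application/vnd.ms-excel",
     "status": "processed", "uploaded_at": date(2024, 3, 1)},
    {"filename": "notes.txt", "mime_type": "text/plain",
     "status": "pending", "uploaded_at": date(2024, 5, 9)},
]
out = filter_files(files, filename="budget", status="processed")
```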
Testing with MCP Inspector
See docs/MCP_INSPECTOR_GUIDE.md for the complete step-by-step guide.
npx -y @modelcontextprotocol/inspector

Production
Production Checklist
| Item | Required | Notes |
| --- | --- | --- |
| Dockerfile | Yes | Build container image |
| .dockerignore | Yes | Exclude venv, .env, __pycache__ |
| Production .env | Yes | Set on server (never commit) |
| Port 8000 | Yes | Expose for MCP endpoint |
| PostgreSQL + pgvector | Phase 2 | document_metadata, document_embeddings (see data/vectordb_schema_documentation.pdf) |
| Ollama | Phase 2 | For semantic search query embeddings |
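To show how the document_embeddings table might serve semantic search, here is a hedged sketch of the kind of pgvector query involved. The column names (embedding, file_id, tenant_id, project_id) are assumptions; `<=>` is pgvector's cosine distance operator, so similarity is computed as 1 - distance.

```python
# Builds an illustrative pgvector similarity query. Table name comes from the
# checklist above; column names are assumptions for illustration only.
def build_similarity_query(top_k=5, score_threshold=0.7):
    return f"""
        SELECT file_id, 1 - (embedding <=> %(query_vec)s::vector) AS score
        FROM document_embeddings
        WHERE tenant_id = %(tenant_id)s
          AND project_id = %(project_id)s
          AND 1 - (embedding <=> %(query_vec)s::vector) >= {score_threshold}
        ORDER BY embedding <=> %(query_vec)s::vector
        LIMIT {top_k}
    """
```

The `%(...)s` placeholders are psycopg-style named parameters; the query vector would come from an Ollama embedding of the user's search text.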
What to Exclude from Deployment
- .cursor/ – Cursor IDE config only, not needed on server
- venv/ – Create fresh on server or use Docker
- .env – Contains secrets; set separately on server
- __pycache__/ – Python cache, auto-generated
- data/ – Reference docs only, not runtime
Production Environment Variables
MCP_TRANSPORT=streamable-http
HTTP_HOST=0.0.0.0
HTTP_PORT=8000
DATABASE_URL=postgresql://user:password@db-host:5432/dbname
VECTOR_PROVIDER=pgvector # mock | pgvector | chromadb
OLLAMA_URL=https://your-ollama:11433
OLLAMA_EMBEDDING_MODEL=llama3.2
OLLAMA_USERNAME= # if Basic Auth required
OLLAMA_PASSWORD=
LOG_LEVEL=INFO
MCP_BEARER_TOKEN=your-secret-token # Required – Bearer token auth for /mcp

Dockerfile (Create if Deploying via Docker)
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY src/ ./src/
ENV MCP_TRANSPORT=streamable-http
ENV PYTHONUNBUFFERED=1
EXPOSE 8000
CMD ["python", "-m", "src.main"]

.dockerignore (Create to Exclude from Build)
venv/
.env
.git/
.cursor/
__pycache__/
*.pyc
data/
docs/
scripts/
tests/
infra/

Deployment Steps
Build:
docker build -t istedlal-mcp .

Run:
docker run -p 8000:8000 -e DATABASE_URL=... -e MCP_BEARER_TOKEN=your-secret istedlal-mcp

Verify:
curl http://localhost:8000/ (info page)

MCP Endpoint:
http://your-server:8000/mcp
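Since MCP_BEARER_TOKEN is required, each request to /mcp must carry a matching Authorization header. The sketch below shows one standard way such a check could work (parse the "Bearer " prefix, then compare in constant time); it is an assumption about the mechanism, not the server's actual code.

```python
import hmac
import os

# Minimal sketch of Bearer auth for the /mcp endpoint. hmac.compare_digest
# avoids timing side channels when comparing secrets.
def is_authorized(authorization_header, expected_token):
    if not authorization_header or not authorization_header.startswith("Bearer "):
        return False
    presented = authorization_header[len("Bearer "):]
    return hmac.compare_digest(presented, expected_token)

# In production the expected token comes from the environment.
token = os.environ.get("MCP_BEARER_TOKEN", "your-secret-token")
```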
Kubernetes (Optional)
- Use Deployment + Service manifests in infra/k8s/
- Expose Service (ClusterIP/NodePort/LoadBalancer)
- Set DATABASE_URL via Secret
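As one way to wire the Secret in, here is a hypothetical manifest excerpt; the names (istedlal-mcp-secrets) and values are placeholders, not taken from infra/k8s/.

```yaml
# Hypothetical Secret holding the connection string and auth token.
apiVersion: v1
kind: Secret
metadata:
  name: istedlal-mcp-secrets
type: Opaque
stringData:
  DATABASE_URL: postgresql://user:password@db-host:5432/dbname
  MCP_BEARER_TOKEN: your-secret-token
---
# In the Deployment's container spec, reference it with:
#   envFrom:
#     - secretRef:
#         name: istedlal-mcp-secrets
```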
Health & Monitoring
- Root / returns JSON with status
- MCP endpoint: /mcp (for MCP clients only)
- Logs: Set LOG_LEVEL=DEBUG for troubleshooting