
Leave Manager MCP Tool Server

by ahmad-act
docker-compose.yml
version: "3.9"

services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "3000:8080"
    volumes:
      - open-webui:/app/backend/data
    environment:
      - OLLAMA_API_BASE_URL=http://host.docker.internal:11434
      - MCP_PLUGIN_URL=http://leave-manager:8000/openapi.json
    depends_on:
      - leave-manager
    networks:
      - mcp_network

  leave-manager:
    container_name: leave-manager
    build: .
    ports:
      - "8000:8000"
    networks:
      - mcp_network

volumes:
  open-webui:

networks:
  mcp_network:
    driver: bridge
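With this compose file in place alongside the leave-manager Dockerfile, the stack can be built and started, then checked from the host. This is a sketch of a typical workflow; the exact endpoints depend on what the leave-manager service actually serves, and the openapi.json path is taken from the MCP_PLUGIN_URL above:

```shell
# Build the leave-manager image and start both services in the background
docker compose up -d --build

# Verify the MCP tool server is reachable and serving its OpenAPI schema
# (port 8000 is published to the host in the compose file)
curl -fsS http://localhost:8000/openapi.json

# Open WebUI is published on host port 3000 (container port 8080)
# Browse to http://localhost:3000 once the containers are healthy

# Tail logs if either service fails to come up
docker compose logs -f leave-manager
```

Note that host.docker.internal lets the open-webui container reach an Ollama instance running natively on the Windows host at port 11434, while the two containers talk to each other over the mcp_network bridge by service name.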

MCP directory API

We provide metadata about all listed MCP servers via our MCP directory API. For example:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ahmad-act/Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.