docker-compose.yml (821 B)
version: '3.8'

services:
  lpdp-mcp:
    image: ghcr.io/adityaldy/lpdp-mcp:latest
    container_name: lpdp-mcp-server
    environment:
      - GOOGLE_API_KEY=${GOOGLE_API_KEY}
      - PINECONE_API_KEY=${PINECONE_API_KEY}
      - PINECONE_INDEX_NAME=${PINECONE_INDEX_NAME:-lpdp-pencairan}
    restart: unless-stopped
    # MCP servers communicate via stdio, not network ports.
    # If you need to expose the server over HTTP, uncomment below:
    # ports:
    #   - "8080:8080"
    volumes:
      # Mount logs directory (optional)
      - ./logs:/app/logs
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"
    # Resource limits
    deploy:
      resources:
        limits:
          cpus: '1'
          memory: 512M
        reservations:
          cpus: '0.25'
          memory: 128M
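The `${GOOGLE_API_KEY}`-style references in the compose file are substituted from the shell environment or from an `.env` file in the same directory, which Docker Compose loads automatically. A minimal sketch of that `.env` file — the key names come from the compose file above, the values are placeholders you must replace:

```shell
# .env — placed next to docker-compose.yml; loaded automatically by Docker Compose
GOOGLE_API_KEY=your-google-api-key
PINECONE_API_KEY=your-pinecone-api-key
# Optional: falls back to "lpdp-pencairan" via ${PINECONE_INDEX_NAME:-lpdp-pencairan}
PINECONE_INDEX_NAME=lpdp-pencairan
```

With the `.env` file in place, `docker compose up -d` starts the container. Note that, as the comments in the compose file say, the server communicates over stdio rather than a network port, so nothing is published unless you uncomment the `ports` section.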


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/adityaldy/mcp-training'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.