AI-Powered MCP Server

by larryfang
docker-compose.yml

version: '3.9'
services:
  mcp-server:
    build:
      context: .
      dockerfile: Dockerfile.mcp
    ports:
      - "3000:3000"
    env_file:
      - .env
  chat-api:
    build:
      context: .
      dockerfile: Dockerfile.chat
    ports:
      - "4000:4000"
    env_file:
      - .env

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/larryfang/sms-mcp'
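The same lookup can be made from code. Below is a minimal Python sketch that builds the directory API URL for a given server; the `server_url` helper and its parameter names are illustrative inventions, and only the endpoint path and the larryfang/sms-mcp slug come from the curl example above.

```python
import urllib.parse

BASE_URL = "https://glama.ai/api/mcp/v1/servers"

def server_url(owner: str, name: str) -> str:
    # Build the MCP directory API URL for a server identified by owner/name,
    # percent-encoding each path segment to be safe.
    return f"{BASE_URL}/{urllib.parse.quote(owner)}/{urllib.parse.quote(name)}"

# The same lookup shown in the curl example above:
print(server_url("larryfang", "sms-mcp"))
# → https://glama.ai/api/mcp/v1/servers/larryfang/sms-mcp
```

From here the URL can be fetched with any HTTP client (e.g. `urllib.request.urlopen` or `requests.get`) and the JSON response parsed as usual.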

If you have feedback or need assistance with the MCP directory API, please join our Discord server.