
Fal.ai MCP Server

by raveenb
docker.yml (1.83 kB)
name: Docker Build and Publish

on:
  push:
    branches: [main]
    tags:
      - 'v*'
  pull_request:
    branches: [main]
  workflow_dispatch:

env:
  REGISTRY: ghcr.io
  IMAGE_NAME: ${{ github.repository }}

jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Log in to GitHub Container Registry
        if: github.event_name != 'pull_request'
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          tags: |
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=semver,pattern={{major}}
            type=raw,value=latest,enable={{is_default_branch}}

      - name: Build and push Docker image
        uses: docker/build-push-action@v5
        with:
          context: .
          platforms: linux/amd64,linux/arm64
          push: ${{ github.event_name != 'pull_request' }}
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

      - name: Test Docker image
        if: github.event_name == 'pull_request'
        run: |
          docker build -t test-image .
          docker run --rm test-image python -m fal_mcp_server.server --help
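Once the workflow has pushed an image, something like the following should let you try it locally. The image path ghcr.io/raveenb/fal-mcp-server follows from the workflow's REGISTRY and IMAGE_NAME values; the FAL_KEY variable name is an assumption about how the server reads its fal.ai credentials, so substitute whatever the project actually documents.

# Pull the image published by the workflow (latest is only tagged on the default branch)
docker pull ghcr.io/raveenb/fal-mcp-server:latest

# Run the server; FAL_KEY is an assumed env var name for the fal.ai API key
docker run --rm -i -e FAL_KEY=your-fal-api-key ghcr.io/raveenb/fal-mcp-server:latest

The workflow builds for both linux/amd64 and linux/arm64 and caches layers via GitHub Actions cache, so pull requests only run the build-and-smoke-test step while pushes to main and version tags publish multi-platform images.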

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/raveenb/fal-mcp-server'
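The endpoint appears to return JSON, so piping the response through jq (if installed) makes it easier to read; this is just a local convenience, not part of the API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/raveenb/fal-mcp-server' | jq .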

If you have feedback or need assistance with the MCP directory API, please join our Discord server.