Wanaku MCP Server

main-deploy.yml (1.44 kB)
name: Deploy Wanaku Dev

on:
  workflow_dispatch:
  workflow_run:
    workflows: ["Build Main"]
    types:
      - completed

env:
  PROJECTS: ${{ github.workspace }}

jobs:
  build:
    if: github.repository == 'wanaku-ai/wanaku'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Install CLI tools from Mirror
        uses: redhat-actions/openshift-tools-installer@v1
        with:
          source: "mirror"
          oc: "latest"

      - name: Install CLI tools from GitHub
        uses: redhat-actions/openshift-tools-installer@v1
        with:
          source: "github"
          kustomize: "latest"

      - name: Authenticate and set context
        uses: redhat-actions/oc-login@v1
        with:
          openshift_server_url: ${{ secrets.OPENSHIFT_SERVER }}
          openshift_token: ${{ secrets.OPENSHIFT_TOKEN }}
          insecure_skip_tls_verify: true
          namespace: ${{ secrets.OPENSHIFT_NAMESPACE }}

      - name: Scale down HTTP service
        run: oc scale deployment wanaku-tool-service-http --replicas=0

      - name: Scale down the Router service
        run: oc scale deployment wanaku-router-backend --replicas=0

      - name: Deploy on the cloud
        run: kustomize build deploy/openshift/kustomize/overlays/dev | sed -e "s/oidc-url-replace/${{ secrets.QUARKUS_OIDC_CLIENT_AUTH_SERVER}}/g" -e "s/replace-me-with-the-client-credentials-secret/${{ secrets.QUARKUS_OIDC_CLIENT_CREDENTIALS_SECRET}}/g" | oc apply -f -
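The final "Deploy on the cloud" step renders the kustomize overlay, substitutes the OIDC placeholders with repository secrets via sed, and pipes the result to `oc apply`. A minimal local sketch of that substitution step, using an illustrative sample manifest and stand-in values in place of the real secrets (note the sketch uses `|` as the sed delimiter, since a `/`-delimited expression would break if the injected URL contains slashes):

```shell
#!/bin/sh
# Stand-in values; in the workflow these come from repository secrets.
OIDC_URL="https://auth.example.com/realms/wanaku"
OIDC_SECRET="example-client-secret"

# Sample manifest standing in for the `kustomize build` output.
cat > /tmp/manifest.yaml <<'EOF'
quarkus.oidc.auth-server-url: oidc-url-replace
quarkus.oidc.credentials.secret: replace-me-with-the-client-credentials-secret
EOF

# Same substitution pattern as the workflow step; the real pipeline
# would continue with `| oc apply -f -` instead of writing to a file.
sed -e "s|oidc-url-replace|${OIDC_URL}|g" \
    -e "s|replace-me-with-the-client-credentials-secret|${OIDC_SECRET}|g" \
    /tmp/manifest.yaml > /tmp/rendered.yaml

cat /tmp/rendered.yaml
```

Scaling the two deployments to zero replicas beforehand ensures the services restart cleanly on the newly applied manifests.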

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/wanaku-ai/wanaku'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.