cognee-mcp

test_mcp.yml
name: test | mcp

on:
  workflow_call:

jobs:
  test-mcp:
    name: Run MCP Test
    runs-on: ubuntu-22.04
    steps:
      - name: Check out repository
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: ${{ inputs.python-version }}

      - name: Install UV
        shell: bash
        run: |
          python -m pip install --upgrade pip
          pip install uv

      # This installs all dependencies along with the Cognee version published on PyPI
      - name: Install dependencies
        shell: bash
        working-directory: cognee-mcp
        run: uv sync

      # NEW: swap in the current local cognee branch version
      - name: Override with cognee branch checkout
        working-directory: cognee-mcp
        run: |
          # Remove the Cognee wheel that came from PyPI
          uv pip uninstall cognee
          # Install the freshly checked-out Cognee branch instead
          uv pip install --force-reinstall -e ../

      - name: Run MCP test
        env:
          ENV: 'dev'
          LLM_MODEL: ${{ secrets.LLM_MODEL }}
          LLM_ENDPOINT: ${{ secrets.LLM_ENDPOINT }}
          LLM_API_KEY: ${{ secrets.LLM_API_KEY }}
          LLM_API_VERSION: ${{ secrets.LLM_API_VERSION }}
          EMBEDDING_MODEL: ${{ secrets.EMBEDDING_MODEL }}
          EMBEDDING_ENDPOINT: ${{ secrets.EMBEDDING_ENDPOINT }}
          EMBEDDING_API_KEY: ${{ secrets.EMBEDDING_API_KEY }}
          EMBEDDING_API_VERSION: ${{ secrets.EMBEDDING_API_VERSION }}
        working-directory: cognee-mcp
        run: uv run --no-sync python ./src/test_client.py
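The final step drives the server with a standalone test client (./src/test_client.py). That file is not reproduced here; the sketch below is a hypothetical minimal equivalent built on the MCP Python SDK, and the launch command passed to StdioServerParameters is an assumption about how the cognee MCP server is started, not taken from the repository.

# Hypothetical minimal MCP test client; NOT the repository's test_client.py.
# Assumes the `mcp` Python SDK is installed and that the server can be
# launched with `uv run cognee` from the cognee-mcp directory (assumption).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    server = StdioServerParameters(
        command="uv",
        args=["run", "cognee"],  # assumed launch command for the cognee MCP server
    )
    # Spawn the server over stdio and open a client session against it.
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Smoke test: the server should at least advertise its tools.
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])


if __name__ == "__main__":
    asyncio.run(main())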

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/topoteretes/cognee'
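The same endpoint can be queried from code. Below is a minimal sketch using only the Python standard library; the response schema is not documented here, so the example simply prints whatever JSON the API returns.

# Minimal sketch: fetch the cognee entry from the Glama MCP directory API
# and pretty-print the JSON response (schema not documented here).
import json
import urllib.request

URL = "https://glama.ai/api/mcp/v1/servers/topoteretes/cognee"

with urllib.request.urlopen(URL) as response:
    server_info = json.load(response)

print(json.dumps(server_info, indent=2))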

If you have feedback or need assistance with the MCP directory API, please join our Discord server.