cognee-mcp

test_llms.yml2.53 kB
```yaml
name: LLM Test Suites

permissions:
  contents: read

on:
  workflow_call:

env:
  RUNTIME__LOG_LEVEL: ERROR
  ENV: 'dev'

jobs:
  test-gemini:
    name: Run Gemini Test
    runs-on: ubuntu-22.04
    steps:
      - name: Check out repository
        uses: actions/checkout@v4
      - name: Cognee Setup
        uses: ./.github/actions/cognee_setup
        with:
          python-version: '3.11.x'
      - name: Run Gemini Simple Example
        env:
          LLM_PROVIDER: "gemini"
          LLM_API_KEY: ${{ secrets.GEMINI_API_KEY }}
          LLM_MODEL: "gemini/gemini-1.5-flash"
          EMBEDDING_PROVIDER: "gemini"
          EMBEDDING_API_KEY: ${{ secrets.GEMINI_API_KEY }}
          EMBEDDING_MODEL: "gemini/text-embedding-004"
          EMBEDDING_DIMENSIONS: "768"
          EMBEDDING_MAX_TOKENS: "8076"
        run: uv run python ./examples/python/simple_example.py

  test-fastembed:
    name: Run Fastembed Test
    runs-on: ubuntu-22.04
    steps:
      - name: Check out repository
        uses: actions/checkout@v4
      - name: Cognee Setup
        uses: ./.github/actions/cognee_setup
        with:
          python-version: '3.11.x'
      - name: Run Fastembed Simple Example
        env:
          LLM_PROVIDER: "openai"
          LLM_API_KEY: ${{ secrets.LLM_API_KEY }}
          LLM_MODEL: ${{ secrets.LLM_MODEL }}
          LLM_ENDPOINT: ${{ secrets.LLM_ENDPOINT }}
          LLM_API_VERSION: ${{ secrets.LLM_API_VERSION }}
          EMBEDDING_PROVIDER: "fastembed"
          EMBEDDING_MODEL: "sentence-transformers/all-MiniLM-L6-v2"
          EMBEDDING_DIMENSIONS: "384"
          EMBEDDING_MAX_TOKENS: "256"
        run: uv run python ./examples/python/simple_example.py

  test-openrouter:
    name: Run OpenRouter Test
    runs-on: ubuntu-22.04
    steps:
      - name: Check out repository
        uses: actions/checkout@v4
      - name: Cognee Setup
        uses: ./.github/actions/cognee_setup
        with:
          python-version: '3.11.x'
      - name: Run OpenRouter Simple Example
        env:
          LLM_PROVIDER: "custom"
          LLM_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
          LLM_MODEL: "openrouter/x-ai/grok-code-fast-1"
          LLM_ENDPOINT: "https://openrouter.ai/api/v1"
          EMBEDDING_PROVIDER: "openai"
          EMBEDDING_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          EMBEDDING_MODEL: "openai/text-embedding-3-large"
          EMBEDDING_DIMENSIONS: "3072"
          EMBEDDING_MAX_TOKENS: "8191"
        run: uv run python ./examples/python/simple_example.py
```
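Each job in the workflow only sets provider-specific environment variables and then runs the same `simple_example.py` script, so a run can be reproduced locally by exporting the same variables. Below is a sketch of the Gemini job; it assumes `GEMINI_API_KEY` is already exported in your shell (in CI it comes from repository secrets), and all other values are copied from the workflow's `env` section.

```shell
# Local reproduction of the test-gemini job (a sketch).
# GEMINI_API_KEY is assumed to be exported already; in CI it is a secret.
export LLM_PROVIDER="gemini"
export LLM_API_KEY="${GEMINI_API_KEY:-}"
export LLM_MODEL="gemini/gemini-1.5-flash"
export EMBEDDING_PROVIDER="gemini"
export EMBEDDING_API_KEY="${GEMINI_API_KEY:-}"
export EMBEDDING_MODEL="gemini/text-embedding-004"
export EMBEDDING_DIMENSIONS="768"
export EMBEDDING_MAX_TOKENS="8076"

# Run the same example the workflow runs; skip gracefully if uv is absent.
if command -v uv >/dev/null 2>&1; then
    uv run python ./examples/python/simple_example.py
else
    echo "uv not installed; skipping example run"
fi
```

Swapping in the `env` block of `test-fastembed` or `test-openrouter` reproduces those jobs the same way, since the `run:` command is identical across all three.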
