
WeCom Bot MCP Server

codecov.yml (1.62 kB)
```yaml
name: Codecov

on:
  pull_request:
    branches: [main]

jobs:
  codecov:
    name: Code Coverage
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.10']
    steps:
      - uses: actions/checkout@v5
        with:
          fetch-depth: 0

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v6
        with:
          python-version: ${{ matrix.python-version }}
          cache: 'pip'
          cache-dependency-path: '**/pyproject.toml'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install uv

      - name: Cache nox environments
        uses: actions/cache@v4
        with:
          path: .nox
          key: ${{ runner.os }}-nox-${{ matrix.python-version }}-${{ hashFiles('**/noxfile.py') }}
          restore-keys: |
            ${{ runner.os }}-nox-${{ matrix.python-version }}-

      - name: Install project
        run: |
          uv pip install --system -e .

      # Test that package can be imported after installation
      - name: Test package import
        run: |
          python -c "import wecom_bot_mcp_server; from wecom_bot_mcp_server import mcp, send_message, send_wecom_file, send_wecom_image; print('All imports successful')"

      - name: Run tests
        run: |
          uvx nox -s pytest

      - name: Upload coverage reports to Codecov
        uses: codecov/codecov-action@v5
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
          slug: loonghao/wecom-bot-mcp-server
          fail_ci_if_error: false
          verbose: true
```

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/loonghao/wecom-bot-mcp-server'
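
The same lookup can be done from Python. Below is a minimal sketch using only the standard library; the helper names (`server_info_url`, `fetch_server_info`) are illustrative, and the JSON response schema is not documented here:

```python
import json
import urllib.request

BASE_URL = "https://glama.ai/api/mcp/v1/servers"

def server_info_url(owner: str, repo: str) -> str:
    """Build the MCP directory API URL for a given server."""
    return f"{BASE_URL}/{owner}/{repo}"

def fetch_server_info(owner: str, repo: str) -> dict:
    """GET the server's metadata and parse it as JSON (schema is assumed)."""
    with urllib.request.urlopen(server_info_url(owner, repo)) as resp:
        return json.load(resp)

# Example (makes a live HTTP request):
# info = fetch_server_info("loonghao", "wecom-bot-mcp-server")
```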

If you have feedback or need assistance with the MCP directory API, please join our Discord server.