
OPNSense MCP Server

ci.yml
name: CI

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [18.x, 20.x]

    steps:
      - uses: actions/checkout@v4

      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v4
        with:
          node-version: ${{ matrix.node-version }}

      - name: Cache dependencies
        uses: actions/cache@v4
        with:
          path: ~/.npm
          key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
          restore-keys: |
            ${{ runner.os }}-node-

      - name: Install dependencies
        run: npm ci

      - name: Build project
        run: npm run build

      - name: Check for TypeScript errors
        run: npx tsc --noEmit

      - name: Run linter
        run: npm run lint || echo "Linting not configured"

      - name: Run tests
        run: npm test || echo "Tests not configured"

      - name: Upload build artifacts
        uses: actions/upload-artifact@v4
        if: matrix.node-version == '20.x'
        with:
          name: build-artifacts
          path: dist/

MCP directory API

We provide all the information about MCP servers via our MCP directory API. For example, this server's entry can be fetched with:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/vespo92/OPNSenseMCP'
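
The same endpoint can also be queried programmatically. Below is a minimal TypeScript sketch, assuming Node.js 18+ (for the built-in fetch) and that the endpoint returns JSON; the getServerInfo helper is illustrative, and the response schema is not documented here.

// Fetch this server's metadata from the MCP directory API.
const url = "https://glama.ai/api/mcp/v1/servers/vespo92/OPNSenseMCP";

async function getServerInfo(): Promise<unknown> {
  const response = await fetch(url);
  if (!response.ok) {
    // Surface HTTP errors instead of silently returning an empty result.
    throw new Error(`Request failed: ${response.status} ${response.statusText}`);
  }
  return response.json();
}

getServerInfo()
  .then((info) => console.log(JSON.stringify(info, null, 2)))
  .catch((err) => console.error(err));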

If you have feedback or need assistance with the MCP directory API, please join our Discord server.