OParl MCP Server

by jtwolfe
test_integration_podman.sh • 1.02 kB
#!/bin/bash
# Podman-specific integration test script

set -e

echo "🧪 Running Integration Tests with Podman"
echo "========================================"

# Check if podman is available
if ! command -v podman &> /dev/null; then
    echo "❌ Podman not found. Please install Podman first."
    exit 1
fi

echo "✅ Podman found: $(podman --version)"

# Test Python imports first
echo ""
echo "🐍 Testing Python imports..."
python -c "
from oparl_mcp import OParlMCPServer, OParlConfig
config = OParlConfig()
server = OParlMCPServer(config)
info = server.get_server_info()
print('✅ Python imports successful')
print(f'Server: {info[\"name\"]} v{info[\"version\"]}')
"

# Build container image
echo ""
echo "🔨 Building container image with Podman..."
podman build -f docker/Dockerfile -t oparl-mcp-server:test .

# Test container run
echo ""
echo "🚀 Testing container run with Podman..."
podman run --rm --name oparl-test oparl-mcp-server:test

echo ""
echo "🎉 All integration tests passed with Podman!"
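
The Python import check embedded in the script above can also live in its own file, which makes it easier to reuse from CI or a test runner. The following is a minimal sketch assuming the oparl_mcp package is installed and that get_server_info() returns a dict containing "name" and "version", as the script implies; the file name smoke_test.py is illustrative.

# smoke_test.py -- standalone version of the import/server-info check from
# the integration script. Assumes oparl_mcp is installed and that
# get_server_info() returns a dict with "name" and "version" keys,
# as suggested by the script above.
from oparl_mcp import OParlMCPServer, OParlConfig


def main() -> None:
    config = OParlConfig()
    server = OParlMCPServer(config)
    info = server.get_server_info()
    # Fail loudly if the expected keys are missing.
    assert "name" in info and "version" in info, "unexpected server info shape"
    print("✅ Python imports successful")
    print(f"Server: {info['name']} v{info['version']}")


if __name__ == "__main__":
    main()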

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/jtwolfe/oparl-mcp-server'
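
The same endpoint can be queried from Python instead of curl. The sketch below uses only the standard library, assumes the endpoint is publicly readable and returns JSON, and makes no assumptions about the response schema; the file name fetch_server_info.py is illustrative.

# fetch_server_info.py -- query the MCP directory API for this server and
# pretty-print whatever JSON the endpoint returns (no schema assumed).
import json
import urllib.request

URL = "https://glama.ai/api/mcp/v1/servers/jtwolfe/oparl-mcp-server"

with urllib.request.urlopen(URL) as response:
    data = json.load(response)

print(json.dumps(data, indent=2))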

If you have feedback or need assistance with the MCP directory API, please join our Discord server.