Python Code Runner

by shibing624
docker-compose.yml
version: '3.8'

services:
  mcp-run-python-code:
    build: .
    container_name: mcp-run-python-code-server
    restart: unless-stopped
    ports:
      - "8000:8000"
    volumes:
      # Optional: Mount local code for development/testing
      # - ./code_execution:/tmp/code_execution
      # Optional: Mount logs directory
      # - ./logs:/var/log
    environment:
      # Set base directory for code execution (optional)
      # BASE_DIR: /tmp/code_execution
      # Set log level (optional)
      # LOG_LEVEL: INFO
    stdin_open: true
    tty: true
    # MCP servers typically use stdio transport, so we need to run in interactive mode
    command: [ "mcp-run-python-code" ]

  fastapi-server:
    build: .
    container_name: mcp-fastapi-server
    restart: unless-stopped
    ports:
      - "8083:8083"
    volumes:
      # Optional: Mount local code for development/testing
      # - ./code_execution:/tmp/code_execution
      # Optional: Mount logs directory
      # - ./logs:/var/log
    environment:
      # Set log level (optional)
      # LOG_LEVEL: INFO
    command: [ "python3", "run_python_code/fastapi_server.py" ]
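A minimal usage sketch for the compose file above, assuming Docker Compose v2 is installed and the file sits in the project root. Service names (`mcp-run-python-code`, `fastapi-server`) and ports (8000, 8083) come from the compose file itself; everything else is standard Docker Compose CLI.

```shell
# Build both images and start the services in the background
docker compose up -d --build

# Tail logs for the MCP stdio server (runs interactively via stdin_open/tty)
docker compose logs -f mcp-run-python-code

# The FastAPI server listens on host port 8083, per the ports mapping above
docker compose ps fastapi-server

# Stop and remove both containers
docker compose down
```

Because the MCP service uses stdio transport, an MCP client would typically attach to the container's stdin/stdout rather than connect over a network port; the `8000:8000` mapping only matters if the server also exposes an HTTP endpoint.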

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/shibing624/mcp-run-python-code'
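The same lookup can be done programmatically. This is a minimal sketch using only the Python standard library; the URL is taken verbatim from the curl example above, and the assumption that the endpoint returns JSON is based on it being described as an API (the exact response schema is not documented here).

```python
import json
import urllib.request

# Glama MCP directory API endpoint for this server (from the curl example above)
URL = "https://glama.ai/api/mcp/v1/servers/shibing624/mcp-run-python-code"


def fetch_server_info(url: str = URL) -> dict:
    """GET the server's directory entry and parse it as JSON.

    Assumes the endpoint responds with a JSON body; raises urllib.error.HTTPError
    on non-2xx responses.
    """
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Network call; only runs when executed as a script
    print(json.dumps(fetch_server_info(), indent=2))
```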

If you have feedback or need assistance with the MCP directory API, please join our Discord server.