debugpy_logs

Retrieve logs from a Docker container to analyze Python process behavior and plan breakpoints for debugging with debugpy.

Input Schema

Name        Required  Description                                       Default
container   Yes       Name or ID of the Docker container to query       -
tail        No        Number of log lines to retrieve                   250

Output Schema

Name        Required  Description
ok          Yes       Whether the docker logs command exited with status 0
container   Yes       The container that was queried
tail        Yes       The tail value that was applied
logs        Yes       Combined stdout and stderr, stripped of surrounding whitespace

Implementation Reference

  • The main handler for the 'debugpy_logs' tool. Decorated with @mcp.tool(), it runs 'docker logs --tail' on the specified container, combines stdout and stderr, and returns a LogsResult with the combined logs.
    @mcp.tool()
    def debugpy_logs(container: str, tail: int = 250) -> dict[str, Any]:
        proc = run(["docker", "logs", "--tail", str(tail), container], timeout=30, check=False)
        combined = ""
        if proc.stdout:
            combined += proc.stdout
        if proc.stderr:
            if combined:
                combined += "\n"
            combined += proc.stderr
        return LogsResult(ok=proc.returncode == 0, container=container, tail=tail, logs=combined.strip()).model_dump()
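    The stream-merging logic above can be exercised in isolation. The helper below is a standalone sketch (not part of the server) that mirrors it: stdout first, then stderr, separated by a newline only when both are non-empty.

    ```python
    def combine_streams(stdout: str, stderr: str) -> str:
        # Mirrors debugpy_logs' merging of the two docker logs streams.
        combined = ""
        if stdout:
            combined += stdout
        if stderr:
            if combined:
                combined += "\n"
            combined += stderr
        # The handler strips surrounding whitespace before returning.
        return combined.strip()

    print(combine_streams("app started", "WARNING: debugpy attached"))
    ```

    Note that docker logs writes a container's stdout to stdout and its stderr to stderr, so merging both is what makes the tool return the full log stream.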
  • Pydantic model defining the output schema for debugpy_logs. Contains fields: ok (bool), container (str), tail (int), and logs (str).
    class LogsResult(BaseModel):
        ok: bool
        container: str
        tail: int
        logs: str
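    model_dump() serializes the model to a plain dict. The dataclass-based stand-in below (pydantic itself is not required for this sketch, and the field values are illustrative) shows the resulting shape:

    ```python
    from dataclasses import dataclass, asdict

    @dataclass
    class LogsResultSketch:
        # Same fields and types as the LogsResult pydantic model above.
        ok: bool
        container: str
        tail: int
        logs: str

    # asdict() plays the role of pydantic's model_dump() here.
    result = asdict(LogsResultSketch(ok=True, container="web", tail=250, logs="started"))
    print(result)
    ```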
  • The @mcp.tool() decorator registers the debugpy_logs function as an MCP tool with the FastMCP server instance.
    @mcp.tool()
  • Helper function 'run' that executes subprocess commands with configurable timeout and error checking. Used by debugpy_logs to invoke 'docker logs'.
    def run(cmd: list[str], *, timeout: int = DEFAULT_TIMEOUT, check: bool = True) -> subprocess.CompletedProcess[str]:
        proc = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)
        if check and proc.returncode != 0:
            raise ToolError(
                f"Command failed ({proc.returncode}): {' '.join(cmd)}\n"
                f"STDOUT:\n{proc.stdout}\nSTDERR:\n{proc.stderr}"
            )
        return proc
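    The check flag is what lets debugpy_logs treat a non-zero docker logs exit code as data rather than an error. The sketch below reproduces the helper with local stand-ins for ToolError and DEFAULT_TIMEOUT (both assumed names here, defined elsewhere in the server) and demonstrates both behaviors against a deliberately failing command:

    ```python
    import subprocess
    import sys

    class ToolError(RuntimeError):
        """Local stand-in for the server's ToolError exception."""

    DEFAULT_TIMEOUT = 30  # assumed value for this sketch

    def run(cmd: list[str], *, timeout: int = DEFAULT_TIMEOUT,
            check: bool = True) -> subprocess.CompletedProcess[str]:
        proc = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)
        if check and proc.returncode != 0:
            raise ToolError(f"Command failed ({proc.returncode}): {' '.join(cmd)}")
        return proc

    failing = [sys.executable, "-c", "import sys; sys.exit(3)"]

    # With check=True (the default), a non-zero exit raises ToolError...
    try:
        run(failing)
    except ToolError as exc:
        print(exc)

    # ...with check=False, the CompletedProcess is returned as-is,
    # which is how debugpy_logs inspects proc.returncode itself.
    proc = run(failing, check=False)
    print(proc.returncode)
    ```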

MCP directory API

All information about MCP servers is available via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/will-garrett/debugpy-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.