deploy-compose

Deploys a Docker Compose stack from inline YAML configuration, managing the containerized application under a defined project name.

Instructions

Deploy a Docker Compose stack

Input Schema

Name           Required   Description   Default
compose_yaml   Yes        –             –
project_name   Yes        –             –
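Both fields are required strings. A minimal arguments payload for the tool might look like the following sketch (the project name and compose content are hypothetical examples, not values from the source):

```python
import json

# Hypothetical arguments for the deploy-compose tool; both fields are
# required strings per the input schema above.
arguments = {
    "compose_yaml": (
        "services:\n"
        "  web:\n"
        "    image: nginx:alpine\n"
        "    ports:\n"
        "      - '8080:80'\n"
    ),
    "project_name": "demo-stack",
}

print(json.dumps(arguments, indent=2))
```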

Implementation Reference

  • Main execution logic for the deploy-compose tool: validates input, processes YAML, saves compose file, deploys via _deploy_stack helper, handles errors and cleanup.
```python
async def handle_deploy_compose(arguments: Dict[str, Any]) -> List[TextContent]:
    debug_info = []
    try:
        compose_yaml = arguments.get("compose_yaml")
        project_name = arguments.get("project_name")
        if not compose_yaml or not project_name:
            raise ValueError("Missing required compose_yaml or project_name")

        yaml_content = DockerHandlers._process_yaml(compose_yaml, debug_info)
        compose_path = DockerHandlers._save_compose_file(yaml_content, project_name)
        try:
            result = await DockerHandlers._deploy_stack(
                compose_path, project_name, debug_info)
            return [TextContent(type="text", text=result)]
        finally:
            DockerHandlers._cleanup_files(compose_path)
    except Exception as e:
        debug_output = "\n".join(debug_info)
        return [TextContent(
            type="text",
            text=f"Error deploying compose stack: {str(e)}\n\n"
                 f"Debug Information:\n{debug_output}")]
```
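The handler deploys inside a `try`/`finally` so the saved compose file is cleaned up whether deployment succeeds or fails, and the outer `except` converts any error into a text result rather than letting it propagate. A minimal standalone sketch of that shape (the `deploy` function and `cleanup_log` are illustrative stand-ins, not part of the source):

```python
import asyncio

cleanup_log = []  # records cleanup calls, for illustration only

async def deploy(should_fail: bool) -> str:
    # Mirrors the handler's shape: deploy inside try, cleanup in finally,
    # errors converted to a text result instead of propagating.
    try:
        try:
            if should_fail:
                raise RuntimeError("deploy failed")
            return "Successfully deployed"
        finally:
            cleanup_log.append("cleaned up")  # always runs
    except Exception as e:
        return f"Error deploying compose stack: {e}"

print(asyncio.run(deploy(False)))  # Successfully deployed
print(asyncio.run(deploy(True)))   # Error deploying compose stack: deploy failed
```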
  • Registration of the deploy-compose tool in list_tools(), including name, description, and input schema.
```python
types.Tool(
    name="deploy-compose",
    description="Deploy a Docker Compose stack",
    inputSchema={
        "type": "object",
        "properties": {
            "compose_yaml": {"type": "string"},
            "project_name": {"type": "string"}
        },
        "required": ["compose_yaml", "project_name"]
    }
),
```
  • Helper method that orchestrates docker-compose down, up, and ps commands using DockerComposeExecutor.
```python
async def _deploy_stack(compose_path: str, project_name: str,
                        debug_info: List[str]) -> str:
    compose = DockerComposeExecutor(compose_path, project_name)
    for command in [compose.down, compose.up]:
        try:
            code, out, err = await command()
            debug_info.extend([
                f"\n=== {command.__name__.capitalize()} Command ===",
                f"Return Code: {code}",
                f"Stdout: {out}",
                f"Stderr: {err}"
            ])
            if code != 0 and command == compose.up:
                raise Exception(f"Deploy failed with code {code}: {err}")
        except Exception as e:
            if command != compose.down:
                raise e
            debug_info.append(f"Warning during {command.__name__}: {str(e)}")
    code, out, err = await compose.ps()
    service_info = out if code == 0 else "Unable to list services"
    return (f"Successfully deployed compose stack '{project_name}'\n"
            f"Running services:\n{service_info}\n\n"
            f"Debug Info:\n{chr(10).join(debug_info)}")
```
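Note the asymmetry in the loop above: a failing `down` is only logged as a warning (a first deploy has nothing to tear down), while a failing `up` aborts the whole operation. A minimal sketch of that tolerance pattern, using stand-in `down`/`up` coroutines rather than real docker-compose calls:

```python
import asyncio

debug_info = []  # stand-in for the handler's debug log

async def down():
    # Tolerated failure: tearing down a project that was never
    # deployed fails harmlessly.
    return 1, "", "no such project"

async def up():
    return 0, "services started", ""

async def run_sequence():
    for command, must_succeed in [(down, False), (up, True)]:
        code, out, err = await command()
        debug_info.append(f"{command.__name__}: code={code}")
        if code != 0:
            if must_succeed:
                raise RuntimeError(f"Deploy failed with code {code}: {err}")
            debug_info.append(f"Warning during {command.__name__}: {err}")
    return "deployed"

print(asyncio.run(run_sequence()))  # deployed
```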
  • Core executor class for running docker-compose commands (down, up, ps) cross-platform.
```python
class DockerComposeExecutor(DockerExecutorBase):
    def __init__(self, compose_file: str, project_name: str):
        super().__init__()
        self.compose_file = os.path.abspath(compose_file)
        self.project_name = project_name

    async def run_command(self, command: str, *args) -> Tuple[int, str, str]:
        if platform.system() == 'Windows':
            cmd = self._build_windows_command(command, *args)
        else:
            cmd = self._build_unix_command(command, *args)
        return await self.executor.execute(cmd)

    def _build_windows_command(self, command: str, *args) -> str:
        compose_file = self.compose_file.replace('\\', '/')
        return (f'cd "{os.path.dirname(compose_file)}" && docker compose '
                f'-f "{os.path.basename(compose_file)}" '
                f'-p {self.project_name} {command} {" ".join(args)}')

    def _build_unix_command(self, command: str, *args) -> list[str]:
        return [
            self.docker_cmd, "compose",
            "-f", self.compose_file,
            "-p", self.project_name,
            command, *args
        ]

    async def down(self) -> Tuple[int, str, str]:
        return await self.run_command("down", "--volumes")

    async def pull(self) -> Tuple[int, str, str]:
        return await self.run_command("pull")

    async def up(self) -> Tuple[int, str, str]:
        return await self.run_command("up", "-d")

    async def ps(self) -> Tuple[int, str, str]:
        return await self.run_command("ps")
```
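On Unix the executor builds an argv list (no shell string, so no quoting issues), passing `-f` and `-p` before the subcommand. A standalone replication of `_build_unix_command`, assuming `self.docker_cmd` resolves to the plain `docker` binary:

```python
def build_unix_command(compose_file: str, project_name: str,
                       command: str, *args) -> list[str]:
    # Mirrors DockerComposeExecutor._build_unix_command, with
    # docker_cmd assumed to be the plain "docker" binary.
    return ["docker", "compose",
            "-f", compose_file,
            "-p", project_name,
            command, *args]

argv = build_unix_command("/tmp/demo/docker-compose.yml", "demo-stack",
                          "up", "-d")
print(argv)
# ['docker', 'compose', '-f', '/tmp/demo/docker-compose.yml',
#  '-p', 'demo-stack', 'up', '-d']
```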
