get_prompt_template

Retrieve structured prompt templates for LLM interactions with Airflow operations, enabling optimal guidance for DAG inspection, task monitoring, and cluster management.

Instructions

[Tool Role]: Provides a comprehensive prompt template for LLM interactions with Airflow operations.

Args:
    section: Optional section name to get a specific part of the template
    mode: Optional mode (summary/detailed) to control response verbosity

Returns: The comprehensive template, or the requested section, for optimal LLM guidance

Input Schema

Name     Required  Description  Default
section  No
mode     No
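
Concretely, the schema above admits calls like the following. The section names are illustrative only; the real names depend on the template file shipped with the server:

```python
# Illustrative argument sets for get_prompt_template (values are examples, not a fixed contract)
full_template_args = {}                                    # no arguments: return the whole template
section_args = {"section": "task monitoring"}              # substring match against section headings
summary_args = {"section": "cluster", "mode": "summary"}   # mode controls response verbosity
```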

Implementation Reference

  • Handler function implementing the 'get_prompt_template' MCP tool. Loads the prompt template from file and returns the full content, or extracts a specific section by matching against heading text.
    @mcp.tool()
    async def get_prompt_template(section: Optional[str] = None, mode: Optional[str] = None) -> str:
        """
        [Tool Role]: Provides comprehensive prompt template for LLM interactions with Airflow operations.

        Args:
            section: Optional section name to get specific part of template
            mode: Optional mode (summary/detailed) to control response verbosity

        Returns:
            Comprehensive template or specific section for optimal LLM guidance
        """
        template = read_prompt_template(PROMPT_TEMPLATE_PATH)

        if section:
            # parse_prompt_sections returns a (headings, sections) tuple; unpack it
            # rather than iterating the tuple itself, and match against the headings.
            headings, sections = parse_prompt_sections(template)
            for i, heading in enumerate(headings):
                if section.lower() in heading.lower():
                    return sections[i + 1]  # +1: sections[0] is the preamble before the first heading
            return f"Section '{section}' not found."

        # NOTE: mode is currently unused by this handler
        return template
  • Utility function to read the prompt template Markdown file from disk.
    def read_prompt_template(path: str) -> str:
        """
        Reads the MCP prompt template file and returns its content as a string.
        """
        with open(path, "r", encoding="utf-8") as f:
            return f.read()
  • Utility function that parses the prompt template string into a list of '## ' headings and a list of the corresponding section contents, returned as a (headings, sections) tuple.
    def parse_prompt_sections(template: str):
        """
        Parses the prompt template into section headings and sections.
        Returns (headings, sections).
        """
        lines = template.splitlines()
        sections = []
        current = []
        headings = []
        for line in lines:
            if line.startswith("## "):
                if current:
                    sections.append("\n".join(current))
                    current = []
                headings.append(line[3:].strip())
                current.append(line)
            else:
                current.append(line)
        if current:
            sections.append("\n".join(current))
        return headings, sections
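    To see what the parser produces, here is a self-contained run on a toy template (the template text is invented for illustration; the real content is read from PROMPT_TEMPLATE_PATH):

    ```python
    def parse_prompt_sections(template: str):
        """Split a Markdown template on '## ' headings; returns (headings, sections)."""
        sections, current, headings = [], [], []
        for line in template.splitlines():
            if line.startswith("## "):
                if current:
                    sections.append("\n".join(current))
                    current = []
                headings.append(line[3:].strip())
            current.append(line)
        if current:
            sections.append("\n".join(current))
        return headings, sections

    toy = "# Airflow Guide\nintro text\n## DAG Inspection\nhow to inspect\n## Monitoring\nhow to monitor"
    headings, sections = parse_prompt_sections(toy)
    print(headings)     # ['DAG Inspection', 'Monitoring']
    print(sections[1])  # the 'DAG Inspection' section body
    ```

    This also shows why the handler looks up sections[i + 1]: the parser emits the pre-heading preamble as sections[0], so headings[i] pairs with sections[i + 1].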
  • Registration function for v1 API tools. Sets v1-specific airflow_request and calls register_common_tools(mcp), which defines and registers the get_prompt_template tool.
    def register_tools(mcp):
        """Register v1 tools by importing common tools with v1 request function."""
        
        logger.info("Initializing MCP server for Airflow API v1")
        logger.info("Loading Airflow API v1 tools (Airflow 2.x)")
        
        # Set the global request function to v1
        common_tools.airflow_request = airflow_request_v1
        
        # Register all 56 common tools (includes management tools)
        common_tools.register_common_tools(mcp)
        
        # V1 has no exclusive tools - all tools are shared with v2
        
        logger.info("Registered all Airflow API v1 tools (56 tools: 43 core + 13 management tools)")
  • Registration function for v2 API tools. Sets v2-specific airflow_request and calls register_common_tools(mcp), which defines and registers the get_prompt_template tool.
    def register_tools(mcp):
        """Register v2 tools: common tools + v2-exclusive asset tools."""
        
        logger.info("Initializing MCP server for Airflow API v2")
        logger.info("Loading Airflow API v2 tools (Airflow 3.0+)")
        
        # Set the global request function to v2
        common_tools.airflow_request = airflow_request_v2
        
        # Register all 43 common tools
        common_tools.register_common_tools(mcp)
        
        # Add V2-exclusive tools (2 tools)
        @mcp.tool()
        async def list_assets(limit: int = 20, offset: int = 0,
                             uri_pattern: Optional[str] = None) -> Dict[str, Any]:
            """
            [V2 New] List all assets in the system for data-aware scheduling.
            
            Assets are a key feature in Airflow 3.0 for data-aware scheduling.
            They enable workflows to be triggered by data changes rather than time schedules.
            
            Args:
                limit: Maximum number of assets to return (default: 20)
                offset: Number of assets to skip for pagination (default: 0)
                uri_pattern: Filter assets by URI pattern (optional)
                
            Returns:
                Dict containing assets list, pagination info, and metadata
            """
            params = {'limit': limit, 'offset': offset}
            if uri_pattern:
                params['uri_pattern'] = uri_pattern
                
            from urllib.parse import urlencode  # stdlib; percent-encodes values safely
            query_string = urlencode(params)
            
            resp = await airflow_request_v2("GET", f"/assets?{query_string}")
            resp.raise_for_status()
            data = resp.json()
            
            return {
                "assets": data.get("assets", []),
                "total_entries": data.get("total_entries", 0),
                "limit": limit,
                "offset": offset,
                "api_version": "v2",
                "feature": "assets"
            }
    
        @mcp.tool()
        async def list_asset_events(limit: int = 20, offset: int = 0,
                                   asset_uri: Optional[str] = None,
                                   source_dag_id: Optional[str] = None) -> Dict[str, Any]:
            """
            [V2 New] List asset events for data lineage tracking.
            
            Asset events track when assets are created or updated by DAGs.
            This enables data lineage tracking and data-aware scheduling in Airflow 3.0.
            
            Args:
                limit: Maximum number of events to return (default: 20)
                offset: Number of events to skip for pagination (default: 0)
                asset_uri: Filter events by specific asset URI (optional)
                source_dag_id: Filter events by source DAG that produced the event (optional)
                
            Returns:
                Dict containing asset events list, pagination info, and metadata
            """
            params = {'limit': limit, 'offset': offset}
            if asset_uri:
                params['asset_uri'] = asset_uri
            if source_dag_id:
                params['source_dag_id'] = source_dag_id
                
            from urllib.parse import urlencode  # stdlib; percent-encodes values safely
            query_string = urlencode(params)
            
            resp = await airflow_request_v2("GET", f"/assets/events?{query_string}")
            resp.raise_for_status()
            data = resp.json()
            
            return {
                "asset_events": data.get("asset_events", []),
                "total_entries": data.get("total_entries", 0),
                "limit": limit,
                "offset": offset,
                "api_version": "v2",
                "feature": "asset_events"
            }
    
        logger.info("Registered all Airflow API v2 tools (43 common + 2 assets + 4 management = 49 tools)")
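
Both asset tools assemble their query strings from a params dict. Values such as a uri_pattern containing spaces or reserved characters need URL encoding, which the stdlib urlencode handles; a minimal sketch (the S3 pattern is invented for illustration):

```python
from urllib.parse import urlencode

# Example parameters mirroring list_assets; the pattern value is illustrative
params = {"limit": 20, "offset": 0, "uri_pattern": "s3://bucket/raw data"}
query_string = urlencode(params)  # percent-encodes ':' and '/', turns spaces into '+'
print(query_string)  # limit=20&offset=0&uri_pattern=s3%3A%2F%2Fbucket%2Fraw+data
```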

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/call518/MCP-Airflow-API'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.