polarion_github_requirements_coverage

Analyze requirements coverage between Polarion specifications and GitHub code to identify implemented and missing requirements, for gap analysis and traceability validation.

Instructions

<purpose>Smart requirements coverage analysis between Polarion and connected GitHub repository</purpose>

<when_to_use>
- When you need to verify if requirements are implemented in the current codebase
- For gap analysis between Polarion specifications and actual code implementation
- When user asks "check if requirements are implemented" or "find missing implementations"
- For requirements traceability and coverage validation
- When you need to identify what's missing from the current code
</when_to_use>

<workflow_position>
INTELLIGENT COVERAGE ANALYSIS TOOL: Use this for end-to-end requirements verification
STEP 1: Automatically detects connected GitHub repository from context
STEP 2: Fetches FRESH requirements from Polarion for specified topic
STEP 3: Analyzes actual code files in GitHub repository
STEP 4: Identifies implemented vs missing requirements based on code examination
</workflow_position>

<parameters>
- project_id: Required. Polarion project ID (e.g., "AutoCar", "drivepilot")
- topic: Required. Requirements topic to analyze (e.g., "HMI", "braking", "perception", "safety")
- github_folder: Optional. Specific folder to focus analysis (e.g., "hmi", "braking"). Empty means analyze entire repository
</parameters>

<output>Comprehensive requirements coverage analysis</output>

Input Schema

| Name          | Required | Description                                                       | Default |
|---------------|----------|-------------------------------------------------------------------|---------|
| github_folder | No       | Specific folder to focus analysis; empty means entire repository  | `""`    |
| project_id    | Yes      | Polarion project ID (e.g., "AutoCar", "drivepilot")               |         |
| topic         | Yes      | Requirements topic to analyze (e.g., "HMI", "braking")            |         |
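Based on the schema above, the required/optional split can be mirrored in a small client-side check. This is only an illustrative sketch: `prepare_arguments` is a hypothetical helper, and in practice the MCP framework enforces the real JSON Schema generated from the tool's signature.

```python
# Hypothetical sketch mirroring the Required column of the schema above.
# The MCP framework derives the authoritative JSON Schema automatically;
# this is for illustration only.
REQUIRED = {"project_id", "topic"}
OPTIONAL_DEFAULTS = {"github_folder": ""}

def prepare_arguments(args: dict) -> dict:
    """Reject calls missing required parameters; fill optional defaults."""
    missing = REQUIRED - args.keys()
    if missing:
        raise ValueError(f"Missing required parameters: {sorted(missing)}")
    return {**OPTIONAL_DEFAULTS, **args}

full = prepare_arguments({"project_id": "AutoCar", "topic": "HMI"})
# github_folder falls back to "", i.e. analyze the entire repository
```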

Implementation Reference

  • The core handler function for the polarion_github_requirements_coverage tool, decorated with @mcp.tool() for automatic registration in the MCP server. Handles input validation, fetches live requirements from Polarion for the specified topic, and returns structured data for GitHub code coverage analysis.
    @mcp.tool()
    def polarion_github_requirements_coverage(project_id: str, topic: str, github_folder: str = "") -> str:
        """
        <purpose>Smart requirements coverage analysis between Polarion and connected GitHub repository</purpose>
        <when_to_use>
        - When you need to verify if requirements are implemented in the current codebase
        - For gap analysis between Polarion specifications and actual code implementation
        - When user asks "check if requirements are implemented" or "find missing implementations"
        - For requirements traceability and coverage validation
        - When you need to identify what's missing from the current code
        </when_to_use>
        <workflow_position>
        INTELLIGENT COVERAGE ANALYSIS TOOL: Use this for end-to-end requirements verification
        STEP 1: Automatically detects connected GitHub repository from context
        STEP 2: Fetches FRESH requirements from Polarion for specified topic
        STEP 3: Analyzes actual code files in GitHub repository
        STEP 4: Identifies implemented vs missing requirements based on code examination
        </workflow_position>
        <parameters>
        - project_id: Required. Polarion project ID (e.g., "AutoCar", "drivepilot")
        - topic: Required. Requirements topic to analyze (e.g., "HMI", "braking", "perception", "safety")
        - github_folder: Optional. Specific folder to focus analysis (e.g., "hmi", "braking"). Empty means analyze entire repository
        </parameters>
        <output>Comprehensive requirements coverage analysis</output>
        """
        logger.info(f"Starting SMART requirements coverage analysis for '{topic}' in project '{project_id}'")
        try:
            # Validate inputs
            validation_error = _validate_coverage_analysis_inputs(project_id, topic)
            if validation_error:
                return json.dumps(validation_error, indent=2)

            # Fetch FRESH requirements from Polarion (no caching)
            logger.info(f"📡 Fetching LIVE {topic} requirements from Polarion project {project_id}")
            requirements_result = _fetch_topic_requirements(project_id, topic)
            # The helper signals failure via the "status" field, not an "error" key
            if requirements_result.get("status") == "error":
                return json.dumps(requirements_result, indent=2)

            requirements = requirements_result["requirements"]
            if not requirements:
                return json.dumps({
                    "status": "warning",
                    "message": f"No requirements found for topic '{topic}' in project {project_id}",
                    "suggestion": "Try different topic keywords or check Polarion project contents"
                }, indent=2)

            return json.dumps({
                "status": "success",
                "message": f"✅ Found {len(requirements)} '{topic}' requirements from Polarion",
                "analysis_summary": {
                    "project_id": project_id,
                    "topic": topic,
                    "total_requirements_found": len(requirements),
                    "target_folder": github_folder or "entire repository"
                },
                "polarion_requirements": requirements,
                "next_steps": [
                    "Use GitHub MCP tools to explore the repository structure",
                    "Search for requirement IDs and implementation evidence in code",
                    "Compare actual code implementation against requirement descriptions"
                ]
            }, indent=2)
        except Exception as e:
            logger.error(f"Requirements coverage analysis failed: {e}")
            return json.dumps({
                "status": "error",
                "message": f"Requirements coverage analysis failed: {str(e)}"
            }, indent=2)
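Every return path of the handler serializes a dict with a top-level "status" field ("success", "warning", or "error"), so a caller can branch on it after parsing. A hypothetical consumer might look like this; the `response` string here is a simulated tool result, not real Polarion output.

```python
import json

# Hypothetical consumer of the tool's JSON output. A real MCP client would
# obtain `response` by invoking the tool; here we simulate a success payload.
response = json.dumps({
    "status": "success",
    "message": "✅ Found 3 'HMI' requirements from Polarion",
    "analysis_summary": {"total_requirements_found": 3},
    "polarion_requirements": [{"id": "REQ-1"}, {"id": "REQ-2"}, {"id": "REQ-3"}],
})

result = json.loads(response)
if result["status"] == "success":
    ids = [r["id"] for r in result["polarion_requirements"]]
elif result["status"] == "warning":
    ids = []  # no requirements matched the topic
else:
    raise RuntimeError(result["message"])
```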
  • Supporting helper function that fetches fresh, unique requirements from Polarion for a given topic using multiple search queries and deduplication logic.
    def _fetch_topic_requirements(project_id: str, topic: str) -> Dict:
        """Fetch requirements related to a specific topic from Polarion (FRESH DATA - no caching)"""
        try:
            logger.info(f"🔄 Making LIVE API calls to Polarion - no cached data used")
            query_patterns = [f"{topic} AND type:requirement", f"title:{topic}", f"{topic}"]
            all_requirements = []
            for i, query in enumerate(query_patterns, 1):
                logger.info(f"📡 API Call {i}/{len(query_patterns)}: Fetching with query '{query}'")
                work_items = polarion_client.get_work_items(project_id, limit=50, query=query)
                all_requirements.extend(work_items)
                logger.info(f"✅ Received {len(work_items)} items from API call {i}")

            # Deduplicate by work-item ID, keeping requirement-typed or topic-relevant items
            unique_requirements = {}
            for item in all_requirements:
                if item.get('id') and 'type' in item:
                    item_text = f"{item.get('title', '')} {item.get('description', '')}".lower()
                    if (item.get('type', '').lower() in ['requirement', 'req']
                            or topic.lower() in item_text):
                        unique_requirements[item['id']] = item

            requirements_list = list(unique_requirements.values())
            logger.info(f"🎯 FRESH DATA PROCESSED: Found {len(requirements_list)} unique requirements for topic '{topic}'")
            return {
                "status": "success",
                "requirements": requirements_list,
                "count": len(requirements_list),
                "data_freshness": "live_api_fetch",
                "fetch_timestamp": time.time()
            }
        except Exception as e:
            return {"status": "error", "message": f"Failed to fetch requirements: {str(e)}"}
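The deduplication step above works by writing each qualifying item into a dict keyed on `item['id']`, so a later duplicate simply overwrites the earlier one. Isolated from the Polarion client, the pattern looks like this (toy work items for illustration):

```python
# Standalone sketch of the dedup-by-id filter used above, with toy work items.
items = [
    {"id": "REQ-1", "type": "requirement", "title": "HMI display", "description": ""},
    {"id": "REQ-2", "type": "task", "title": "Refactor", "description": "hmi cleanup"},
    {"id": "REQ-1", "type": "requirement", "title": "HMI display", "description": ""},  # duplicate
    {"id": "REQ-3", "type": "task", "title": "Docs", "description": "release notes"},
]
topic = "HMI"

unique = {}
for item in items:
    if item.get("id") and "type" in item:
        text = f"{item.get('title', '')} {item.get('description', '')}".lower()
        # Keep requirement-typed items, or any item whose text mentions the topic
        if item.get("type", "").lower() in ["requirement", "req"] or topic.lower() in text:
            unique[item["id"]] = item

deduped = list(unique.values())
# REQ-1 survives once; REQ-2 passes via the topic-text match; REQ-3 is filtered out
```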
  • Input validation helper specifically for the coverage analysis tool, checking Polarion authentication and required parameters.
    def _validate_coverage_analysis_inputs(project_id: str, topic: str) -> Dict | None:
        """Validate inputs for coverage analysis"""
        if not (polarion_client.token or polarion_client.load_token()):
            return {
                "status": "error",
                "message": "Polarion authentication required",
                "next_steps": [
                    "Use open_polarion_login() to authenticate",
                    "Then use set_polarion_token() with generated token",
                    "Finally retry this analysis"
                ]
            }
        if not project_id or not topic:
            return {
                "status": "error",
                "message": "Missing required parameters",
                "required": ["project_id", "topic"]
            }
        return None
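Because the validator returns either an error dict or `None`, the handler can use a simple truthiness check as an early return. A self-contained sketch of that pattern, with `fake_validate` standing in for `_validate_coverage_analysis_inputs` (the Polarion token check is omitted so the sketch can run standalone):

```python
import json

# "fake_validate" is a hypothetical stand-in for _validate_coverage_analysis_inputs,
# minus the Polarion token check, so this sketch runs without a live client.
def fake_validate(project_id: str, topic: str):
    if not project_id or not topic:
        return {"status": "error", "message": "Missing required parameters",
                "required": ["project_id", "topic"]}
    return None  # falsy: validation passed

def handler(project_id: str, topic: str) -> str:
    error = fake_validate(project_id, topic)
    if error:  # any dict is truthy; None falls through to the happy path
        return json.dumps(error, indent=2)
    return json.dumps({"status": "success"}, indent=2)
```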

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Sdunga1/MCP-Polarion'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.