
ShallowCodeResearch_get_performance_metrics

Collect and analyze performance metrics for the MCP Hub research system, including execution times, success rates, error counts, and resource utilization to monitor system efficiency.

Instructions

Get performance metrics and analytics for the MCP Hub system. Collects and returns performance metrics including execution times, success rates, error counts, and resource utilization. Provides basic information if advanced metrics collection is not available.

Returns: A dictionary containing performance metrics and statistics.
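The tool takes no arguments and returns one of two shapes, depending on whether advanced monitoring is installed. The dictionaries below are a hypothetical illustration of those shapes (the metric name and values are invented for this sketch, not actual output from the deployed server):

```python
# Shape 1: advanced features unavailable (psutil/aiohttp not installed).
basic_mode_response = {
    "status": "basic_mode",
    "message": "Performance metrics not available. Install 'pip install psutil aiohttp' to enable advanced monitoring.",
    "basic_info": {"system_working": True, "features_loaded": False},
}

# Shape 2: per-metric summary statistics from the metrics collector.
# "tool_latency_seconds" is an example metric name, not a guaranteed key.
metrics_response = {
    "tool_latency_seconds": {
        "count": 12,
        "average": 0.42,
        "min": 0.10,
        "max": 1.30,
        "latest": 0.35,
        "last_updated": "2024-01-01T12:00:00",
    }
}
```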

Input Schema


No arguments

Implementation Reference

  • app.py:301-324 (handler)
    Main handler function for the performance metrics tool. Checks if advanced features are available and delegates to metrics_collector.get_metrics_summary() or returns basic status.
```python
def get_performance_metrics() -> Dict[str, Any]:
    """
    Get performance metrics and analytics for the MCP Hub system.

    Collects and returns performance metrics including execution times,
    success rates, error counts, and resource utilization. Provides basic
    information if advanced metrics collection is not available.

    Returns:
        Dict[str, Any]: A dictionary containing performance metrics and statistics
    """
    if not ADVANCED_FEATURES_AVAILABLE:
        return {
            "status": "basic_mode",
            "message": "Performance metrics not available. Install 'pip install psutil aiohttp' to enable advanced monitoring.",
            "basic_info": {
                "system_working": True,
                "features_loaded": False
            }
        }

    try:
        return metrics_collector.get_metrics_summary()
    except Exception as e:
        return {"error": f"Performance metrics failed: {str(e)}"}
```
  • app.py:1072-1076 (registration)
    Gradio MCP registration of the get_performance_metrics tool via button click handler with api_name 'get_performance_metrics_service'. This exposes it as an MCP tool, likely prefixed as ShallowCodeResearch_get_performance_metrics in the context of the ShallowCodeResearch HF space.
```python
    fn=get_performance_metrics,
    inputs=[],
    outputs=metrics_output,
    api_name="get_performance_metrics_service"
)
```
Core implementation of the metrics summary calculation in the MetricsCollector class. Computes count, average, min, max, and latest value from the metric points recorded within the last N minutes.
```python
def get_metrics_summary(self, metric_name: Optional[str] = None, last_minutes: int = 5) -> Dict[str, Any]:
    """Get summary statistics for metrics."""
    cutoff_time = datetime.now() - timedelta(minutes=last_minutes)

    with self.lock:
        if metric_name:
            metrics_to_analyze = {metric_name: self.metrics[metric_name]}
        else:
            metrics_to_analyze = dict(self.metrics)

        summary = {}
        for name, points in metrics_to_analyze.items():
            recent_points = [p for p in points if p.timestamp >= cutoff_time]
            if not recent_points:
                continue
            values = [p.value for p in recent_points]
            summary[name] = {
                "count": len(values),
                "average": sum(values) / len(values),
                "min": min(values),
                "max": max(values),
                "latest": values[-1] if values else 0,
                "last_updated": recent_points[-1].timestamp.isoformat() if recent_points else None
            }
        return summary
```
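The references above can be pulled together into a self-contained sketch. The `MetricPoint` dataclass, the `record` helper, and the `defaultdict` storage layout are assumptions inferred from the snippet, not the actual app.py implementation; only `get_metrics_summary` mirrors the code shown:

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime, timedelta
from threading import Lock
from typing import Any, Dict, List, Optional


@dataclass
class MetricPoint:
    """A single metric observation (assumed shape: value + timestamp)."""
    value: float
    timestamp: datetime


class MetricsCollector:
    def __init__(self) -> None:
        self.lock = Lock()
        # Metric name -> list of observed points, oldest first.
        self.metrics: Dict[str, List[MetricPoint]] = defaultdict(list)

    def record(self, name: str, value: float) -> None:
        """Append an observation under the lock (hypothetical helper)."""
        with self.lock:
            self.metrics[name].append(MetricPoint(value, datetime.now()))

    def get_metrics_summary(self, metric_name: Optional[str] = None,
                            last_minutes: int = 5) -> Dict[str, Any]:
        """Summarise points recorded within the last `last_minutes` minutes."""
        cutoff_time = datetime.now() - timedelta(minutes=last_minutes)
        with self.lock:
            if metric_name:
                metrics_to_analyze = {metric_name: self.metrics[metric_name]}
            else:
                metrics_to_analyze = dict(self.metrics)

            summary: Dict[str, Any] = {}
            for name, points in metrics_to_analyze.items():
                recent = [p for p in points if p.timestamp >= cutoff_time]
                if not recent:
                    continue
                values = [p.value for p in recent]
                summary[name] = {
                    "count": len(values),
                    "average": sum(values) / len(values),
                    "min": min(values),
                    "max": max(values),
                    "latest": values[-1],
                    "last_updated": recent[-1].timestamp.isoformat(),
                }
        return summary


collector = MetricsCollector()
collector.record("tool_latency_seconds", 0.2)
collector.record("tool_latency_seconds", 0.4)
summary = collector.get_metrics_summary()
```

Holding the lock while computing the summary keeps the point lists from being mutated mid-iteration, at the cost of briefly blocking writers.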


MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/CodeHalwell/gradio-mcp-agent-hack'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.