detect_patterns

Identify patterns, communities, and anomalies in graph data by running multiple analyses. Returns a combined report covering centrality metrics, shortest paths, detected communities, and flagged anomalies.

Instructions

Identify patterns, communities, and anomalies within graphs. Runs all supported analyses and returns a combined report.

Args:
    graph_id: ID of the graph to analyze
    ctx: MCP context for progress reporting

Returns:
    Dictionary with results from all analyses that succeeded. Keys may include:
    - degree_centrality
    - betweenness_centrality
    - closeness_centrality
    - communities (if community detection is available)
    - shortest_path (if path finding is possible)
    - path_length
    - anomalies (if anomaly detection is available)
    - errors (dict of analysis_type -> error message)

Input Schema

Name        Required    Description                           Default
graph_id    Yes         ID of the graph to analyze
ctx         No          MCP context for progress reporting    None
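
For reference, only graph_id needs to be supplied by the caller; the server injects ctx itself. Below is a minimal sketch of invoking the tool with the MCP Python SDK's stdio client; the server launch command and the graph ID are placeholders, not values from this repository.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main() -> None:
        # Hypothetical launch command; point this at your graphistry-mcp entry point.
        params = StdioServerParameters(command="python", args=["run_graphistry_mcp.py"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # Only graph_id goes over the wire; ctx is supplied server-side.
                result = await session.call_tool(
                    "detect_patterns", arguments={"graph_id": "graph_123"}
                )
                print(result.content)

    asyncio.run(main())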

Implementation Reference

  • The main handler function for the 'detect_patterns' tool. It is decorated with @mcp.tool(), which registers it as an MCP tool. The function analyzes a graph by computing centrality measures (degree, betweenness, closeness), detecting communities with the Louvain algorithm, finding a shortest path between the first two nodes, and flagging basic anomalies (degree-1 nodes). Results are returned as a dictionary; analyses that fail are reported under an errors key instead of aborting the whole call.
    @mcp.tool()
    async def detect_patterns(graph_id: str, ctx: Optional[Context] = None) -> Dict[str, Any]:
        """
        Identify patterns, communities, and anomalies within graphs.

        Runs all supported analyses and returns a combined report.

        Args:
            graph_id: ID of the graph to analyze
            ctx: MCP context for progress reporting

        Returns:
            Dictionary with results from all analyses that succeeded. Keys may include:
            - degree_centrality
            - betweenness_centrality
            - closeness_centrality
            - communities (if community detection is available)
            - shortest_path (if path finding is possible)
            - path_length
            - anomalies (if anomaly detection is available)
            - errors (dict of analysis_type -> error message)
        """
        try:
            if graph_id not in graph_cache:
                raise ValueError(f"Graph not found: {graph_id}")
            if ctx:
                await ctx.info("Starting pattern detection (all analyses)...")

            graph_data = graph_cache[graph_id]
            nx_graph = graph_data["nx_graph"]
            edges_df = graph_data["edges_df"]
            source = graph_data["source"]
            destination = graph_data["destination"]

            # Convert to NetworkX graph if needed
            if nx_graph is None and edges_df is not None:
                nx_graph = nx.from_pandas_edgelist(edges_df, source=source, target=destination)
            if nx_graph is None:
                raise ValueError("Graph data not available for analysis")

            results = {}
            errors = {}

            # Centrality
            try:
                results["degree_centrality"] = nx.degree_centrality(nx_graph)
                results["betweenness_centrality"] = nx.betweenness_centrality(nx_graph)
                results["closeness_centrality"] = nx.closeness_centrality(nx_graph)
            except Exception as e:
                errors["centrality"] = str(e)

            # Community detection
            try:
                import community as community_louvain
                partition = community_louvain.best_partition(nx_graph)
                results["communities"] = partition
            except Exception as e:
                errors["community_detection"] = str(e)

            # Path finding (try between first two nodes if possible)
            try:
                nodes = list(nx_graph.nodes())
                if len(nodes) >= 2:
                    path = nx.shortest_path(nx_graph, nodes[0], nodes[1])
                    results["shortest_path"] = path
                    results["path_length"] = len(path) - 1
            except Exception as e:
                errors["path_finding"] = str(e)

            # Anomaly detection (placeholder)
            try:
                # Example: nodes with degree 1 as "anomalies"
                anomalies = [n for n, d in nx_graph.degree() if d == 1]
                results["anomalies"] = anomalies
            except Exception as e:
                errors["anomaly_detection"] = str(e)

            if errors:
                results["errors"] = errors
            if ctx:
                await ctx.info("Pattern detection complete!")
            return results
        except Exception as e:
            logger.error(f"Error in detect_patterns: {e}")
            raise
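
    The communities key depends on the optional python-louvain package (imported above as community); if that import fails, the error is recorded and community results are skipped. Below is a hedged sketch of an equivalent partition helper that falls back to NetworkX's own Louvain implementation; the helper name is hypothetical and the fallback assumes a recent NetworkX release.

    import networkx as nx

    def louvain_partition(nx_graph: nx.Graph) -> dict:
        """Return a node -> community-id mapping like community_louvain.best_partition."""
        try:
            import community as community_louvain  # python-louvain package
            return community_louvain.best_partition(nx_graph)
        except ImportError:
            # louvain_communities ships with recent NetworkX releases (2.8+).
            communities = nx.community.louvain_communities(nx_graph)
            return {node: idx for idx, comm in enumerate(communities) for node in comm}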
  • The @mcp.tool() decorator registers the detect_patterns function as an MCP tool named 'detect_patterns'.
    @mcp.tool()
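
    A minimal sketch of the same registration pattern in isolation, using FastMCP from the MCP Python SDK; the server name and toy tool here are hypothetical. FastMCP derives the tool name from the function name, the description from the docstring, and the input schema from the signature's type hints.

    from typing import Optional

    from mcp.server.fastmcp import Context, FastMCP

    mcp = FastMCP("pattern-demo")  # hypothetical server name

    @mcp.tool()
    async def echo_graph_id(graph_id: str, ctx: Optional[Context] = None) -> dict:
        """Toy tool registered the same way as detect_patterns."""
        if ctx:
            # Progress/info messages are streamed back to the MCP client.
            await ctx.info(f"Received graph_id={graph_id}")
        return {"graph_id": graph_id}

    if __name__ == "__main__":
        mcp.run()  # serves the registered tools over stdio by default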
  • The docstring documents the input parameters (graph_id: str, ctx: Optional[Context]) and the shape of the returned Dict[str, Any], whose result keys are listed above; a sketch of consuming that report follows this list.
    """ Identify patterns, communities, and anomalies within graphs. Runs all supported analyses and returns a combined report. Args: graph_id: ID of the graph to analyze ctx: MCP context for progress reporting Returns: Dictionary with results from all analyses that succeeded. Keys may include: - degree_centrality - betweenness_centrality - closeness_centrality - communities (if community detection is available) - shortest_path (if path finding is possible) - path_length - anomalies (if anomaly detection is available) - errors (dict of analysis_type -> error message) """

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/graphistry/graphistry-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.