
MCP Tabular Data Analysis Server

by K02D

analyze_time_series

Analyze time series data from CSV or SQLite files to detect trends and seasonality patterns and to generate moving averages and forecasts.

Instructions

Perform time series analysis including trend detection, seasonality, and statistics.

Args:
  file_path: Path to CSV or SQLite file
  date_column: Name of the date/datetime column
  value_column: Name of the numeric column to analyze
  freq: Frequency for resampling - 'D' (daily), 'W' (weekly), 'M' (monthly), 'Q' (quarterly), 'Y' (yearly)
  include_forecast: If True, include simple moving average forecast

Returns:
  Dictionary containing:
  - trend: Overall trend direction and statistics
  - statistics: Time series statistics
  - moving_averages: 7, 30, 90 period moving averages
  - seasonality: Day of week / month patterns
  - forecast: Simple forecast if requested
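
Any tabular file with a parseable date column and a numeric value column will do as input. Below is a minimal sketch of producing such a CSV with pandas; the file name and column names (sales.csv, order_date, revenue) are illustrative, not part of the server.

# Illustrative only: build a small CSV with the kind of columns this tool expects,
# i.e. a date/datetime column and a numeric value column. Names are examples.
import pandas as pd

df = pd.DataFrame({
    "order_date": pd.date_range("2024-01-01", periods=120, freq="D"),
    "revenue": (pd.Series(range(120)) * 3.5 + 100).round(2),
})
df.to_csv("sales.csv", index=False)  # pass this path as file_path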

Input Schema

Name             | Required | Description                                                        | Default
---------------- | -------- | ------------------------------------------------------------------ | -------
file_path        | Yes      | Path to CSV or SQLite file                                          |
date_column      | Yes      | Name of the date/datetime column                                    |
value_column     | Yes      | Name of the numeric column to analyze                               |
freq             | No       | Resampling frequency: 'D', 'W', 'M', 'Q', or 'Y'                    | D
include_forecast | No       | If True, include a simple moving average forecast                   |
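
For context, here is a hedged sketch of how an MCP client might invoke this tool over stdio using the official MCP Python SDK. The launch command ("python server.py"), the file path, and the column names are assumptions for illustration, not part of this server's documentation.

# Illustrative client-side call; assumes the server can be started with
# "python server.py" (hypothetical entry point) and the MCP Python SDK is installed.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    params = StdioServerParameters(command="python", args=["server.py"])  # assumed launch command
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "analyze_time_series",
                arguments={
                    "file_path": "sales.csv",      # example file
                    "date_column": "order_date",   # example column
                    "value_column": "revenue",     # example column
                    "freq": "W",                   # weekly resampling
                    "include_forecast": True,
                },
            )
            print(result.content)  # tool output as MCP content blocks

asyncio.run(main())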

Implementation Reference

  • The main handler function for the 'analyze_time_series' tool, decorated with @mcp.tool() for automatic registration in the FastMCP server. It performs comprehensive time series analysis including trend detection via linear regression, seasonality patterns, moving averages, resampling, statistics, and optional simple forecasting.
    @mcp.tool()
    def analyze_time_series(
        file_path: str,
        date_column: str,
        value_column: str,
        freq: str = "D",
        include_forecast: bool = False,
    ) -> dict[str, Any]:
        """
        Perform time series analysis including trend detection, seasonality, and statistics.

        Args:
            file_path: Path to CSV or SQLite file
            date_column: Name of the date/datetime column
            value_column: Name of the numeric column to analyze
            freq: Frequency for resampling - 'D' (daily), 'W' (weekly), 'M' (monthly),
                  'Q' (quarterly), 'Y' (yearly)
            include_forecast: If True, include simple moving average forecast

        Returns:
            Dictionary containing:
            - trend: Overall trend direction and statistics
            - statistics: Time series statistics
            - moving_averages: 7, 30, 90 period moving averages
            - seasonality: Day of week / month patterns
            - forecast: Simple forecast if requested
        """
        df = _load_data(file_path)

        if date_column not in df.columns:
            raise ValueError(f"Date column '{date_column}' not found. Available: {df.columns.tolist()}")
        if value_column not in df.columns:
            raise ValueError(f"Value column '{value_column}' not found")

        # Parse dates
        df[date_column] = pd.to_datetime(df[date_column], errors='coerce')
        df = df.dropna(subset=[date_column, value_column])
        df = df.sort_values(date_column)

        # Create time series
        ts = df.set_index(date_column)[value_column]

        if len(ts) < 3:
            raise ValueError("Need at least 3 data points for time series analysis")

        # Basic statistics
        date_range = {
            "start": str(ts.index.min().date()),
            "end": str(ts.index.max().date()),
            "periods": len(ts),
            "span_days": (ts.index.max() - ts.index.min()).days,
        }

        # Trend analysis using linear regression
        x = np.arange(len(ts))
        slope, intercept, r_value, p_value, std_err = stats.linregress(x, ts.values)

        trend_direction = "increasing" if slope > 0 else "decreasing" if slope < 0 else "flat"
        trend_strength = abs(r_value)

        trend = {
            "direction": trend_direction,
            "slope": round(float(slope), 4),
            "r_squared": round(float(r_value ** 2), 4),
            "strength": "strong" if trend_strength > 0.7 else "moderate" if trend_strength > 0.4 else "weak",
            "pct_change_total": round(float((ts.iloc[-1] - ts.iloc[0]) / ts.iloc[0] * 100), 2) if ts.iloc[0] != 0 else None,
        }

        # Calculate moving averages
        ma_result = {}
        for window in [7, 30, 90]:
            if len(ts) >= window:
                ma = ts.rolling(window=window).mean()
                ma_result[f"ma_{window}"] = {
                    "current": round(float(ma.iloc[-1]), 2) if not pd.isna(ma.iloc[-1]) else None,
                    "min": round(float(ma.min()), 2),
                    "max": round(float(ma.max()), 2),
                }

        # Resample by frequency
        resampled = ts.resample(freq).agg(['mean', 'sum', 'count', 'min', 'max'])

        # Seasonality analysis (if enough data)
        seasonality = {}
        if len(df) >= 7:
            df['dow'] = df[date_column].dt.day_name()
            dow_stats = df.groupby('dow')[value_column].mean().reindex([
                'Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday', 'Saturday', 'Sunday'
            ])
            seasonality['day_of_week'] = dow_stats.round(2).to_dict()

            best_day = dow_stats.idxmax()
            worst_day = dow_stats.idxmin()
            seasonality['insights'] = {
                'best_day': best_day,
                'worst_day': worst_day,
                'variation_pct': round(float((dow_stats.max() - dow_stats.min()) / dow_stats.mean() * 100), 1) if dow_stats.mean() != 0 else 0,
            }

        if len(df) >= 30:
            df['month'] = df[date_column].dt.month_name()
            month_stats = df.groupby('month')[value_column].mean()
            seasonality['monthly'] = month_stats.round(2).to_dict()

        # Statistics
        statistics = {
            "mean": round(float(ts.mean()), 2),
            "median": round(float(ts.median()), 2),
            "std": round(float(ts.std()), 2),
            "min": round(float(ts.min()), 2),
            "max": round(float(ts.max()), 2),
            "volatility": round(float(ts.std() / ts.mean() * 100), 2) if ts.mean() != 0 else 0,
        }

        # Simple forecast using moving average
        forecast = None
        if include_forecast and len(ts) >= 7:
            forecast_window = min(7, len(ts))
            forecast_value = ts.tail(forecast_window).mean()
            forecast = {
                "method": f"{forecast_window}-period moving average",
                "next_period_estimate": round(float(forecast_value), 2),
                "confidence_note": "Simple estimate based on recent average",
            }

        # Recent data sample
        recent_data = ts.tail(10).reset_index()
        recent_data.columns = ['date', 'value']
        recent_data['date'] = recent_data['date'].dt.strftime('%Y-%m-%d')

        return {
            "date_column": date_column,
            "value_column": value_column,
            "date_range": date_range,
            "trend": trend,
            "statistics": statistics,
            "moving_averages": ma_result,
            "seasonality": seasonality,
            "resampled_by": freq,
            "resampled_periods": len(resampled),
            "forecast": forecast,
            "recent_data": recent_data.to_dict(orient='records'),
        }
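
The trend block above boils down to an ordinary least-squares fit of the values against their positional index. A standalone sketch of that step on synthetic data (the synthetic series is an assumption, not server data) looks like this:

# Standalone illustration of the linregress-based trend logic used above.
import numpy as np
import pandas as pd
from scipy import stats

ts = pd.Series(
    100 + 2.0 * np.arange(60) + np.random.default_rng(0).normal(0, 5, 60),
    index=pd.date_range("2024-01-01", periods=60, freq="D"),
)

x = np.arange(len(ts))
slope, intercept, r_value, p_value, std_err = stats.linregress(x, ts.values)

direction = "increasing" if slope > 0 else "decreasing" if slope < 0 else "flat"
strength = "strong" if abs(r_value) > 0.7 else "moderate" if abs(r_value) > 0.4 else "weak"
print({"direction": direction, "slope": round(slope, 4), "r_squared": round(r_value ** 2, 4), "strength": strength})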

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/K02D/mcp-tabular'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.