
get_weather_time_series

Retrieve historical weather data for a specific station to analyze trends over hours or days, with configurable time intervals and duration up to one week.

Instructions

Get time series weather data for a station.

Useful for analyzing weather trends over hours or days.

Args:
    station_code: Station code (e.g., '44132' for Tokyo)
    hours: Number of hours to fetch (default: 24, max: 168 for ~1 week)
    interval_minutes: Interval between data points in minutes (10, 30, or 60)

Returns: Time series weather data

Input Schema

Name              Required  Description                                               Default
station_code      Yes       Station code (e.g., '44132' for Tokyo)                    —
hours             No        Number of hours to fetch (max: 168 for ~1 week)           24
interval_minutes  No        Interval between data points in minutes (10, 30, or 60)   60
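For example, the defaults (hours=24, interval_minutes=60) yield 24 data points, and one week at 10-minute resolution yields 1008. A minimal sketch of the point-count arithmetic and input clamping used by the implementation (the function name here is illustrative):

```python
def num_data_points(hours: int, interval_minutes: int) -> int:
    """Mirror the server's arithmetic: hours are clamped to 168 (one week),
    and an interval other than 10, 30, or 60 minutes falls back to 60."""
    if interval_minutes not in (10, 30, 60):
        interval_minutes = 60
    hours = min(hours, 168)
    return (hours * 60) // interval_minutes

print(num_data_points(24, 60))    # default request -> 24 points
print(num_data_points(168, 10))   # one week at 10-minute resolution -> 1008 points
```

Note that each data point costs one HTTP request to the JMA API, so fine intervals over long windows are substantially slower.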

Implementation Reference

  • The handler function for the 'get_weather_time_series' tool. It is decorated with @mcp.tool() to register it with FastMCP. It validates inputs, fetches data via the helper in weather.py, attaches station metadata, and returns the result.
    @mcp.tool()
    async def get_weather_time_series(
        station_code: str,
        hours: int = 24,
        interval_minutes: int = 60,
    ) -> dict:
        """Get time series weather data for a station.
    
        Useful for analyzing weather trends over hours or days.
    
        Args:
            station_code: Station code (e.g., '44132' for Tokyo)
            hours: Number of hours to fetch (default: 24, max: 168 for ~1 week)
            interval_minutes: Interval between data points in minutes (10, 30, or 60)
    
        Returns:
            Time series weather data
        """
        if interval_minutes not in [10, 30, 60]:
            return {"error": f"Invalid interval: {interval_minutes}. Must be 10, 30, or 60."}
    
        time_series_data = await fetch_time_series_data(
            station_code,
            hours=hours,
            interval_minutes=interval_minutes,
        )
    
        station_info = get_station(station_code)
        if station_info:
            time_series_data["station_info"] = station_info
    
        return time_series_data
  • Core helper that implements the time-series fetch: it loops over past time intervals, queries the JMA API, parses each response with _parse_station_data, collects per-interval errors, and returns the structured time series.
    async def fetch_time_series_data(
        station_code: str,
        hours: int = 24,
        interval_minutes: int = 60
    ) -> dict[str, Any]:
        """
        Fetch time series data for a station.
    
        Args:
            station_code: Station code (e.g., '44132' for Tokyo)
            hours: Number of hours to fetch (default: 24, max recommended: 168 for ~1 week)
            interval_minutes: Interval between data points in minutes (10, 30, or 60)
    
        Returns:
            Dictionary with time series data
        """
        if interval_minutes not in [10, 30, 60]:
            interval_minutes = 60
    
        # Limit hours to prevent too many requests
        hours = min(hours, 168)  # Max 1 week
    
        start_time = get_latest_data_time()
        data_points = []
        errors = []
    
        # Calculate number of points
        num_points = (hours * 60) // interval_minutes
    
        async with httpx.AsyncClient() as client:
            for i in range(num_points):
                target_time = start_time - timedelta(minutes=i * interval_minutes)
                time_str = format_time_for_api(target_time)
                url = f'https://www.jma.go.jp/bosai/amedas/data/map/{time_str}.json'
    
                try:
                    response = await client.get(url, timeout=10.0)
                    response.raise_for_status()
                    raw_data = response.json()
    
                    station_data = raw_data.get(station_code)
                    if station_data:
                        parsed = _parse_station_data(station_code, station_data)
                        parsed["observation_time"] = target_time.isoformat()
                        parsed["observation_time_jst"] = target_time.strftime('%Y-%m-%d %H:%M')
                        data_points.append(parsed)
                except httpx.HTTPStatusError as e:
                    if e.response.status_code == 404:
                        errors.append({
                            "time": target_time.isoformat(),
                            "error": "Data not available (past retention period)"
                        })
                        break  # Stop if we hit the retention limit
                    else:
                        errors.append({
                            "time": target_time.isoformat(),
                            "error": str(e)
                        })
                except Exception as e:
                    errors.append({
                        "time": target_time.isoformat(),
                        "error": str(e)
                    })
    
        # Reverse to chronological order
        data_points.reverse()
    
        return {
            "station_code": station_code,
            "requested_hours": hours,
            "interval_minutes": interval_minutes,
            "data_points": len(data_points),
            "time_series": data_points,
            "errors": errors if errors else None
        }
  • Supporting helper that parses raw JSON from the JMA API into structured weather observations for each time point in the series.
    def _parse_station_data(code: str, data: dict[str, Any]) -> dict[str, Any]:
        """Parse raw station data into structured format."""
        station_data: dict[str, Any] = {"code": code}
    
        # Temperature (℃)
        if "temp" in data:
            station_data["temperature"] = {
                "value": parse_observation_value(data["temp"]),
                "unit": "℃"
            }
    
        # Humidity (%)
        if "humidity" in data:
            station_data["humidity"] = {
                "value": parse_observation_value(data["humidity"]),
                "unit": "%"
            }
    
        # Pressure (hPa)
        if "pressure" in data:
            station_data["pressure"] = {
                "value": parse_observation_value(data["pressure"]),
                "unit": "hPa"
            }
    
        # Sea level pressure (hPa)
        if "normalPressure" in data:
            station_data["sea_level_pressure"] = {
                "value": parse_observation_value(data["normalPressure"]),
                "unit": "hPa"
            }
    
        # Wind
        if "wind" in data:
            wind_speed = parse_observation_value(data["wind"])
            wind_dir_code = parse_observation_value(data.get("windDirection", [None, None]))
            wind_dir = None
            wind_dir_ja = None
            if wind_dir_code is not None:
                wind_dir_code = int(wind_dir_code)
                wind_dir = WIND_DIRECTIONS.get(wind_dir_code)
                wind_dir_ja = WIND_DIRECTIONS_JA.get(wind_dir_code)
    
            station_data["wind"] = {
                "speed": wind_speed,
                "speed_unit": "m/s",
                "direction": wind_dir,
                "direction_ja": wind_dir_ja,
                "direction_code": wind_dir_code
            }
    
        # Precipitation
        precipitation = {}
        if "precipitation10m" in data:
            precipitation["10min"] = parse_observation_value(data["precipitation10m"])
        if "precipitation1h" in data:
            precipitation["1h"] = parse_observation_value(data["precipitation1h"])
        if "precipitation3h" in data:
            precipitation["3h"] = parse_observation_value(data["precipitation3h"])
        if "precipitation24h" in data:
            precipitation["24h"] = parse_observation_value(data["precipitation24h"])
        if precipitation:
            station_data["precipitation"] = {
                **precipitation,
                "unit": "mm"
            }
    
        # Sunshine
        if "sun1h" in data:
            station_data["sunshine"] = {
                "1h": parse_observation_value(data["sun1h"]),
                "unit": "hours"
            }
    
        # Snow
        snow = {}
        if "snow" in data:
            snow["depth"] = parse_observation_value(data["snow"])
        if "snow1h" in data:
            snow["1h"] = parse_observation_value(data["snow1h"])
        if "snow6h" in data:
            snow["6h"] = parse_observation_value(data["snow6h"])
        if "snow12h" in data:
            snow["12h"] = parse_observation_value(data["snow12h"])
        if "snow24h" in data:
            snow["24h"] = parse_observation_value(data["snow24h"])
        if snow:
            station_data["snow"] = {
                **snow,
                "unit": "cm"
            }
    
        return station_data

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/koizumikento/jma-data-mcp'
