# data_quantile
Calculate quantile values for each column in a dataset using specified interpolation methods. Ideal for data analysis and preprocessing tasks in structured data workflows.
## Instructions
Quantile values for each column
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| input_data_file_path | Yes | Path to the input data file | |
| interpolation | No | Interpolation method: one of `nearest`, `higher`, `lower`, `midpoint`, `linear` | nearest |
| quantile | Yes | Quantile between 0.0 and 1.0 | 0.5 |
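As a sketch of what the `interpolation` parameter controls: NumPy's `quantile` exposes methods with the same names as this tool's interpolation options, so it can illustrate how each method resolves a quantile that falls between two data points. This is an illustrative stand-in, not the tool's own code path, and the sample values are hypothetical:

```python
import numpy as np

# Hypothetical column of values, standing in for one column of the input file
values = [1.0, 2.0, 3.0, 4.0]

# np.quantile's method= names match this tool's interpolation options
results = {
    m: float(np.quantile(values, 0.5, method=m))
    for m in ("nearest", "lower", "higher", "midpoint", "linear")
}
print(results)
```

Here the 0.5 quantile falls between 2.0 and 3.0, so `lower` returns 2.0, `higher` returns 3.0, and `midpoint` and `linear` both return 2.5.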
## Implementation Reference
- **Handler** (`handle_data_quantile`): the async handler that parses input arguments using the schema, computes the quantile on the DataFrame with the specified quantile and interpolation, formats the result as JSON, and returns it as `TextContent`.

  ```python
  async def handle_data_quantile(
      arguments: dict[str, Any],
  ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
      data_quantile_input = DataQuantileInputSchema.from_args(arguments)
      quantile_df = data_quantile_input.df.quantile(
          quantile=data_quantile_input.quantile,
          interpolation=data_quantile_input.interpolation,
      )
      # Convert the DataFrame to a dictionary format
      quantile_dict = {
          "description": f"Quantile values for each column at {arguments['quantile']}",
          "quantile_values": {
              col: str(val) if val is not None else None
              for col, val in zip(quantile_df.columns, quantile_df.row(0))
          },
      }
      return [
          types.TextContent(
              type="text",
              text=json.dumps(quantile_dict),
          )
      ]
  ```
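The handler's JSON payload construction can be sketched standalone with only the standard library; the column names and quantile row below are hypothetical stand-ins for what the DataFrame's `columns` and `row(0)` would yield:

```python
import json

# Hypothetical quantile result: one value per column
columns = ["price", "size"]
row = [12.5, None]
quantile = 0.5

# Mirrors the handler's dict construction: values stringified, nulls preserved
payload = {
    "description": f"Quantile values for each column at {quantile}",
    "quantile_values": {
        col: str(val) if val is not None else None
        for col, val in zip(columns, row)
    },
}
text = json.dumps(payload)
print(text)
```

Stringifying each value keeps the payload uniform regardless of column dtype, while `None` survives as JSON `null` for columns where no quantile could be computed.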
- **Input model** (`DataQuantileInputSchema`): Pydantic model extending `Data` for input validation. Provides `input_schema()` for the MCP Tool schema, plus static constructors to build the model from a file path or raw arguments.

  ```python
  class DataQuantileInputSchema(Data):
      model_config = ConfigDict(
          validate_assignment=True,
          frozen=True,
          extra="forbid",
          arbitrary_types_allowed=True,
      )
      quantile: float = Field(default=0.5, description="Quantile value between 0.0 and 1.0", gt=0.0, lt=1.0)
      interpolation: str = Field(
          default="nearest",
          description="Interpolation method for quantile. One of: 'nearest', 'higher', 'lower', 'midpoint', 'linear'",
      )

      @staticmethod
      def input_schema() -> dict:
          return {
              "type": "object",
              "properties": {
                  "input_data_file_path": {
                      "type": "string",
                      "description": "Path to the input data file",
                  },
                  "quantile": {
                      "type": "number",
                      "description": "Quantile between 0.0 and 1.0",
                      "minimum": 0.0,
                      "maximum": 1.0,
                      "default": 0.5,
                  },
                  "interpolation": {
                      "type": "string",
                      "description": "Interpolation method",
                      "enum": ["nearest", "higher", "lower", "midpoint", "linear"],
                      "default": "nearest",
                  },
              },
              "required": ["input_data_file_path", "quantile"],
          }

      @staticmethod
      def from_schema(
          input_data_file_path: str, quantile: float, interpolation: str = "nearest"
      ) -> "DataQuantileInputSchema":
          data = Data.from_file(input_data_file_path)
          return DataQuantileInputSchema(
              df=data.df,
              quantile=quantile,
              interpolation=interpolation,
          )

      @staticmethod
      def from_args(arguments: dict[str, Any]) -> "DataQuantileInputSchema":
          input_data_file_path = arguments["input_data_file_path"]
          quantile = arguments["quantile"]
          interpolation = arguments.get("interpolation", "nearest")
          return DataQuantileInputSchema.from_schema(
              input_data_file_path=input_data_file_path,
              quantile=quantile,
              interpolation=interpolation,
          )
  ```
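The argument parsing and validation performed by `from_args` and the model's `Field` constraints amount to the following stdlib sketch (the `parse_args` helper is hypothetical, not part of the server; the real code relies on Pydantic):

```python
from typing import Any

def parse_args(arguments: dict[str, Any]) -> dict[str, Any]:
    # Required keys raise KeyError if absent, as from_args does
    parsed = {
        "input_data_file_path": arguments["input_data_file_path"],
        "quantile": arguments["quantile"],
        # Optional key falls back to the schema default
        "interpolation": arguments.get("interpolation", "nearest"),
    }
    # Mirrors the Field(gt=0.0, lt=1.0) constraint: bounds are exclusive
    if not (0.0 < parsed["quantile"] < 1.0):
        raise ValueError("quantile must be strictly between 0.0 and 1.0")
    return parsed
```

Note a subtle mismatch in the real code worth knowing about: the Pydantic field uses exclusive bounds (`gt=0.0, lt=1.0`), while the advertised JSON Schema uses inclusive `minimum`/`maximum`, so a request with `quantile: 1.0` passes the JSON Schema but fails model validation.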
- **Tool registration** (`src/mcp_server_data_wrangler/tools/tools.py:132-136`): registers the `data_quantile` tool in the list returned by `MCPServerDataWrangler.tools()`, specifying name, description, and input schema.

  ```python
  types.Tool(
      name=MCPServerDataWrangler.data_quantile.value[0],
      description=MCPServerDataWrangler.data_quantile.value[1],
      inputSchema=DataQuantileInputSchema.input_schema(),
  ),
  ```

- **Handler mapping** (`src/mcp_server_data_wrangler/tools/tools.py:165`): maps the tool name `data_quantile` to its handler function in the `tool_to_handler()` dictionary.

  ```python
  MCPServerDataWrangler.data_quantile.value[0]: handle_data_quantile,
  ```

- **Enum entry** (`src/mcp_server_data_wrangler/tools/tools.py:53`): defines the tool name and description constant in the `MCPServerDataWrangler` enum.

  ```python
  data_quantile = ("data_quantile", "Quantile values for each column")
  ```
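Taken together, the three registration points form a name-to-handler dispatch table. A minimal self-contained sketch of that pattern (the enum and handler here are simplified stand-ins for the real `MCPServerDataWrangler` enum and async handler):

```python
from enum import Enum

class Tools(Enum):
    # (name, description) tuple, as in the MCPServerDataWrangler enum
    data_quantile = ("data_quantile", "Quantile values for each column")

def handle_data_quantile(arguments: dict) -> str:
    # Placeholder for the real async handler
    return f"quantile={arguments['quantile']}"

# Maps tool name -> handler, as tool_to_handler() does
tool_to_handler = {Tools.data_quantile.value[0]: handle_data_quantile}

result = tool_to_handler["data_quantile"]({"quantile": 0.5})
print(result)  # quantile=0.5
```

Keeping the name in one enum value and indexing it with `value[0]` in both the tool list and the dispatch dict ensures the advertised tool name and the dispatch key can never drift apart.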