data_estimated_size

Calculate the estimated size of an input data file in a specified unit (b, kb, mb, gb, tb) for efficient data management and preprocessing.

Instructions

Estimated size of the input data
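
The df.estimated_size(unit=...) call in the handler below matches Polars' DataFrame.estimated_size API. A minimal sketch of the same computation under that assumption; the file path is illustrative:

    import polars as pl

    # Load an input file and report its estimated in-memory size.
    df = pl.read_csv("data/input.csv")  # hypothetical path
    size_kb = df.estimated_size(unit="kb")  # unit is one of "b", "kb", "mb", "gb", "tb"
    print(f"{size_kb:.2f} kb")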

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| input_data_file_path | No | Path to the input data file | |
| unit | No | Unit for the estimated size | b |
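
For illustration, a call to this tool could pass arguments shaped like the following (the file path is hypothetical):

    # Illustrative arguments payload for the data_estimated_size tool
    arguments = {
        "input_data_file_path": "data/sales.csv",  # hypothetical path
        "unit": "mb",
    }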

Implementation Reference

  • The main handler function that executes the tool: it parses the input arguments, loads the dataframe, computes the estimated size in the specified unit, formats the result as JSON, and returns it as TextContent (see the usage sketch after this list).
    async def handle_data_estimated_size(
        arguments: dict[str, Any],
    ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
        data_estimated_size_input = DataEstimatedSizeInputSchema.from_args(arguments)
        estimated_size = data_estimated_size_input.df.estimated_size(unit=data_estimated_size_input.unit)
        result_dict = {
            "description": "Estimated size of the input data",
            "size": estimated_size,
            "unit": data_estimated_size_input.unit,
        }
        return [
            types.TextContent(
                type="text",
                text=json.dumps(result_dict),
            )
        ]
  • Pydantic model defining the input schema, including static methods for the MCP inputSchema, from_schema (loads the Data from a file), and from_args (parses the tool arguments).
    class DataEstimatedSizeInputSchema(Data):
        model_config = ConfigDict(
            validate_assignment=True,
            frozen=True,
            extra="forbid",
            arbitrary_types_allowed=True,
        )

        unit: str = Field(
            default="b",
            description="Unit for the estimated size. One of: 'b' (bytes), 'kb', 'mb', 'gb', 'tb'",
        )

        @staticmethod
        def input_schema() -> dict:
            return {
                "type": "object",
                "properties": {
                    "input_data_file_path": {
                        "type": "string",
                        "description": "Path to the input data file",
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["b", "kb", "mb", "gb", "tb"],
                        "description": "Unit for the estimated size",
                        "default": "b",
                    },
                },
            }

        @staticmethod
        def from_schema(
            input_data_file_path: str,
            unit: str = "b",
        ) -> "DataEstimatedSizeInputSchema":
            data = Data.from_file(input_data_file_path)
            return DataEstimatedSizeInputSchema(
                df=data.df,
                unit=unit,
            )

        @staticmethod
        def from_args(arguments: dict[str, Any]) -> "DataEstimatedSizeInputSchema":
            input_data_file_path = arguments["input_data_file_path"]
            unit = arguments.get("unit", "b")
            return DataEstimatedSizeInputSchema.from_schema(
                input_data_file_path=input_data_file_path,
                unit=unit,
            )
  • Registers the tool schema (name, description, inputSchema) in the MCPServerDataWrangler.tools() method, which returns the list of all tools.
    types.Tool(
        name=MCPServerDataWrangler.data_estimated_size.value[0],
        description=MCPServerDataWrangler.data_estimated_size.value[1],
        inputSchema=DataEstimatedSizeInputSchema.input_schema(),
    ),
  • Maps the tool name to its handler function in the tool_to_handler() dictionary used for dispatching (see the dispatch sketch after this list).
    MCPServerDataWrangler.data_estimated_size.value[0]: handle_data_estimated_size,
  • Enum member in MCPServerDataWrangler defining the canonical tool name and description.
    data_estimated_size = ("data_estimated_size", "Estimated size of the input data")
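
Usage sketch: a hypothetical direct call to handle_data_estimated_size outside the MCP transport, assuming the imports shown above; the file path and printed result are illustrative.

    import asyncio
    import json

    # Invoke the async handler directly and decode the JSON text content it returns.
    contents = asyncio.run(
        handle_data_estimated_size(
            {"input_data_file_path": "data/sales.csv", "unit": "kb"}
        )
    )
    print(json.loads(contents[0].text))
    # e.g. {"description": "Estimated size of the input data", "size": 123.4, "unit": "kb"}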
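
Dispatch sketch: one plausible way the tool_to_handler() map could be wired into a call_tool handler using the low-level Server API of the Python MCP SDK; the exact wiring in this repository may differ.

    import mcp.types as types
    from mcp.server import Server

    server = Server("mcp-server-data-wrangler")

    @server.call_tool()
    async def call_tool(
        name: str, arguments: dict
    ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
        # Look up the registered handler by tool name and delegate to it.
        handler = tool_to_handler().get(name)
        if handler is None:
            raise ValueError(f"Unknown tool: {name}")
        return await handler(arguments)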

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/shibuiwilliam/mcp-server-data-wrangler'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.