
json_dry_run

Analyze JSON data size breakdown by specifying fields or nested structures. Input a file path and shape object to get byte-level size details for each field, mirroring the shape in the output.

Instructions

Analyze the size breakdown of JSON data using a shape object to determine granularity. Returns size information in bytes for each specified field, mirroring the shape structure but with size values instead of data.

Input Schema

  • filePath (required, no default): Path to the JSON file (local) or HTTP/HTTPS URL to analyze.
  • shape (optional, no default): Shape object (formatted as valid JSON) defining what to analyze for size. Use 'true' to get the total size of a field, or a nested object for a detailed breakdown.

    Examples:
    1. Size of a single field: {"name": true}
    2. Sizes of multiple fields: {"name": true, "email": true, "age": true}
    3. Detailed breakdown: {"user": {"name": true, "profile": {"bio": true}}}
    4. Array analysis: {"posts": {"title": true, "content": true}} (returns the total size of all matching elements)
    5. Complex analysis:
       {
         "metadata": true,
         "users": {
           "name": true,
           "settings": {
             "theme": true
           }
         },
         "posts": {
           "title": true,
           "tags": true
         }
       }

    Notes:
    - Returns the size in bytes for each specified field
    - The output structure mirrors the shape, with size values in place of data
    - Array analysis returns the total size of all matching elements
    - Use the json_schema tool first to understand the JSON structure
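As a concrete illustration of how a shape maps to a size breakdown, here is a minimal TypeScript sketch. It assumes "size in bytes" means the UTF-8 byte length of a field's JSON serialization (the measure the handler on this page uses for the filtered size); the data values are invented for the example.

```typescript
// Hypothetical data and shape, invented for illustration.
const data: Record<string, unknown> = { name: "Ada", email: "ada@example.com", age: 36 };
const shape: Record<string, true> = { name: true, email: true };

// Mirror the shape, replacing each selected field with its byte size.
const breakdown: Record<string, number> = {};
for (const key of Object.keys(shape)) {
  if (key in data) {
    // UTF-8 byte length of the field's JSON serialization
    breakdown[key] = new TextEncoder().encode(JSON.stringify(data[key])).length;
  }
}

console.log(breakdown); // { name: 5, email: 17 } — age is absent from the shape, so it is omitted
```

Fields not named in the shape (here, `age`) simply do not appear in the output.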

Input Schema (JSON Schema)

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "additionalProperties": false,
  "properties": {
    "filePath": {
      "description": "Path to the JSON file (local) or HTTP/HTTPS URL to analyze",
      "type": "string"
    },
    "shape": {
      "description": "Shape object (formatted as valid JSON) defining what to analyze for size. Use 'true' to get total size of a field, or nested objects for detailed breakdown.\n\nExamples:\n1. Get size of single field: {\"name\": true}\n2. Get sizes of multiple fields: {\"name\": true, \"email\": true, \"age\": true}\n3. Get detailed breakdown: {\"user\": {\"name\": true, \"profile\": {\"bio\": true}}}\n4. Analyze arrays: {\"posts\": {\"title\": true, \"content\": true}} - gets total size of all matching elements\n5. Complex analysis: {\n \"metadata\": true,\n \"users\": {\n \"name\": true,\n \"settings\": {\n \"theme\": true\n }\n },\n \"posts\": {\n \"title\": true,\n \"tags\": true\n }\n}\n\nNote: \n- Returns size in bytes for each specified field\n- Output structure mirrors the shape but with size values\n- Array analysis returns total size of all matching elements\n- Use json_schema tool to understand the JSON structure first"
    }
  },
  "required": ["filePath"],
  "type": "object"
}
```

Implementation Reference

  • The core handler function for the json_dry_run tool. Ingests JSON from a file or URL, parses it, computes the size breakdown for the given shape along with the total and filtered sizes, and recommends a chunk count.
```typescript
async function processJsonDryRun(input: JsonDryRunInput): Promise<JsonDryRunResult> {
  try {
    // Use strategy pattern to ingest JSON content
    const ingestionResult = await jsonIngestionContext.ingest(input.filePath);
    if (!ingestionResult.success) {
      // Map strategy errors to existing error format for backward compatibility
      return { success: false, error: ingestionResult.error };
    }

    // Parse JSON
    let parsedData: any;
    try {
      parsedData = JSON.parse(ingestionResult.content);
    } catch (error) {
      return {
        success: false,
        error: { type: 'invalid_json', message: 'Invalid JSON format in content', details: error }
      };
    }

    // Calculate size breakdown based on shape
    try {
      const sizeBreakdown = calculateSizeWithShape(parsedData, input.shape);

      // Calculate total size of the entire parsed data
      const totalSize = calculateValueSize(parsedData);

      // Calculate filtered data size
      const filteredData = extractWithShape(parsedData, input.shape);
      const filteredJson = JSON.stringify(filteredData, null, 2);
      const filteredSize = new TextEncoder().encode(filteredJson).length;

      // Calculate recommended chunks (400KB threshold)
      const CHUNK_THRESHOLD = 400 * 1024;
      const recommendedChunks = Math.ceil(filteredSize / CHUNK_THRESHOLD);

      return { success: true, sizeBreakdown, totalSize, filteredSize, recommendedChunks };
    } catch (error) {
      return {
        success: false,
        error: { type: 'validation_error', message: 'Failed to calculate size breakdown', details: error }
      };
    }
  } catch (error) {
    return {
      success: false,
      error: { type: 'validation_error', message: 'Unexpected error during processing', details: error }
    };
  }
}
```
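The chunking recommendation at the end of the handler is simple arithmetic: the filtered output is serialized with 2-space indentation, measured in UTF-8 bytes, and divided by a 400 KB threshold. Extracted as a standalone sketch:

```typescript
// Chunk recommendation as used by processJsonDryRun: serialize the
// filtered data with 2-space indentation, measure UTF-8 bytes, and
// divide by the 400 KB threshold, rounding up.
const CHUNK_THRESHOLD = 400 * 1024; // 409,600 bytes

function recommendChunks(filteredData: unknown): number {
  const filteredJson = JSON.stringify(filteredData, null, 2);
  const filteredSize = new TextEncoder().encode(filteredJson).length;
  return Math.ceil(filteredSize / CHUNK_THRESHOLD);
}

// A payload under 400 KB fits in a single chunk:
console.log(recommendChunks({ note: "small payload" })); // 1
// A ~500 KB string serializes to just over the threshold, so two chunks:
console.log(recommendChunks("x".repeat(500 * 1024))); // 2
```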
  • Zod input schema for json_dry_run tool, validating filePath and shape parameters.
```typescript
const JsonDryRunInputSchema = z.object({
  filePath: z.string()
    .min(1, "File path or HTTP/HTTPS URL is required")
    .refine(
      (val) =>
        val.length > 0 &&
        (val.startsWith('./') ||
          val.startsWith('/') ||
          val.startsWith('http://') ||
          val.startsWith('https://') ||
          !val.includes('/')),
      "Must be a valid file path or HTTP/HTTPS URL"
    ),
  shape: z.any().describe("Shape object defining what to analyze")
});
```
  • TypeScript type definition for the JsonDryRunResult, defining success and error cases with size breakdown information.
    type JsonDryRunResult = { readonly success: true; readonly sizeBreakdown: any; readonly totalSize: number; readonly filteredSize: number; readonly recommendedChunks: number; } | { readonly success: false; readonly error: JsonSchemaError; };
  • src/index.ts:683-784 (registration)
    MCP server.tool registration for 'json_dry_run', including description, inline input schema, and async executor that parses shape if needed, validates with JsonDryRunInputSchema, calls processJsonDryRun, and formats the response.
```typescript
server.tool(
  "json_dry_run",
  "Analyze the size breakdown of JSON data using a shape object to determine granularity. Returns size information in bytes for each specified field, mirroring the shape structure but with size values instead of data.",
  {
    filePath: z.string().describe("Path to the JSON file (local) or HTTP/HTTPS URL to analyze"),
    shape: z.unknown().describe(`Shape object (formatted as valid JSON) defining what to analyze for size. Use 'true' to get total size of a field, or nested objects for detailed breakdown.

Examples:
1. Get size of single field: {"name": true}
2. Get sizes of multiple fields: {"name": true, "email": true, "age": true}
3. Get detailed breakdown: {"user": {"name": true, "profile": {"bio": true}}}
4. Analyze arrays: {"posts": {"title": true, "content": true}} - gets total size of all matching elements
5. Complex analysis: {
  "metadata": true,
  "users": {
    "name": true,
    "settings": {
      "theme": true
    }
  },
  "posts": {
    "title": true,
    "tags": true
  }
}

Note:
- Returns size in bytes for each specified field
- Output structure mirrors the shape but with size values
- Array analysis returns total size of all matching elements
- Use json_schema tool to understand the JSON structure first`)
  },
  async ({ filePath, shape }) => {
    try {
      // If shape is a string, parse it as JSON
      let parsedShape = shape;
      if (typeof shape === 'string') {
        try {
          parsedShape = JSON.parse(shape);
        } catch (e) {
          return {
            content: [
              {
                type: "text",
                text: `Error: Invalid JSON in shape parameter: ${e instanceof Error ? e.message : String(e)}`
              }
            ],
            isError: true
          };
        }
      }

      const validatedInput = JsonDryRunInputSchema.parse({ filePath, shape: parsedShape });
      const result = await processJsonDryRun(validatedInput);

      if (result.success) {
        // Format the response with total size and breakdown
        const formatSize = (bytes: number): string => {
          if (bytes < 1024) return `${bytes} bytes`;
          if (bytes < 1024 * 1024) return `${(bytes / 1024).toFixed(1)} KB`;
          return `${(bytes / (1024 * 1024)).toFixed(1)} MB`;
        };

        const header = `Total file size: ${formatSize(result.totalSize)} (${result.totalSize} bytes)\nFiltered size: ${formatSize(result.filteredSize)} (${result.filteredSize} bytes)\nRecommended chunks: ${result.recommendedChunks}\n\nSize breakdown:\n`;
        const breakdown = JSON.stringify(result.sizeBreakdown, null, 2);

        return { content: [{ type: "text", text: header + breakdown }] };
      } else {
        return {
          content: [{ type: "text", text: `Error: ${result.error.message}` }],
          isError: true
        };
      }
    } catch (error) {
      return {
        content: [
          {
            type: "text",
            text: `Validation error: ${error instanceof Error ? error.message : String(error)}`
          }
        ],
        isError: true
      };
    }
  }
);
```
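The formatSize helper embedded in the executor above can be lifted out and exercised on its own. It renders byte counts below 1 KB verbatim, and otherwise rounds to one decimal place in KB or MB:

```typescript
// The formatSize helper from the registration above, extracted as-is.
const formatSize = (bytes: number): string => {
  if (bytes < 1024) return `${bytes} bytes`;
  if (bytes < 1024 * 1024) return `${(bytes / 1024).toFixed(1)} KB`;
  return `${(bytes / (1024 * 1024)).toFixed(1)} MB`;
};

console.log(formatSize(512));             // "512 bytes"
console.log(formatSize(409600));          // "400.0 KB" — the chunking threshold
console.log(formatSize(3 * 1024 * 1024)); // "3.0 MB"
```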
  • Key helper function that recursively traverses the JSON data according to the shape and computes a size breakdown mirroring the shape structure. Handles arrays by summing sizes across elements.
```typescript
function calculateSizeWithShape(data: any, shape: Shape): any {
  if (Array.isArray(data)) {
    // For arrays, apply the shape to each element and sum the results
    let totalSizeBreakdown: any = {};
    let isFirstElement = true;
    for (const item of data) {
      const itemSizeBreakdown = calculateSizeWithShape(item, shape);
      if (isFirstElement) {
        totalSizeBreakdown = itemSizeBreakdown;
        isFirstElement = false;
      } else {
        // Sum up the sizes
        totalSizeBreakdown = addSizeBreakdowns(totalSizeBreakdown, itemSizeBreakdown);
      }
    }
    return totalSizeBreakdown;
  }

  const result: any = {};
  for (const key in shape) {
    const rule = shape[key];
    if (data[key] === undefined) {
      // If the key doesn't exist in data, skip it
      continue;
    }
    if (rule === true) {
      // Calculate total size of this key's value
      result[key] = calculateValueSize(data[key]);
    } else if (typeof rule === 'object') {
      // Recursively break down this key's value
      result[key] = calculateSizeWithShape(data[key], rule);
    }
  }
  return result;
}
```
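calculateSizeWithShape leans on two helpers that are not reproduced on this page: calculateValueSize and addSizeBreakdowns. The following are plausible sketches of what they might look like, assuming sizes are UTF-8 byte lengths of JSON serializations and breakdowns merge by summing numeric leaves key by key; they are illustrations, not the project's actual implementations:

```typescript
// ASSUMPTION: "size" is the UTF-8 byte length of the JSON serialization.
// This is a hypothetical sketch of a helper not shown in the source above.
function calculateValueSize(value: unknown): number {
  return new TextEncoder().encode(JSON.stringify(value)).length;
}

// ASSUMPTION: breakdowns merge by summing numeric leaves and recursing
// into nested objects, keeping keys present in either side.
function addSizeBreakdowns(a: any, b: any): any {
  if (typeof a === 'number' && typeof b === 'number') return a + b;
  const result: any = { ...a };
  for (const key of Object.keys(b)) {
    result[key] = key in result ? addSizeBreakdowns(result[key], b[key]) : b[key];
  }
  return result;
}

// JSON.stringify("Hello") is "\"Hello\"": 7 bytes including the quotes.
console.log(calculateValueSize("Hello")); // 7
// Summing two per-element breakdowns, as the array branch above does:
console.log(addSizeBreakdowns({ title: 7 }, { title: 7 })); // { title: 14 }
```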

