# refactory_metrics
Calculate before/after metrics including Refactory Score to assess monolith decomposition quality, module health, and test preservation.
## Instructions

Calculates before/after metrics and the Refactory Score (0-1), measuring health improvement, module quality, and test preservation.
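The score combines two factors: the fraction of extracted modules that load cleanly, and whether the largest module is smaller than the original file. A minimal standalone sketch of the formula (mirroring the handler in src/tools/metrics.js; the module stats below are hypothetical):

```javascript
// Sketch of the Refactory Score formula. The real tool derives these
// inputs from the filesystem; here they are supplied directly.
function refactoryScore(originalLines, moduleStats) {
  const clean = moduleStats.filter((m) => m.clean).length;
  const cleanRate = moduleStats.length > 0 ? clean / moduleStats.length : 0;
  const maxModuleLines = Math.max(...moduleStats.map((m) => m.lines), 0);
  // The decomposition gets full size credit when every module
  // is smaller than the original monolith.
  const sizeReduction =
    maxModuleLines < originalLines ? 1 : originalLines / maxModuleLines;
  const score = cleanRate * Math.min(sizeReduction, 1.0);
  return Math.round(score * 100) / 100;
}

// A 1200-line monolith split into three modules, one of which fails to load:
const score = refactoryScore(1200, [
  { file: "parser.js", lines: 300, clean: true },
  { file: "render.js", lines: 250, clean: true },
  { file: "state.js", lines: 400, clean: false },
]);
console.log(score); // 0.67
```

With one of three modules broken, the clean rate (2/3) dominates the result even though the size reduction is ideal.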
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| original | Yes | Path to the original monolith | |
| moduleDir | Yes | Directory containing extracted modules | |
| testResults | No | Path to test results JSON (before/after) | |
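A typical set of call arguments might look like the following (paths are placeholders, not part of the tool's documentation):

```json
{
  "original": "src/app.js",
  "moduleDir": "src/lib/app",
  "testResults": "reports/test-results.json"
}
```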
## Implementation Reference
- src/tools/metrics.js:5-53 (handler): The core handler for the refactory_metrics tool. It compares the original monolith against the decomposed files (lines, functions), computes the module clean rate and size reduction, and derives the Refactory Score (0-1) as cleanRate * min(sizeReduction, 1.0). Note that the optional testResults input is accepted by the schema but is not read anywhere in this handler.
  ```js
  async function metrics(args) {
    const originalPath = path.resolve(args.original);
    const moduleDir = path.resolve(args.moduleDir);
    const originalSource = fs.readFileSync(originalPath, "utf8");
    const originalLines = originalSource.split("\n").length;
    const originalFunctions =
      (originalSource.match(/^(?:async\s+)?function\s+\w+/gm) || []).length;

    const moduleFiles = fs.readdirSync(moduleDir).filter((f) => f.endsWith(".js"));
    let totalModuleLines = 0;
    let maxModuleLines = 0;
    let modulesClean = 0;
    const moduleStats = [];

    for (const file of moduleFiles) {
      const content = fs.readFileSync(path.join(moduleDir, file), "utf8");
      const lines = content.split("\n").length;
      totalModuleLines += lines;
      maxModuleLines = Math.max(maxModuleLines, lines);
      let clean = false;
      try {
        require(path.join(moduleDir, file));
        clean = true;
        modulesClean++;
      } catch {}
      moduleStats.push({ file, lines, clean });
    }

    const cleanRate = moduleFiles.length > 0 ? modulesClean / moduleFiles.length : 0;
    const sizeReduction =
      maxModuleLines < originalLines ? 1 : originalLines / maxModuleLines;
    const score = cleanRate * Math.min(sizeReduction, 1.0);

    return {
      original: {
        file: originalPath,
        lines: originalLines,
        functions: originalFunctions,
      },
      decomposed: {
        moduleCount: moduleFiles.length,
        totalLines: totalModuleLines,
        maxModuleLines,
        avgModuleLines: Math.round(totalModuleLines / Math.max(moduleFiles.length, 1)),
        modulesClean,
        cleanRate: Math.round(cleanRate * 100),
        modules: moduleStats,
      },
      refactoryScore: Math.round(score * 100) / 100,
      timestamp: new Date().toISOString(),
    };
  }
  ```

- src/server.js:91-103 (schema): Input schema for refactory_metrics. Accepts 'original' (path to monolith), 'moduleDir' (extracted modules dir), and optional 'testResults'.
  ```js
  {
    name: "refactory_metrics",
    description: "Calculate before/after metrics and the Refactory Score (0-1). Measures health improvement, module quality, test preservation.",
    inputSchema: {
      type: "object",
      properties: {
        original: { type: "string", description: "Path to the original monolith" },
        moduleDir: { type: "string", description: "Directory containing extracted modules" },
        testResults: { type: "string", description: "Path to test results JSON (before/after)" },
      },
      required: ["original", "moduleDir"],
    },
  },
  ```

- src/server.js:36-180 (registration): The TOOLS array registers refactory_metrics along with all other tools. Line 92 defines the name 'refactory_metrics'.
  ```js
  const TOOLS = [
    {
      name: "refactory_analyze",
      description: "Analyze a source file for decomposition. Returns health score, function count, dependency graph, and recommended split points.",
      inputSchema: {
        type: "object",
        properties: {
          file: { type: "string", description: "Path to the monolith file to analyze" },
          language: { type: "string", description: "Language (js, ts, py). Auto-detected if omitted." },
        },
        required: ["file"],
      },
    },
    {
      name: "refactory_plan",
      description: "Generate a decomposition plan — module boundaries, function assignments, dependency order. Uses AST analysis + LLM reasoning.",
      inputSchema: {
        type: "object",
        properties: {
          file: { type: "string", description: "Path to the monolith file" },
          modules: { type: "number", description: "Target number of modules (auto if omitted)" },
          maxLines: { type: "number", description: "Max lines per module (default: 500)" },
          style: { type: "string", description: "Grouping style: 'functional' | 'domain' | 'layer'" },
        },
        required: ["file"],
      },
    },
    {
      name: "refactory_extract",
      description: "Extract one module from the monolith according to the plan. Routes to the cheapest capable free LLM API.",
      inputSchema: {
        type: "object",
        properties: {
          file: { type: "string", description: "Path to the monolith file" },
          module: { type: "string", description: "Module name to extract (from the plan)" },
          functions: { type: "array", items: { type: "string" }, description: "Function names to include" },
          outputDir: { type: "string", description: "Output directory for extracted module" },
          plan: { type: "string", description: "Path to the decomposition plan JSON" },
        },
        required: ["file", "module"],
      },
    },
    {
      name: "refactory_verify",
      description: "Verify a decomposed module: loads without errors, exports match plan, no circular deps, tests pass.",
      inputSchema: {
        type: "object",
        properties: {
          moduleDir: { type: "string", description: "Directory containing extracted modules" },
          original: { type: "string", description: "Path to the original monolith (for export comparison)" },
          testCmd: { type: "string", description: "Test command to run (e.g., 'npm test')" },
        },
        required: ["moduleDir"],
      },
    },
    {
      name: "refactory_metrics",
      description: "Calculate before/after metrics and the Refactory Score (0-1). Measures health improvement, module quality, test preservation.",
      inputSchema: {
        type: "object",
        properties: {
          original: { type: "string", description: "Path to the original monolith" },
          moduleDir: { type: "string", description: "Directory containing extracted modules" },
          testResults: { type: "string", description: "Path to test results JSON (before/after)" },
        },
        required: ["original", "moduleDir"],
      },
    },
    {
      name: "refactory_report",
      description: "Generate a decomposition report with metrics, dependency graphs, and Refactory Score. Outputs Markdown or HTML.",
      inputSchema: {
        type: "object",
        properties: {
          metricsFile: { type: "string", description: "Path to metrics JSON from refactory_metrics" },
          format: { type: "string", description: "'markdown' (default) or 'html'" },
          outputPath: { type: "string", description: "Where to write the report" },
        },
        required: ["metricsFile"],
      },
    },
    {
      name: "refactory_depmap",
      description: "Map dependencies for a file — who requires it (consumers), what it requires (dependencies), detect circular deps.",
      inputSchema: {
        type: "object",
        properties: {
          file: { type: "string", description: "Path to the file to map" },
          projectDir: { type: "string", description: "Project root directory" },
        },
        required: ["file"],
      },
    },
    {
      name: "refactory_characterize",
      description: "Generate characterization tests and golden export snapshot BEFORE decomposition. Captures behavioral contract.",
      inputSchema: {
        type: "object",
        properties: {
          file: { type: "string", description: "Path to the module to characterize" },
          outputDir: { type: "string", description: "Where to write test + golden files" },
        },
        required: ["file"],
      },
    },
    {
      name: "refactory_verify_exports",
      description: "Compare post-decomposition module against golden export snapshot. Reports missing, added, or type-changed exports.",
      inputSchema: {
        type: "object",
        properties: {
          goldenFile: { type: "string", description: "Path to .golden-exports.json from characterize" },
          newFile: { type: "string", description: "Path to the new re-export module" },
        },
        required: ["goldenFile", "newFile"],
      },
    },
    {
      name: "refactory_fix_imports",
      description: "Mechanically fix broken require() paths after module extraction. No LLM needed — pure path resolution.",
      inputSchema: {
        type: "object",
        properties: {
          moduleDir: { type: "string", description: "Directory containing extracted modules" },
          projectDir: { type: "string", description: "Project root to scan for consumers" },
          dryRun: { type: "boolean", description: "Report changes without writing (default: false)" },
        },
        required: ["moduleDir"],
      },
    },
    {
      name: "refactory_decompose",
      description: "Full decomposition pipeline in one call: analyze, depmap, characterize, plan, extract ALL modules, fix-imports, verify, metrics, re-export, report. The 'just do it' tool.",
      inputSchema: {
        type: "object",
        properties: {
          file: { type: "string", description: "Path to the monolith file to decompose" },
          outputDir: { type: "string", description: "Output directory (default: <dir>/lib/<basename>/ next to source)" },
          maxLines: { type: "number", description: "Max lines per module (default: 500)" },
          projectDir: { type: "string", description: "Project root for dependency mapping (optional)" },
        },
        required: ["file"],
      },
    },
  ];
  ```

- src/server.js:201-201 (registration): Switch-case dispatch in the CallToolRequestSchema handler routes 'refactory_metrics' to the metrics() function.
  ```js
  case "refactory_metrics":
    result = await metrics(args);
    break;
  ```

- src/tools/decompose.js:188-195 (helper): The decompose pipeline invokes metrics() at line 191 during its full decomposition flow (Step 8), passing original and moduleDir.
  ```js
  // Step 8: Metrics
  let metricsResult;
  try {
    metricsResult = await metrics({ original: filePath, moduleDir: outputDir });
    result.steps.metrics = metricsResult;
  } catch (err) {
    throw new Error(`Step metrics failed: ${err.message}`);
  }
  ```