run_js
Execute JavaScript code with npm dependencies in a secure sandbox container for complex workflows, ensuring file persistence through the ./files directory.
Instructions
Install npm dependencies and run JavaScript code inside a running sandbox container. After running, you must manually stop the sandbox to free resources. The code must be valid ESModules (import/export syntax). This tool is best for complex workflows where you want to reuse the environment across multiple executions. When reading or writing files from the Node.js process, always use the "./files" directory to ensure persistence on the mounted volume.
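For example, a call to `run_js` might supply arguments shaped like the sketch below. This is illustrative only and not taken from the repository; the container id and the `lodash` dependency are placeholders, and only the argument shape follows the schema documented in the next section.

```ts
// A hypothetical run_js payload (not taken from the repository); the
// container id and the lodash dependency are placeholders. The code
// string uses ESModule syntax and keeps all file I/O under ./files so
// results persist on the mounted volume.
const runJsArgs = {
  container_id: 'a1b2c3d4e5f6', // id of an already running sandbox container
  dependencies: [{ name: 'lodash', version: '^4.17.21' }],
  code: `
    import { readFile, writeFile } from 'node:fs/promises';
    import _ from 'lodash';

    // Read from and write to ./files so the output survives the run.
    const raw = await readFile('./files/input.txt', 'utf8');
    await writeFile('./files/output.txt', _.kebabCase(raw));
    console.log('wrote ./files/output.txt');
  `,
};
```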
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| code | Yes | JavaScript code to run inside the container. | |
| container_id | Yes | Docker container identifier | |
| dependencies | No | A list of npm dependencies to install before running the code. Each item must have a `name` (package) and `version` (range). | `[]` |
| listenOnPort | No | If set, leaves the process running and exposes this port to the host. | |
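The `listenOnPort` option is intended for long-running processes such as HTTP servers. A hypothetical payload (the container id, port, and `express` dependency are placeholders) might look like this:

```ts
// Hypothetical payload for a long-running HTTP server; container id, port,
// and the express dependency are placeholders. With listenOnPort set, the
// handler starts the process in the background, waits for the port to
// answer over HTTP, and exposes it to the host instead of waiting for the
// script to exit.
const serveArgs = {
  container_id: 'a1b2c3d4e5f6',
  dependencies: [{ name: 'express', version: '^4.19.2' }],
  listenOnPort: 3000,
  code: `
    import express from 'express';
    import { readFile } from 'node:fs/promises';

    const app = express();
    app.get('/', async (_req, res) => {
      // Serve content from the persistent ./files directory.
      res.type('text/plain').send(await readFile('./files/output.txt', 'utf8'));
    });
    app.listen(3000);
  `,
};
```

According to the handler below, the background process's logs are redirected to `output.log` inside the container.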
Implementation Reference
- src/tools/runJs.ts:52-149 (handler): The core handler for the `run_js` tool. It prepares a workspace, copies it into the Docker container, installs dependencies if provided, executes the JS code (either synchronously or leaving a server listening on a port), detects file changes in the mounted volume, and returns the output, the changes, and telemetry.

  ```ts
  export default async function runJs({
    container_id,
    code,
    dependencies = [],
    listenOnPort,
  }: {
    container_id: string;
    code: string;
    dependencies?: DependenciesArray;
    listenOnPort?: number;
  }): Promise<McpResponse> {
    if (!isDockerRunning()) {
      return { content: [textContent(DOCKER_NOT_RUNNING_ERROR)] };
    }

    const telemetry: Record<string, unknown> = {};
    const dependenciesRecord: Record<string, string> = Object.fromEntries(
      dependencies.map(({ name, version }) => [name, version])
    );

    // Create workspace in container
    const localWorkspace = await prepareWorkspace({ code, dependenciesRecord });
    execSync(`docker cp ${localWorkspace.name}/. ${container_id}:/workspace`);

    let rawOutput: string = '';

    // Generate snapshot of the workspace
    const snapshotStartTime = Date.now();
    const snapshot = await getSnapshot(getMountPointDir());

    if (listenOnPort) {
      if (dependencies.length > 0) {
        const installStart = Date.now();
        const installOutput = execSync(
          `docker exec ${container_id} /bin/sh -c ${JSON.stringify(
            `npm install --omit=dev --prefer-offline --no-audit --loglevel=error`
          )}`,
          { encoding: 'utf8' }
        );
        telemetry.installTimeMs = Date.now() - installStart;
        telemetry.installOutput = installOutput;
      } else {
        telemetry.installTimeMs = 0;
        telemetry.installOutput = 'Skipped npm install (no dependencies)';
      }

      const { error, duration } = safeExecNodeInContainer({
        containerId: container_id,
        command: `nohup node index.js > output.log 2>&1 &`,
      });
      telemetry.runTimeMs = duration;
      if (error) return getContentFromError(error, telemetry);

      await waitForPortHttp(listenOnPort);
      rawOutput = `Server started in background; logs at /output.log`;
    } else {
      if (dependencies.length > 0) {
        const installStart = Date.now();
        const fullCmd = `npm install --omit=dev --prefer-offline --no-audit --loglevel=error`;
        const installOutput = execSync(
          `docker exec ${container_id} /bin/sh -c ${JSON.stringify(fullCmd)}`,
          { encoding: 'utf8' }
        );
        telemetry.installTimeMs = Date.now() - installStart;
        telemetry.installOutput = installOutput;
      } else {
        telemetry.installTimeMs = 0;
        telemetry.installOutput = 'Skipped npm install (no dependencies)';
      }

      const { output, error, duration } = safeExecNodeInContainer({
        containerId: container_id,
      });
      if (output) rawOutput = output;
      telemetry.runTimeMs = duration;
      if (error) return getContentFromError(error, telemetry);
    }

    // Detect the file changed during the execution of the tool in the mounted workspace
    // and report the changes to the user
    const changes = await detectChanges(
      snapshot,
      getMountPointDir(),
      snapshotStartTime
    );
    const extractedContents = await changesToMcpContent(changes);

    localWorkspace.removeCallback();

    return {
      content: [
        textContent(`Node.js process output:\n${rawOutput}`),
        ...extractedContents,
        textContent(`Telemetry:\n${JSON.stringify(telemetry, null, 2)}`),
      ],
    };
  }
  ```
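  The handler delegates the actual `node` invocation to `safeExecNodeInContainer`, which is not part of this excerpt. Below is a minimal sketch of what such a helper could look like, assuming it wraps `docker exec`, measures the run time, and converts a non-zero exit into a returned error; the `node index.js` default command and the `/workspace` working directory are assumptions inferred from how the handler calls it.

  ```ts
  import { execSync } from 'node:child_process';

  interface ExecResult {
    output?: string;
    error?: Error;
    duration: number;
  }

  // Sketch only: run a command with node inside the container, capture its
  // stdout, and report how long it took instead of throwing on failure.
  function safeExecNodeInContainer({
    containerId,
    command = 'node index.js', // assumed default entry point
  }: {
    containerId: string;
    command?: string;
  }): ExecResult {
    const start = Date.now();
    try {
      const output = execSync(
        `docker exec -w /workspace ${containerId} /bin/sh -c ${JSON.stringify(command)}`,
        { encoding: 'utf8' }
      );
      return { output, duration: Date.now() - start };
    } catch (error) {
      return { error: error as Error, duration: Date.now() - start };
    }
  }
  ```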
- src/tools/runJs.ts:26-48 (schema): The Zod input schema for the `run_js` tool: it requires `container_id` and `code`, with an optional `dependencies` list and `listenOnPort`.

  ```ts
  export const argSchema = {
    container_id: z.string().describe('Docker container identifier'),
    // We use an array of { name, version } items instead of a record
    // because the OpenAI function-calling schema doesn’t reliably support arbitrary
    // object keys. An explicit array ensures each dependency has a clear, uniform
    // structure the model can populate.
    // Schema for a single dependency item
    dependencies: z
      .array(NodeDependency)
      .default([])
      .describe(
        'A list of npm dependencies to install before running the code. ' +
          'Each item must have a `name` (package) and `version` (range). ' +
          'If none, returns an empty array.'
      ),
    code: z.string().describe('JavaScript code to run inside the container.'),
    listenOnPort: z
      .number()
      .optional()
      .describe(
        'If set, leaves the process running and exposes this port to the host.'
      ),
  };
  ```
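  `NodeDependency` is imported by this schema but not shown in the excerpt. A minimal sketch, assuming it is a Zod object with exactly the `name` and `version` fields described above:

  ```ts
  import { z } from 'zod';

  // Assumed shape of a single dependency item; the real definition lives
  // elsewhere in the repository.
  const NodeDependency = z.object({
    name: z.string().describe('npm package name, e.g. "lodash"'),
    version: z.string().describe('version range, e.g. "^4.17.21"'),
  });

  type DependenciesArray = z.infer<typeof NodeDependency>[];
  ```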
- src/server.ts:66-73 (registration): Registers the `run_js` tool on the MCP server, providing its name, description, input schema, and handler function.

  ```ts
  'run_js',
  `Install npm dependencies and run JavaScript code inside a running sandbox container. After running, you must manually stop the sandbox to free resources. The code must be valid ESModules (import/export syntax). Best for complex workflows where you want to reuse the environment across multiple executions. When reading and writing from the Node.js processes, you always need to read from and write to the "./files" directory to ensure persistence on the mounted volume.`,
  runJsSchema,
  runJs
  );
  ```
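  The excerpt above is only the argument list. Assuming the server is built on `McpServer` from `@modelcontextprotocol/sdk` (whose `tool()` method accepts a name, description, Zod raw shape, and handler), the surrounding wiring would look roughly like the sketch below; the import paths, server name, and version are illustrative, not taken from the repository.

  ```ts
  // Sketch of the surrounding registration code; paths and names are assumptions.
  import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
  import runJs, { argSchema as runJsSchema } from './tools/runJs.js';

  const server = new McpServer({ name: 'js-sandbox-mcp', version: '1.0.0' });

  server.tool(
    'run_js',
    'Install npm dependencies and run JavaScript code inside a running sandbox container. ...', // full description string quoted above
    runJsSchema,
    runJs
  );
  ```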