local_dev_from_filesystem

Create a local development environment from a filesystem path to run tests and view coverage in sandboxed Python, Node, or Bun projects.

Instructions

Create a new local development environment from a filesystem path

Input Schema

Name    Required    Description              Default
path    Yes         Local filesystem path    (none)
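
The snippet below is a minimal sketch of invoking this tool from the official MCP Python SDK over stdio. The launch command ("uv run mcp-local-dev") and the project path are placeholders; substitute whatever matches your installation.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client


    async def main() -> None:
        # Placeholder launch command; point this at however the server is started locally.
        params = StdioServerParameters(command="uv", args=["run", "mcp-local-dev"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # "path" is the only required argument, per the schema above.
                result = await session.call_tool(
                    "local_dev_from_filesystem",
                    arguments={"path": "/path/to/your/project"},
                )
                # The tool returns a single text content item containing JSON.
                print(result.content[0].text)


    asyncio.run(main())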

Implementation Reference

  • Handler for the local_dev_from_filesystem tool: calls create_environment_from_path and formats the response with environment details (an example of the serialized payload is sketched after this list).
    elif name == "local_dev_from_filesystem":
        env = await create_environment_from_path(arguments["path"])
        return [
            types.TextContent(
                type="text",
                text=json.dumps(
                    {
                        "success": True,
                        "data": {
                            "id": env.id,
                            "working_dir": str(env.sandbox.work_dir),
                            "created_at": env.created_at.isoformat(),
                            "runtime": env.runtime_config.name.value,
                        },
                    }
                ),
            )
        ]
  • Input schema and metadata for the local_dev_from_filesystem tool, registered in the tools list.
    types.Tool(
        name="local_dev_from_filesystem",
        description="Create a new local development environment from a filesystem path",
        inputSchema={
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "Local filesystem path"}
            },
            "required": ["path"],
        },
    ),
  • Main helper function implementing environment creation from a local path: sandbox creation, file copy, runtime detection, and runtime installation (a hypothetical sketch of runtime detection also appears after this list).
    async def create_environment_from_path(path: Path) -> Environment:
        """Create new environment from filesystem path."""
        env_id = b58_fuuid()
        sandbox = await create_sandbox(f"mcp-{env_id}-")
        shutil.copytree(path, sandbox.work_dir, dirs_exist_ok=True)
        os.chmod(sandbox.work_dir, 0o700)
        os.chmod(sandbox.bin_dir, 0o700)
        runtime_config = detect_runtime(sandbox)
        await install_runtime(sandbox, runtime_config)
        env = Environment(
            id=env_id,
            runtime_config=runtime_config,
            sandbox=sandbox,
            created_at=datetime.now(timezone.utc),
        )
        _ENVIRONMENTS[env_id] = env
        return env
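
For reference, the handler above serializes its result into a single JSON text payload. The shape below follows directly from that handler code; all values are illustrative placeholders, not real output.

    {
        "success": True,
        "data": {
            "id": "3nT9kZbFqWmA",                       # base58 environment id from b58_fuuid()
            "working_dir": "/tmp/mcp-sandbox-example",   # placeholder sandbox working directory
            "created_at": "2024-01-01T12:00:00+00:00",   # UTC ISO-8601 timestamp
            "runtime": "python",                         # runtime name, e.g. python, node, or bun
        },
    }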
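
Runtime detection itself is not shown on this page. The sketch below is a purely hypothetical illustration of marker-file-based detection for the three runtimes the tool description mentions; the real detect_runtime in mcp-local-dev may use different rules and return a richer config object.

    from pathlib import Path


    def guess_runtime(work_dir: Path) -> str:
        """Hypothetical marker-file detection; not the server's actual logic."""
        if (work_dir / "bun.lockb").exists():
            return "bun"
        if (work_dir / "package.json").exists():
            return "node"
        if (work_dir / "pyproject.toml").exists() or (work_dir / "requirements.txt").exists():
            return "python"
        raise ValueError(f"No supported runtime found in {work_dir}")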

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/txbm/mcp-local-dev'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.