deploy_local_folder

Deploy a local folder to Google Cloud Run by specifying the folder's absolute path, project ID, and optional region and service name. Simplifies deploying entire folder contents to a Cloud Run service.

Instructions

Deploy a local folder to Cloud Run. Takes an absolute folder path from the local filesystem that will be deployed. Use this tool if the entire folder content needs to be deployed.

Input Schema

| Name       | Required | Description                                                                                                  | Default      |
| ---------- | -------- | ------------------------------------------------------------------------------------------------------------ | ------------ |
| folderPath | Yes      | Absolute path to the folder to deploy (e.g. "/home/user/project/src")                                         |              |
| project    | Yes      | Google Cloud project ID. Do not select it yourself, make sure the user provides or confirms the project ID.   |              |
| region     | No       | Region to deploy the service to                                                                               | europe-west1 |
| service    | No       | Name of the Cloud Run service to deploy to                                                                    | app          |
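A hypothetical call to this tool might pass arguments shaped like the following. The project ID, path, and service name are placeholders, and the validation simply mirrors the guards the handler itself applies:

```javascript
// Hypothetical arguments for a 'deploy_local_folder' call; the project ID,
// folder path, and service name below are placeholders, not real resources.
const args = {
  project: 'my-gcp-project',        // must be provided or confirmed by the user
  region: 'europe-west1',           // optional, defaults to europe-west1
  service: 'app',                   // optional, defaults to app
  folderPath: '/home/user/project', // absolute path to the folder to deploy
};

// The same guards the handler applies before deploying:
const valid =
  typeof args.project === 'string' &&
  typeof args.folderPath === 'string' &&
  args.folderPath.trim() !== '';
console.log(valid); // true
```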

Implementation Reference

  • The handler function for the 'deploy_local_folder' tool. Validates inputs, creates a progress callback, and invokes the 'deploy' helper with the folder path as the files array.
    async (
      { project, region, service, folderPath },
      { sendNotification }
    ) => {
      if (typeof project !== 'string') {
        throw new Error(
          'Project must be specified, please prompt the user for a valid existing Google Cloud project ID.'
        );
      }
      if (typeof folderPath !== 'string' || folderPath.trim() === '') {
        throw new Error(
          'Folder path must be specified and be a non-empty string.'
        );
      }
    
      const progressCallback = createProgressCallback(sendNotification);
    
      // Deploy to Cloud Run
      try {
        await progressCallback({
          data: `Starting deployment of local folder for service ${service} in project ${project}...`,
        });
        const response = await deploy({
          projectId: project,
          serviceName: service,
          region: region,
          files: [folderPath],
          skipIamCheck: options.skipIamCheck, // Pass the new flag
          progressCallback,
        });
        return {
          content: [
            {
              type: 'text',
              text: `Cloud Run service ${service} deployed from folder ${folderPath} in project ${project}\nCloud Console: https://console.cloud.google.com/run/detail/${region}/${service}?project=${project}\nService URL: ${response.uri}`,
            },
          ],
        };
      } catch (error) {
        return {
          content: [
            {
              type: 'text',
              text: `Error deploying folder to Cloud Run: ${error.message || error}`,
            },
          ],
        };
      }
    }
  • Zod input schema defining parameters for the deploy_local_folder tool: project, region, service, folderPath.
    inputSchema: {
      project: z
        .string()
        .describe(
          'Google Cloud project ID. Do not select it yourself, make sure the user provides or confirms the project ID.'
        )
        .default(options.defaultProjectId),
      region: z
        .string()
        .optional()
        .default(options.defaultRegion)
        .describe('Region to deploy the service to'),
      service: z
        .string()
        .optional()
        .default(options.defaultServiceName)
        .describe('Name of the Cloud Run service to deploy to'),
      folderPath: z
        .string()
        .describe(
          'Absolute path to the folder to deploy (e.g. "/home/user/project/src")'
        ),
    },
  • The registerDeployLocalFolderTool function that registers the 'deploy_local_folder' tool with server, schema, and handler.
    function registerDeployLocalFolderTool(server, options) {
      server.registerTool(
        'deploy_local_folder',
        {
          description:
            'Deploy a local folder to Cloud Run. Takes an absolute folder path from the local filesystem that will be deployed. Use this tool if the entire folder content needs to be deployed.',
          inputSchema: {
            project: z
              .string()
              .describe(
                'Google Cloud project ID. Do not select it yourself, make sure the user provides or confirms the project ID.'
              )
              .default(options.defaultProjectId),
            region: z
              .string()
              .optional()
              .default(options.defaultRegion)
              .describe('Region to deploy the service to'),
            service: z
              .string()
              .optional()
              .default(options.defaultServiceName)
              .describe('Name of the Cloud Run service to deploy to'),
            folderPath: z
              .string()
              .describe(
                'Absolute path to the folder to deploy (e.g. "/home/user/project/src")'
              ),
          },
        },
        gcpTool(
          options.gcpCredentialsAvailable,
          async (
            { project, region, service, folderPath },
            { sendNotification }
          ) => {
            if (typeof project !== 'string') {
              throw new Error(
                'Project must be specified, please prompt the user for a valid existing Google Cloud project ID.'
              );
            }
            if (typeof folderPath !== 'string' || folderPath.trim() === '') {
              throw new Error(
                'Folder path must be specified and be a non-empty string.'
              );
            }
    
            const progressCallback = createProgressCallback(sendNotification);
    
            // Deploy to Cloud Run
            try {
              await progressCallback({
                data: `Starting deployment of local folder for service ${service} in project ${project}...`,
              });
              const response = await deploy({
                projectId: project,
                serviceName: service,
                region: region,
                files: [folderPath],
                skipIamCheck: options.skipIamCheck, // Pass the new flag
                progressCallback,
              });
              return {
                content: [
                  {
                    type: 'text',
                    text: `Cloud Run service ${service} deployed from folder ${folderPath} in project ${project}\nCloud Console: https://console.cloud.google.com/run/detail/${region}/${service}?project=${project}\nService URL: ${response.uri}`,
                  },
                ],
              };
            } catch (error) {
              return {
                content: [
                  {
                    type: 'text',
                    text: `Error deploying folder to Cloud Run: ${error.message || error}`,
                  },
                ],
              };
            }
          }
        )
      );
    }
  • tools/tools.js:36-36 (registration)
    Invocation of registerDeployLocalFolderTool during tool registration in the main tools module.
    registerDeployLocalFolderTool(server, options);
  • The 'deploy' helper function called by the tool handler. Handles zipping folder/files, Cloud Build, and Cloud Run deployment.
    export async function deploy({
      projectId,
      serviceName,
      region,
      files,
      progressCallback,
      skipIamCheck,
    }) {
      if (!projectId) {
        const errorMsg =
          'Error: projectId is required in the configuration object.';
        await logAndProgress(errorMsg, progressCallback, 'error');
        throw new Error(errorMsg);
      }
    
      if (!serviceName) {
        const errorMsg =
          'Error: serviceName is required in the configuration object.';
        await logAndProgress(errorMsg, progressCallback, 'error');
        throw new Error(errorMsg);
      }
    
      if (!files || !Array.isArray(files) || files.length === 0) {
        const errorMsg =
          'Error: files array is required in the configuration object.';
        await logAndProgress(errorMsg, progressCallback, 'error');
        if (typeof process !== 'undefined' && process.exit) {
          process.exit(1);
        } else {
          throw new Error(errorMsg);
        }
      }
    
      const path = await import('path');
      const fs = await import('fs');
      const { Storage } = await import('@google-cloud/storage');
      const { CloudBuildClient } = await import('@google-cloud/cloudbuild');
      const { ArtifactRegistryClient } = await import(
        '@google-cloud/artifact-registry'
      );
      const { v2: CloudRunV2Module } = await import('@google-cloud/run');
      const { ServicesClient } = CloudRunV2Module;
      const { ServiceUsageClient } = await import('@google-cloud/service-usage');
      const { Logging } = await import('@google-cloud/logging');
    
      try {
        const context = {
          storage: new Storage({ projectId }),
          cloudBuildClient: new CloudBuildClient({ projectId }),
          artifactRegistryClient: new ArtifactRegistryClient({ projectId }),
          runClient: new ServicesClient({ projectId }),
          serviceUsageClient: new ServiceUsageClient({ projectId }),
          loggingClient: new Logging({ projectId }),
        };
    
        await ensureApisEnabled(
          context,
          projectId,
          REQUIRED_APIS_FOR_SOURCE_DEPLOY,
          progressCallback
        );
    
        const bucketName = `${projectId}-source-bucket`;
        const imageUrl = `${region}-docker.pkg.dev/${projectId}/${REPO_NAME}/${serviceName}:${IMAGE_TAG}`;
    
        await logAndProgress(`Project: ${projectId}`, progressCallback);
        await logAndProgress(`Region: ${region}`, progressCallback);
        await logAndProgress(`Service Name: ${serviceName}`, progressCallback);
        await logAndProgress(`Files to deploy: ${files.length}`, progressCallback);
    
        let hasDockerfile = false;
        if (
          files.length === 1 &&
          typeof files[0] === 'string' &&
          fs.statSync(files[0]).isDirectory()
        ) {
          // Handle folder deployment: check for Dockerfile inside the folder
          const dockerfilePath = path.join(files[0], 'Dockerfile');
          const dockerfilePathLowerCase = path.join(files[0], 'dockerfile');
          if (
            fs.existsSync(dockerfilePath) ||
            fs.existsSync(dockerfilePathLowerCase)
          ) {
            hasDockerfile = true;
          }
        } else {
          // Handle file list deployment or file content deployment
          for (const file of files) {
            if (typeof file === 'string') {
              if (path.basename(file).toLowerCase() === 'dockerfile') {
                hasDockerfile = true;
                break;
              }
            } else if (typeof file === 'object' && file.filename) {
              if (path.basename(file.filename).toLowerCase() === 'dockerfile') {
                hasDockerfile = true;
                break;
              }
            }
          }
        }
        await logAndProgress(`Dockerfile: ${hasDockerfile}`, progressCallback);
    
        const bucket = await ensureStorageBucketExists(
          context,
          bucketName,
          region,
          progressCallback
        );
    
        const zipBuffer = await zipFiles(files, progressCallback);
        await uploadToStorageBucket(
          context,
          bucket,
          zipBuffer,
          ZIP_FILE_NAME,
          progressCallback
        );
        await logAndProgress('Source code uploaded successfully', progressCallback);
    
        await ensureArtifactRegistryRepoExists(
          context,
          projectId,
          region,
          REPO_NAME,
          'DOCKER',
          progressCallback
        );
    
        const buildResult = await triggerCloudBuild(
          context,
          projectId,
          region,
          bucketName,
          ZIP_FILE_NAME,
          REPO_NAME,
          imageUrl,
          hasDockerfile,
          progressCallback
        );
    
        const builtImageUrl = buildResult.results.images[0].name;
    
        const service = await deployToCloudRun(
          context,
          projectId,
          region,
          serviceName,
          builtImageUrl,
          progressCallback,
          skipIamCheck
        );
    
        await logAndProgress(`Deployment Completed Successfully`, progressCallback);
        return service;
      } catch (error) {
        const deployFailedMessage = `Deployment Failed: ${error.message}`;
        console.error(`Deployment Failed`, error);
        await logAndProgress(deployFailedMessage, progressCallback, 'error');
        throw error;
      }
    }
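The handler relies on a createProgressCallback helper whose implementation is not shown on this page. A minimal sketch of what such a helper might look like, assuming it wraps the MCP sendNotification function into logging-style notifications (the method name and payload shape are assumptions, not taken from the source):

```javascript
// Hypothetical sketch of createProgressCallback; the real implementation is
// not shown above. It turns sendNotification into a callback accepting
// { data, level } progress payloads.
function createProgressCallback(sendNotification) {
  return async ({ data, level = 'info' }) => {
    await sendNotification({
      method: 'notifications/message', // assumed MCP logging notification
      params: { level, data },
    });
  };
}

// Usage with a stand-in sendNotification that just records messages:
const messages = [];
const progressCallback = createProgressCallback(async (n) => messages.push(n));
progressCallback({ data: 'Starting deployment...' });
console.log(messages.length); // 1
```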
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations provided, the description carries the full burden of behavioral disclosure. It mentions deployment action but lacks critical details like required permissions, whether this creates or updates services, potential costs, time estimates, or error handling. For a deployment tool with zero annotation coverage, this is insufficient.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is extremely concise with two sentences that are front-loaded and waste-free. Every word earns its place by stating the core purpose and key usage condition without redundancy or fluff.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 2/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given this is a deployment tool with no annotations, no output schema, and complex implications (resource creation/modification, cloud costs, permissions), the description is incomplete. It doesn't address what happens after deployment, success indicators, or potential side effects, leaving significant gaps for agent understanding.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is 100%, so the schema already documents all 4 parameters thoroughly. The description only mentions 'absolute folder path' without adding meaningful context beyond what the schema provides for 'folderPath' or the other parameters. A baseline score of 3 is appropriate when the schema does the heavy lifting.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the specific action ('Deploy a local folder to Cloud Run') and resource ('local folder'), distinguishing it from siblings like 'deploy_file_contents' or 'deploy_local_files' by specifying 'entire folder content' deployment. This provides precise verb+resource differentiation.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 4/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description includes explicit guidance to 'Use this tool if the entire folder content needs to be deployed,' which helps differentiate from tools handling individual files. However, it doesn't mention when NOT to use it or provide alternatives like 'deploy_file_contents' for partial deployments, leaving some contextual gaps.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
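Taken together, the review's points (side effects, costs, permissions, and negative usage guidance) could be folded into a revised description. The wording below is purely illustrative, not the tool's actual text, and the sibling tool name it gestures at is an assumption:

```javascript
// Illustrative, hypothetical rewrite of the tool description that folds in
// the review's suggestions; not the real description string from the server.
const revisedDescription = [
  'Deploy a local folder to Cloud Run. Creates or updates the named service,',
  'zips and uploads the folder, and builds a container image via Cloud Build,',
  'which incurs Google Cloud costs and requires Cloud Run, Cloud Build, and',
  'Artifact Registry permissions. Use this tool when the entire folder',
  'content needs to be deployed; for individual files, prefer a file-level',
  'deployment tool instead.',
].join(' ');
console.log(revisedDescription.includes('Cloud Build')); // true
```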
