Glama

uploadFile

Upload files to IPFS storage via the Pinata MCP server, using either a file path or base64-encoded content. Supports both public and private networks, with optional group assignment and metadata.

Instructions

Upload a file to Pinata IPFS. Provide either a file:// URI or base64-encoded content.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| resourceUri | No | The file:// URI of the file to upload (e.g., file:///path/to/file.jpg) | |
| fileContent | No | Base64-encoded file content (use this if not providing resourceUri) | |
| fileName | No | Name for the uploaded file (auto-detected from path if using resourceUri) | |
| mimeType | No | MIME type of the file (auto-detected if not provided) | |
| network | No | Whether to upload to public or private IPFS | public |
| group_id | No | ID of a group to add the file to | |
| keyvalues | No | Metadata key-value pairs for the file | |
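As a sketch, a base64-mode invocation of this tool might pass arguments shaped like the following (the file name and metadata values are illustrative only, not from the source):

```typescript
// Hypothetical argument object for a base64-mode uploadFile call.
const uploadArgs = {
  fileContent: Buffer.from("hello, IPFS").toString("base64"),
  fileName: "hello.txt", // required whenever fileContent is used
  network: "private" as const, // omit to fall back to the "public" default
  keyvalues: { project: "demo" }, // optional metadata attached to the file
};
```

In resourceUri mode, fileName and mimeType can instead be omitted and derived from the path.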

Implementation Reference

  • The complete uploadFile tool implementation including registration, schema definition, and the async handler function that processes file uploads to Pinata IPFS. Supports both file:// URI and base64-encoded content modes.
    server.tool(
      "uploadFile",
      "Upload a file to Pinata IPFS. Provide either a file:// URI or base64-encoded content.",
      {
        resourceUri: z
          .string()
          .optional()
          .describe("The file:// URI of the file to upload (e.g., file:///path/to/file.jpg)"),
        fileContent: z
          .string()
          .optional()
          .describe("Base64-encoded file content (use this if not providing resourceUri)"),
        fileName: z
          .string()
          .optional()
          .describe("Name for the uploaded file (auto-detected from path if using resourceUri)"),
        mimeType: z
          .string()
          .optional()
          .describe("MIME type of the file (auto-detected if not provided)"),
        network: z
          .enum(["public", "private"])
          .default("public")
          .describe("Whether to upload to public or private IPFS"),
        group_id: z
          .string()
          .optional()
          .describe("ID of a group to add the file to"),
        keyvalues: z
          .record(z.string())
          .optional()
          .describe("Metadata key-value pairs for the file"),
      },
      async ({ resourceUri, fileContent, fileName, mimeType, network, group_id, keyvalues }) => {
        try {
          let fileBuffer: Buffer;
          let finalFileName: string;
    
          if (resourceUri) {
            // File path mode
            if (!resourceUri.startsWith("file://")) {
              throw new Error("resourceUri must be a file:// URI");
            }
    
            let filePath: string;
            if (process.platform === "win32") {
              filePath = decodeURIComponent(
                resourceUri.replace(/^file:\/\/\//, "").replace(/\//g, "\\")
              );
            } else {
              filePath = decodeURIComponent(resourceUri.replace(/^file:\/\//, ""));
            }
    
            // Validate path is allowed
            filePath = await validatePath(filePath);
            fileBuffer = await fs.readFile(filePath);
            finalFileName = fileName || path.basename(filePath);
          } else if (fileContent) {
            // Base64 content mode
            if (!fileName) {
              throw new Error("fileName is required when using fileContent");
            }
            fileBuffer = Buffer.from(fileContent, "base64");
            finalFileName = fileName;
          } else {
            throw new Error("Either resourceUri or fileContent must be provided");
          }
    
          const detectedMimeType = mimeType || getMimeType(finalFileName);
    
          const formData = new FormData();
          const blob = new Blob([new Uint8Array(fileBuffer)], { type: detectedMimeType });
          formData.append("file", blob, finalFileName);
          formData.append("network", network);
    
          if (group_id) {
            formData.append("group_id", group_id);
          }
    
          if (keyvalues) {
            formData.append("keyvalues", JSON.stringify(keyvalues));
          }
    
          const response = await fetch("https://uploads.pinata.cloud/v3/files", {
            method: "POST",
            headers: {
              Authorization: `Bearer ${PINATA_JWT}`,
            },
            body: formData,
          });
    
          if (!response.ok) {
            const errorText = await response.text();
            throw new Error(
              `Failed to upload file: ${response.status} ${response.statusText}\n${errorText}`
            );
          }
    
          const data = await response.json();
          return {
            content: [
              {
                type: "text",
                text: `✅ File uploaded successfully!\n\n${JSON.stringify(data, null, 2)}`,
              },
            ],
          };
        } catch (error) {
          return errorResponse(error);
        }
      }
    );
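The URI handling above can be isolated as a small helper. The sketch below mirrors the handler's platform branching; fileUriToPath is a hypothetical name, not part of the source:

```typescript
// Sketch of the handler's file:// URI → filesystem path conversion.
// Takes the platform explicitly so both branches are easy to exercise.
function fileUriToPath(resourceUri: string, platform: string): string {
  if (!resourceUri.startsWith("file://")) {
    throw new Error("resourceUri must be a file:// URI");
  }
  if (platform === "win32") {
    // file:///C:/dir/file.txt → C:\dir\file.txt
    return decodeURIComponent(
      resourceUri.replace(/^file:\/\/\//, "").replace(/\//g, "\\")
    );
  }
  // file:///path/to/file.jpg → /path/to/file.jpg
  return decodeURIComponent(resourceUri.replace(/^file:\/\//, ""));
}
```

Percent-encoded characters (e.g. `%20` for a space) are decoded after the scheme prefix is stripped.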
  • validatePath helper function used by uploadFile to ensure file paths are within allowed directories for security. Resolves symlinks and validates access permissions.
    async function validatePath(requestedPath: string): Promise<string> {
      // If no directories specified, allow current working directory
      const dirsToCheck =
        allowedDirectories.length > 0
          ? allowedDirectories
          : [normalizePath(process.cwd())];
    
      const expandedPath = expandHome(requestedPath);
      const absolute = path.isAbsolute(expandedPath)
        ? path.resolve(expandedPath)
        : path.resolve(process.cwd(), expandedPath);
    
      const normalizedRequested = normalizePath(absolute);
    
      const isAllowed = dirsToCheck.some((dir) =>
        normalizedRequested.startsWith(dir)
      );
      if (!isAllowed) {
        throw new Error(
          `Access denied - path outside allowed directories: ${absolute}`
        );
      }
    
      try {
        const realPath = await fs.realpath(absolute);
        const normalizedReal = normalizePath(realPath);
        const isRealPathAllowed = dirsToCheck.some((dir) =>
          normalizedReal.startsWith(dir)
        );
        if (!isRealPathAllowed) {
          throw new Error(
            "Access denied - symlink target outside allowed directories"
          );
        }
        return realPath;
      } catch (error) {
        const parentDir = path.dirname(absolute);
        try {
          const realParentPath = await fs.realpath(parentDir);
          const normalizedParent = normalizePath(realParentPath);
          const isParentAllowed = dirsToCheck.some((dir) =>
            normalizedParent.startsWith(dir)
          );
          if (!isParentAllowed) {
            throw new Error(
              "Access denied - parent directory outside allowed directories"
            );
          }
          return absolute;
        } catch {
          throw new Error(`Parent directory does not exist: ${parentDir}`);
        }
      }
    }
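Note that validatePath's plain startsWith containment check treats a sibling like /data-other as inside /data. A slightly stricter variant (a hypothetical sketch, not the source's behavior) also requires a path-separator boundary:

```typescript
// Hypothetical stricter containment check: the allowed directory must be
// followed by a path separator, so "/data-other" is not inside "/data".
function isInsideDir(candidate: string, allowedDir: string): boolean {
  const prefix = allowedDir.endsWith("/") ? allowedDir : allowedDir + "/";
  return candidate === allowedDir || candidate.startsWith(prefix);
}
```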
  • getMimeType helper function used by uploadFile to auto-detect MIME types based on file extensions when not explicitly provided.
    function getMimeType(filePath: string): string {
      const extension = filePath.split(".").pop()?.toLowerCase() || "";
      const mimeTypes: Record<string, string> = {
        txt: "text/plain",
        html: "text/html",
        css: "text/css",
        js: "application/javascript",
        json: "application/json",
        xml: "application/xml",
        pdf: "application/pdf",
        jpg: "image/jpeg",
        jpeg: "image/jpeg",
        png: "image/png",
        gif: "image/gif",
        svg: "image/svg+xml",
        webp: "image/webp",
        mp3: "audio/mpeg",
        mp4: "video/mp4",
        webm: "video/webm",
        zip: "application/zip",
        doc: "application/msword",
        docx: "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
        xls: "application/vnd.ms-excel",
        xlsx: "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
        ppt: "application/vnd.ms-powerpoint",
        pptx: "application/vnd.openxmlformats-officedocument.presentationml.presentation",
      };
    
      return mimeTypes[extension] || "application/octet-stream";
    }
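The lookup behavior can be checked with a trimmed copy of the table (a sketch for illustration; only two entries are kept here):

```typescript
// Trimmed sketch of the extension lookup above: case-insensitive extension,
// "application/octet-stream" fallback for unknown types.
function getMimeTypeSketch(filePath: string): string {
  const extension = filePath.split(".").pop()?.toLowerCase() || "";
  const mimeTypes: Record<string, string> = {
    jpg: "image/jpeg",
    json: "application/json",
  };
  return mimeTypes[extension] || "application/octet-stream";
}
```

Only the last extension is considered, so compound suffixes like .tar.gz fall through to the octet-stream default.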
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations provided, the description carries the full burden but offers minimal behavioral context. It doesn't disclose that this is a mutating operation (only implied by 'upload'), what permissions are required, rate limits, error conditions, or what a successful call returns (e.g., a CID). The mention of 'public' vs 'private' IPFS hints at access control but lacks detail.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is a single, efficient sentence that immediately states the core purpose and the two key input options. Every word earns its place with zero redundancy or unnecessary elaboration, making it easy to parse quickly.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 2/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

For a 7-parameter mutation tool with no annotations and no output schema, the description is insufficient. It doesn't explain what the tool returns (e.g., IPFS hash, file ID), error handling, authentication requirements, or how it differs from sibling tools like 'createSignedUploadUrl'. The context signals indicate high complexity that isn't adequately addressed.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is 100%, so the schema fully documents all 7 parameters. The description adds marginal value by clarifying the two primary input methods (URI vs base64) and mentioning auto-detection for fileName and mimeType, but doesn't provide additional semantic context beyond what's already in the schema descriptions.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 4/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the action ('Upload a file') and target ('to Pinata IPFS'), which is specific and unambiguous. It distinguishes from siblings like 'createSignedUploadUrl' or 'vectorizeFile' by focusing on direct file upload. However, it doesn't explicitly differentiate from 'updateFile' or 'addFileToGroup', which could be related operations.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 2/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description provides no guidance on when to use this tool versus alternatives like 'createSignedUploadUrl' for pre-signed URLs or 'addFileToGroup' for adding existing files to groups. It mentions the two input options (URI or base64) but doesn't explain when one method is preferred over the other or any prerequisites for using this tool.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
