create_files

Create up to 200 markdown files in a single batch call. Supports knowledge, project, or ctxnest destinations with automatic folder creation and partial success handling.

Instructions

Batch variant of create_file — create up to 200 markdown files in one call. Each item follows the same rules (project_id required if destination is project or ctxnest). SIDE EFFECTS: writes new files to disk and inserts FTS5 rows; missing folders are mkdir'd. Per-item failures are isolated to errors[] and the rest of the batch still commits — partial success is the norm, always inspect error_count. No external auth or rate limits. Returns {created_count, error_count, created, errors}. Use for bulk ingestion; for >200 items, page yourself.
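The description's advice to "page yourself" for more than 200 items can be sketched with a small client-side chunking helper. This is a hypothetical caller-side utility, not part of the server; `FileDef` mirrors the tool's input item shape and the 200-item cap comes from the schema's `.max(200)` constraint.

```typescript
// Split a large list of file definitions into batches the tool will accept.
interface FileDef {
  title: string;
  content: string;
  destination: "knowledge" | "project" | "ctxnest";
  project_id?: number;
  folder?: string;
  tags?: string[];
}

function chunkFiles(files: FileDef[], size = 200): FileDef[][] {
  const batches: FileDef[][] = [];
  for (let i = 0; i < files.length; i += size) {
    batches.push(files.slice(i, i + size));
  }
  return batches;
}
```

Each batch can then be passed as the `files` argument of a separate create_files call, checking `error_count` on every response.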

Input Schema

| Name  | Required | Description                        | Default |
| ----- | -------- | ---------------------------------- | ------- |
| files | Yes      | Files to create (max 200 per call) |         |
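A minimal valid call supplies at least one item. The example below uses hypothetical titles and content; it shows one knowledge-destined file and one project-destined file, where `project_id` is required because of the destination.

```typescript
// Example `files` argument for create_files; field names follow the input schema.
const input = {
  files: [
    {
      title: "Release Notes",
      content: "# Release Notes\n\nInitial draft.",
      destination: "knowledge" as const,
      folder: "docs",          // optional; missing folders are created
      tags: ["release"],       // optional
    },
    {
      title: "Architecture Overview",
      content: "# Overview\n",
      destination: "project" as const,
      project_id: 1,           // required when destination is "project" or "ctxnest"
    },
  ],
};
```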

Implementation Reference

  • MCP server tool registration for 'create_files' — batch variant of create_file. Accepts an array of up to 200 file definitions (each with title, content, destination, optional project_id, folder, tags). Iterates over them, calling the core createFile() per item, collecting created results and per-item errors. Returns {created_count, error_count, created, errors}.
    server.tool(
      "create_files",
      "Batch variant of `create_file` — create up to 200 markdown files in one call. Each item follows the same rules (project_id required if destination is `project` or `ctxnest`). SIDE EFFECTS: writes new files to disk and inserts FTS5 rows; missing folders are mkdir'd. Per-item failures are isolated to `errors[]` and the rest of the batch still commits — partial success is the norm, always inspect `error_count`. No external auth or rate limits. Returns `{created_count, error_count, created, errors}`. Use for bulk ingestion; for >200 items, page yourself.",
      {
        files: z
          .array(
            z.object({
              title: z.string(),
              content: z.string(),
              destination: z.enum(["knowledge", "project", "ctxnest"]),
              project_id: z.number().optional(),
              folder: z.string().optional(),
              tags: z.array(z.string()).optional(),
            })
          )
          .min(1)
          .max(200)
          .describe("Files to create (max 200 per call)"),
      },
      async ({ files }) => {
        const created: any[] = [];
        const errors: { index: number; title: string; error: string }[] = [];
        for (let i = 0; i < files.length; i++) {
          const f = files[i];
          try {
            const result = await createFile({
              title: f.title,
              content: f.content,
              destination: f.destination,
              projectId: f.project_id,
              folder: f.folder,
              tags: f.tags,
              dataDir,
            });
            created.push(annotateTokens(result));
          } catch (e: any) {
            errors.push({ index: i, title: f.title, error: e?.message ?? String(e) });
          }
        }
        return {
          content: [
            {
              type: "text",
              text: JSON.stringify(
                { created_count: created.length, error_count: errors.length, created, errors },
                null,
                2
              ),
            },
          ],
        };
      }
    );
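Because partial success is the norm, a caller typically inspects `errors[]` and resubmits only the failed items. A minimal sketch of that pattern follows; `BatchResult` is taken from the JSON payload the handler above returns, and `itemsToRetry` is a hypothetical caller-side helper.

```typescript
// Shape of the JSON payload the create_files handler returns.
interface BatchResult {
  created_count: number;
  error_count: number;
  created: unknown[];
  errors: { index: number; title: string; error: string }[];
}

// Pick out the original items that failed so they can be resubmitted,
// using the per-item index the handler records for each error.
function itemsToRetry<T>(items: T[], result: BatchResult): T[] {
  return result.errors.map((e) => items[e.index]);
}
```

A retry loop would call create_files again with just these items, possibly after fixing whatever each `error` string reports.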
  • Zod schema for the create_files tool input. Defines an array of objects with title (string), content (string), destination (enum: knowledge|project|ctxnest), optional project_id (number), optional folder (string), and optional tags (string[]). Array is limited to 1-200 items.
    {
      files: z
        .array(
          z.object({
            title: z.string(),
            content: z.string(),
            destination: z.enum(["knowledge", "project", "ctxnest"]),
            project_id: z.number().optional(),
            folder: z.string().optional(),
            tags: z.array(z.string()).optional(),
          })
        )
        .min(1)
        .max(200)
        .describe("Files to create (max 200 per call)"),
    }
  • Core createFile() function that creates a single file. Handles destination routing (knowledge, project, ctxnest), file path resolution, directory creation, disk write, SQLite insert (with FTS indexing and tags), git auto-commit, and rollback on DB failure. This is the underlying helper called for each item in the create_files batch.
    import { existsSync, mkdirSync, readFileSync, unlinkSync, writeFileSync } from "node:fs";
    import { dirname, join } from "node:path";

    export async function createFile(opts: CreateFileOptions): Promise<FileRecordWithContent> {
      const db = getDatabase();
      const { title, content, destination, projectId, folder, tags = [], dataDir, sourcePath } = opts;
    
      let filePath: string;
      let storageType: StorageType;
    
      const slug = slugify(title);
      if (!slug) {
        throw new Error("Invalid title: produces empty slug");
      }
      const filename = `${slug}.md`;
    
      if (destination === "knowledge") {
        const knowledgeDir = join(dataDir, "knowledge");
        mkdirSync(knowledgeDir, { recursive: true });
        filePath = folder
          ? assertPathInside(knowledgeDir, join(folder, filename))
          : assertPathInside(knowledgeDir, filename);
        storageType = "local";
      } else if (destination === "ctxnest") {
        if (!projectId) {
          throw new Error("projectId is required for ctxnest destination");
        }
        const project = db.prepare("SELECT slug FROM projects WHERE id = ?").get(projectId) as { slug: string } | undefined;
        if (!project) {
          throw new Error(`Project not found: ${projectId}`);
        }
        const projectDir = join(dataDir, "projects", project.slug);
        mkdirSync(projectDir, { recursive: true });
        filePath = folder
          ? assertPathInside(projectDir, join(folder, filename))
          : assertPathInside(projectDir, filename);
        storageType = "local";
      } else if (destination === "project") {
        if (!projectId) {
          throw new Error("projectId is required for project destination");
        }
        const project = db.prepare("SELECT path FROM projects WHERE id = ?").get(projectId) as { path: string | null } | undefined;
        if (!project || !project.path) {
          throw new Error(`Project path not found for project: ${projectId}`);
        }
        filePath = folder
          ? assertPathInside(project.path, join(folder, filename))
          : assertPathInside(project.path, filename);
        storageType = "reference";
      } else {
        throw new Error(`Unknown destination: ${destination}`);
      }
    
      mkdirSync(dirname(filePath), { recursive: true });
      // Stash pre-existing content (if any) so we can fully restore on a
      // DB-txn failure — otherwise writeFileSync below would leave the
      // caller's content sitting under a row that points elsewhere.
      let preExistingContent: Buffer | null = null;
      if (existsSync(filePath)) {
        try {
          preExistingContent = readFileSync(filePath);
        } catch (e) {
          console.warn("createFile: failed to stash pre-existing content for rollback:", e);
        }
      }
      writeFileSync(filePath, content, "utf8");
      const contentHash = computeHash(content);
    
      // Files + FTS + tag links in one txn so a partial failure can't leave
      // a row that's unsearchable forever.
      const insertStmt = db.prepare(`
        INSERT INTO files (path, title, project_id, storage_type, source_path, content_hash)
        VALUES (?, ?, ?, ?, ?, ?)
      `);
      const insertTagStmt = db.prepare("INSERT OR IGNORE INTO tags (name) VALUES (?)");
      const getTagStmt = db.prepare("SELECT id FROM tags WHERE name = ?");
      const linkTagStmt = db.prepare("INSERT INTO file_tags (file_id, tag_id) VALUES (?, ?)");
      const ftsStmt = db.prepare("INSERT INTO fts_index (rowid, title, content) VALUES (?, ?, ?)");
    
      let fileId: number;
      try {
        fileId = db.transaction(() => {
          const result = insertStmt.run(filePath, title, projectId || null, storageType, sourcePath || null, contentHash);
          const id = Number(result.lastInsertRowid);
    
          if (tags.length > 0) {
            for (const tagName of tags) {
              insertTagStmt.run(tagName);
              const tag = getTagStmt.get(tagName) as { id: number };
              linkTagStmt.run(id, tag.id);
            }
          }
    
          ftsStmt.run(id, title, content);
          return id;
        })();
      } catch (dbError) {
        // Roll back the file write so the watcher doesn't adopt the orphan
        // as an untitled record (losing the user-supplied title and tags).
        if (preExistingContent !== null) {
          try { writeFileSync(filePath, preExistingContent); } catch (e) {
            console.error("createFile: failed to restore pre-existing content after DB error:", e);
          }
        } else {
          try { unlinkSync(filePath); } catch {}
        }
        throw dbError;
      }
    
      const fileRecord = db.prepare("SELECT * FROM files WHERE id = ?").get(fileId) as FileRecord;
    
      // Reference files version against their project's own git, not the data dir.
      let gitWarning: string | undefined;
      try {
        let repoDir = dataDir;
        if (storageType === "reference" && projectId) {
          const project = db.prepare("SELECT path FROM projects WHERE id = ?").get(projectId) as { path: string | null } | undefined;
          if (project?.path) repoDir = project.path;
        }
        await commitFile(repoDir, filePath, `Create context file: ${title}`);
      } catch (error) {
        gitWarning = error instanceof Error ? error.message : String(error);
        console.warn("Git auto-commit failed during creation:", error);
      }
    
      return {
        ...fileRecord,
        content,
        ...(gitWarning ? { git_warning: gitWarning } : {}),
      };
    }
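createFile relies on a `slugify` helper that is not shown in this reference. A plausible minimal implementation (an assumption; the real one may differ) lowercases the title and collapses runs of non-alphanumeric characters into hyphens, which also explains the "empty slug" guard at the top of the function.

```typescript
// Hypothetical sketch of the slugify helper used to build `${slug}.md`.
function slugify(title: string): string {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // collapse punctuation/whitespace runs into hyphens
    .replace(/^-+|-+$/g, "");    // trim leading/trailing hyphens
}
```

A title consisting only of punctuation would slugify to an empty string, which is why createFile throws "Invalid title: produces empty slug".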
  • CreateFileOptions interface — the typed input structure for the core createFile() helper. Contains title, content, destination, optional projectId, folder, tags, dataDir, and sourcePath.
    export interface CreateFileOptions {
      title: string;
      content: string;
      destination: Destination;
      projectId?: number;
      folder?: string;
      tags?: string[];
      dataDir: string;
      sourcePath?: string;
    }
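The `assertPathInside` helper used throughout createFile is also not shown here. A common sketch of such a guard (an assumption about its behavior) resolves the joined path and rejects anything that escapes the base directory, e.g. via "../" segments in `folder`.

```typescript
import { resolve, sep } from "node:path";

// Hypothetical sketch: resolve `relative` under `base` and throw if the
// result escapes `base`; return the absolute path otherwise.
function assertPathInside(base: string, relative: string): string {
  const root = resolve(base);
  const target = resolve(root, relative);
  if (target !== root && !target.startsWith(root + sep)) {
    throw new Error(`Path escapes base directory: ${relative}`);
  }
  return target;
}
```

This kind of check is what lets createFile safely honor a caller-supplied `folder` without permitting path traversal out of the knowledge or project directory.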
Behavior: 5/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

Although the tool declares no structured annotations, the description covers all critical behaviors: file writes, FTS5 inserts, automatic folder creation, per-item failure isolation, partial success, the absence of auth and rate limits, and the return shape. Comprehensive for a batch tool.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 4/5

Is the description appropriately sized, front-loaded, and free of redundancy?

Information-dense but well-structured and front-loaded. Could be slightly more concise, but every sentence adds value.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 4/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Although no output schema is declared, the description explains the return shape and side effects. Complete enough for a batch-creation tool with a single parameter.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 4/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema coverage is 100%, but the description adds meaning beyond it: the project_id requirement depends on destination, and each item follows the same rules as create_file. Adds value beyond the schema.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

Clearly states it's a batch variant of create_file, creates up to 200 markdown files, and distinguishes from sibling create_file by mentioning batch ingestion.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 4/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

Explicitly states 'Use for bulk ingestion; for >200 items, page yourself.' and mentions partial success and inspecting error_count. Provides clear context, though no explicit exclusions for small batches.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
