MCP File Context Server

by bsmi021

get_chunk_count

Determine the total number of chunks returned for a read_context request. Use this tool to plan chunk requests by specifying file path, encoding, size limits, and file types.

Instructions

Get the total number of chunks that will be returned for a read_context request. Use this tool FIRST before reading content to determine how many chunks you need to request. The parameters should match what you'll use in read_context.
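
For example, a client built on the TypeScript MCP SDK (@modelcontextprotocol/sdk) might call get_chunk_count before read_context roughly as follows. This is a minimal sketch, not the server's documented client code: the launch command, file paths, and argument values are assumptions, and read_context's own chunk-selection parameters should be taken from that tool's schema.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch and connect to the file context server over stdio.
// Command and args are placeholders; point them at your local install.
const transport = new StdioClientTransport({
  command: "node",
  args: ["path/to/mcp-file-context-server/build/index.js"],
});
const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Step 1: ask how many chunks read_context would return for these parameters.
const countResult = await client.callTool({
  name: "get_chunk_count",
  arguments: { path: "./src", fileTypes: ["ts"], recursive: true },
});
console.log(countResult.content);

// Step 2: call read_context with the SAME parameters, requesting each chunk
// in turn (see read_context's schema for how chunks are selected).
```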

Input Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| encoding | No | File encoding (e.g., utf8, ascii, latin1) | utf8 |
| fileTypes | No | File extension(s) to include WITHOUT dots (e.g. ["ts", "js", "py"] or just "ts"). Empty/undefined means all files. | [] |
| maxSize | No | Maximum file size in bytes. Files larger than this will be chunked. | 1048576 |
| path | Yes | Path to file or directory | |
| recursive | No | Whether to read directories recursively (includes subdirectories) | true |
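
For reference, the parameters above can also be expressed as a TypeScript interface. This is a sketch derived from the table, not a type exported by the server; the defaults noted in the comments come from the JSON Schema below.

```typescript
// Arguments accepted by get_chunk_count (and mirrored by read_context).
// Sketch only; the server does not export this type.
interface GetChunkCountArgs {
  /** Path to file or directory (required). */
  path: string;
  /** File encoding, e.g. "utf8", "ascii", "latin1". Default: "utf8". */
  encoding?: string;
  /** Extensions WITHOUT dots, e.g. ["ts", "js"] or just "ts". Empty means all files. */
  fileTypes?: string[] | string;
  /** Maximum file size in bytes before a file is chunked. Default: 1048576 (1 MiB). */
  maxSize?: number;
  /** Whether directories are read recursively. Default: true. */
  recursive?: boolean;
}
```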

Input Schema (JSON Schema)

{ "properties": { "encoding": { "default": "utf8", "description": "File encoding (e.g., utf8, ascii, latin1)", "type": "string" }, "fileTypes": { "default": [], "description": "File extension(s) to include WITHOUT dots (e.g. [\"ts\", \"js\", \"py\"] or just \"ts\"). Empty/undefined means all files.", "items": { "type": "string" }, "type": [ "array", "string" ] }, "maxSize": { "default": 1048576, "description": "Maximum file size in bytes. Files larger than this will be chunked.", "type": "number" }, "path": { "description": "Path to file or directory", "type": "string" }, "recursive": { "default": true, "description": "Whether to read directories recursively (includes subdirectories)", "type": "boolean" } }, "required": [ "path" ], "type": "object" }

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bsmi021/mcp-file-context-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.