Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default

No arguments

Capabilities

Features and capabilities supported by this server

Capability | Details
tools | {}

Tools

Functions exposed to the LLM to take actions

Name | Description
read_file

Read a file and mark it as the active file. When you switch to a different file, the previous file is automatically summarized to just its public interface, reducing context size.

Supported file extensions for summarization: .rs, .py, .ts, .tsx, .js, .jsx, .php, .cs, .gd

For unsupported file types, returns full contents without tracking (same as standard file read).
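To illustrate what "summarized to just its public interface" means, here is a minimal sketch for Python source only. The server's actual summarizer is not published and handles many more languages; the function name and the regex-based approach below are assumptions for illustration.

```python
import re

def summarize_public_interface(source: str) -> str:
    """Toy sketch: keep only top-level def/class signatures and drop
    bodies and underscore-prefixed (conventionally private) names.
    Hypothetical helper -- not the server's real implementation."""
    lines = []
    # re.MULTILINE makes ^ match at the start of each line, so only
    # top-level (unindented) definitions are captured.
    for match in re.finditer(r"^(def|class)\s+(\w+)[^\n:]*:", source, re.MULTILINE):
        name = match.group(2)
        if not name.startswith("_"):
            lines.append(match.group(0).rstrip(":") + ": ...")
    return "\n".join(lines)

sample = "def public_api(x):\n    return x * 2\n\ndef _helper():\n    pass\n"
summary = summarize_public_interface(sample)
```

The point of the sketch: after switching away from a file, the context holds only signature-level lines like `def public_api(x): ...` instead of the full body.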

peek_file

Get a summary of a file's public interface without changing the active file. Useful for checking APIs of files you've already worked on.

Returns:

  • For the active file: full contents

  • For previously read files: cached summary (public structs, functions, traits, etc.)

  • For unsupported file types: full contents

edit_file

Edit a file by replacing a specific string. The file becomes (or remains) the active file.

The old_string must:

  • Match exactly, including whitespace and indentation

  • Appear exactly once in the file (for safety)

After editing, the file's cached summary is updated.
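The exact-match, exactly-once rule above can be sketched as follows. This is a hypothetical helper illustrating the safety check, not the server's actual internals.

```python
def edit_file_contents(contents: str, old_string: str, new_string: str) -> str:
    """Sketch of edit_file's matching rule (hypothetical helper):
    old_string must occur exactly once, matched verbatim including
    whitespace and indentation, before the replacement is applied."""
    count = contents.count(old_string)
    if count == 0:
        raise ValueError("old_string not found; match must be exact, "
                         "including whitespace and indentation")
    if count > 1:
        raise ValueError(f"old_string appears {count} times; it must be unique")
    return contents.replace(old_string, new_string, 1)
```

Requiring a unique match means an ambiguous edit fails loudly instead of silently changing the wrong occurrence.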

write_file

Write content to a file, creating it if it doesn't exist. The file becomes the active file.

Creates parent directories if needed.
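"Creates parent directories if needed" corresponds to the common pattern below; the function name is assumed for illustration.

```python
from pathlib import Path

def write_file(path: str, content: str) -> None:
    """Sketch of write_file's behavior (hypothetical helper):
    create any missing parent directories, then write the content,
    creating or overwriting the file."""
    p = Path(path)
    p.parent.mkdir(parents=True, exist_ok=True)
    p.write_text(content)
```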

file_status

Show the status of all tracked files including:

  • The currently active file (full contents in context)

  • Cached summaries with size comparison

  • Total context savings from compaction
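The "size comparison" and "context savings" a status report might show can be computed as in this sketch; the exact output format of file_status is not documented, so the formatting below is an assumption.

```python
def context_savings(full_size: int, summary_size: int) -> str:
    """Sketch of the per-file savings line file_status might report
    (hypothetical format): full size vs. cached summary size."""
    saved = full_size - summary_size
    pct = 100 * saved / full_size if full_size else 0
    return f"{full_size} -> {summary_size} chars ({pct:.0f}% saved)"
```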

forget_file

Remove a file from tracking. Useful for cleanup or when you no longer need a file's interface in context.

Prompts

Interactive templates invoked by user choice

Name | Description

No prompts

Resources

Contextual data attached and managed by the client

Name | Description

No resources


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/UBTCodeNinja/mcp-file-compaction'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.