Glama

Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default

No arguments

Capabilities

Features and capabilities supported by this server

Capability | Details
tools | { "listChanged": false }
prompts | { "listChanged": false }
resources | { "subscribe": false, "listChanged": false }
experimental | {}

Tools

Functions exposed to the LLM to take actions

Name | Description

scan

Scan a directory for LLM API calls and estimate monthly costs.

Finds all LLM API call sites (OpenAI, Anthropic, etc.) in the given path and produces a cost estimate based on token counts and pricing.

Args:
- path: Directory or file path to scan. Defaults to the current directory.
- calls_per_month: Assumed monthly call volume per call site. If not provided, the CLI default (1000) is used.

Returns: JSON string with the scan results including call sites and cost estimates.
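An MCP client invokes this tool through the standard `tools/call` JSON-RPC method. A minimal sketch of such a request follows; the argument values (`./src`, `5000`) are illustrative, not defaults:

```python
import json

# JSON-RPC 2.0 request an MCP client would send to invoke the scan tool.
# Argument values here are illustrative examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "scan",
        "arguments": {
            "path": "./src",          # directory to scan for LLM call sites
            "calls_per_month": 5000,  # assumed monthly call volume per site
        },
    },
}

print(json.dumps(request, indent=2))
```

The server's response carries the scan results as a JSON string in the tool result's text content, per the MCP specification.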

diff

Compare LLM costs between two git refs.

Shows which LLM call sites were added, removed, or changed between the base and head refs, along with the cost impact of those changes.

Args:
- base_ref: The base git ref (branch, tag, or commit) to compare from.
- head_ref: The head git ref to compare to. Defaults to HEAD.

Returns: JSON string with the diff results including cost changes.
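The corresponding `tools/call` request for this tool looks like the sketch below; the ref values (`main`, `HEAD`) are illustrative:

```python
import json

# JSON-RPC 2.0 request an MCP client would send to invoke the diff tool.
# Ref values here are illustrative examples.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "diff",
        "arguments": {
            "base_ref": "main",  # compare from this ref
            "head_ref": "HEAD",  # the documented default when omitted
        },
    },
}

print(json.dumps(request, indent=2))
```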

Prompts

Interactive templates invoked by user choice

Name | Description

No prompts

Resources

Contextual data attached and managed by the client

Name | Description

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Jwrede/tokentoll'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.