# tokentoll

## Server Configuration

Describes the environment variables required to run the server.
| Name | Required | Description | Default |
|---|---|---|---|
| No arguments | | | |
## Capabilities

Features and capabilities supported by this server.
| Capability | Details |
|---|---|
| tools | `{"listChanged": false}` |
| prompts | `{"listChanged": false}` |
| resources | `{"subscribe": false, "listChanged": false}` |
| experimental | `{}` |
## Tools

Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| scanA | Scans a directory for LLM API calls and estimates monthly costs. Finds all LLM API call sites (OpenAI, Anthropic, etc.) under the given path and produces a cost estimate based on token counts and pricing.<br>**Args:** `path` – directory or file path to scan (defaults to the current directory); `calls_per_month` – assumed monthly call volume per call site (if not provided, the CLI default of 1000 is used).<br>**Returns:** a JSON string with the scan results, including call sites and cost estimates. |
| diffA | Compares LLM costs between two git refs. Shows which LLM call sites were added, removed, or changed between the base and head refs, along with the cost impact of those changes.<br>**Args:** `base_ref` – the base git ref (branch, tag, or commit) to compare from; `head_ref` – the head git ref to compare to (defaults to `HEAD`).<br>**Returns:** a JSON string with the diff results, including cost changes. |
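To make the `scanA` estimate concrete: a monthly cost for one call site is roughly `calls_per_month × tokens × price`. Here is a minimal sketch of that arithmetic in Python, using illustrative per-million-token prices; the `PRICING` table and `estimate_monthly_cost` helper are assumptions for this example, not tokentoll's actual pricing data or API.

```python
# Illustrative per-1M-token USD prices (assumptions, not tokentoll's pricing table)
PRICING = {
    "gpt-4o": {"input": 2.50, "output": 10.00},
    "claude-3-5-sonnet": {"input": 3.00, "output": 15.00},
}

def estimate_monthly_cost(model: str, input_tokens: int, output_tokens: int,
                          calls_per_month: int = 1000) -> float:
    """Estimate monthly USD cost for a single LLM call site."""
    price = PRICING[model]
    # Cost of one call: tokens scaled by the per-million-token rate
    per_call = (input_tokens * price["input"]
                + output_tokens * price["output"]) / 1_000_000
    return per_call * calls_per_month

# A call site sending ~500 input and ~200 output tokens to gpt-4o,
# at the default 1000 calls per month:
cost = estimate_monthly_cost("gpt-4o", 500, 200)
print(f"${cost:.2f}/month")  # → $3.25/month
```

Summing this figure over every call site found in the scan yields the repository-level monthly estimate that `scanA` reports.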
## Prompts

Interactive templates invoked by user choice.
| Name | Description |
|---|---|
No prompts | |
## Resources

Contextual data attached to and managed by the client.
| Name | Description |
|---|---|
No resources | |
## MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/Jwrede/tokentoll'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.