tokencost-dev

by atriumn

Install in 30 seconds

Claude Code:

claude mcp add tokencost-dev -- npx -y tokencost-dev

Then ask: "How much would 1M input tokens cost on claude-sonnet-4-5?"

Cursor (.cursor/mcp.json):

{
  "mcpServers": {
    "tokencost-dev": {
      "command": "npx",
      "args": ["-y", "tokencost-dev"]
    }
  }
}

No API keys. No accounts. No configuration files. Pricing data is fetched from the LiteLLM community registry and cached locally for 24 hours.
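
The fetch-and-cache behavior described above can be sketched roughly as follows. This is an illustrative Python sketch, not tokencost-dev's actual implementation (which is a Node package); the cache path and internals are assumptions, though the registry URL is LiteLLM's public pricing file.

```python
import json, time, urllib.request
from pathlib import Path

# Assumed values for illustration; tokencost-dev's real cache location
# and layout are not documented here.
REGISTRY_URL = ("https://raw.githubusercontent.com/BerriAI/litellm/"
                "main/model_prices_and_context_window.json")
CACHE_PATH = Path("tokencost-prices.json")
TTL_SECONDS = 24 * 60 * 60  # refresh at most once per 24 hours

def is_fresh(mtime: float, now: float, ttl: float = TTL_SECONDS) -> bool:
    """A cache file written at `mtime` is still usable at time `now`."""
    return now - mtime < ttl

def load_prices() -> dict:
    """Return the pricing table, hitting the network only when the cache is stale."""
    if CACHE_PATH.exists() and is_fresh(CACHE_PATH.stat().st_mtime, time.time()):
        return json.loads(CACHE_PATH.read_text())
    with urllib.request.urlopen(REGISTRY_URL) as resp:
        data = json.loads(resp.read())
    CACHE_PATH.write_text(json.dumps(data))
    return data
```

The 24-hour TTL is why no account or API key is needed: pricing is public data, re-read from disk on every call and re-fetched only when stale.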

Tools

get_model_details

Look up pricing, context window, and capabilities for any model. Fuzzy matching means "sonnet 4.5" works just as well as "claude-sonnet-4-5-20250514".

> "What are Claude Sonnet 4.5's pricing and capabilities?"

Model: claude-sonnet-4-5
Provider: anthropic | Mode: chat

Pricing (per 1M tokens):
  Input:  $3.00
  Output: $15.00

Context Window:
  Max Input:  200K
  Max Output: 8K

Capabilities: vision, function_calling, parallel_function_calling

calculate_estimate

Estimate cost for a given number of input and output tokens.

> "How much will 1000 input + 500 output tokens cost on Claude Sonnet 4.5?"

Cost Estimate for claude-sonnet-4-5

  Input:  1K tokens × $3.00/1M  = $0.003000
  Output: 500 tokens × $15.00/1M = $0.007500
  ─────────────────────────────
  Total:  $0.0105
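
The arithmetic behind this estimate is plain per-million-token scaling. A minimal sketch (the function name is illustrative, not the server's internal API):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_per_m: float, output_per_m: float) -> float:
    """Cost in dollars given per-1M-token rates for input and output."""
    return input_tokens / 1e6 * input_per_m + output_tokens / 1e6 * output_per_m

# 1000 input at $3.00/1M = $0.003, plus 500 output at $15.00/1M = $0.0075
cost = estimate_cost(1000, 500, 3.00, 15.00)  # -> 0.0105
```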

compare_models

Find the most cost-effective models matching your requirements.

> "What are the cheapest OpenAI chat models?"

Top 2 most cost-effective models (provider: openai) (mode: chat):

1. gpt-4o-mini
   Provider: openai | Mode: chat
   Input: $0.15/1M | Output: $0.60/1M
   Context: 128K in / 16K out

2. gpt-4o
   Provider: openai | Mode: chat
   Input: $5.00/1M | Output: $15.00/1M
   Context: 128K in / 16K out
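
A ranking like the one above can be produced by filtering on provider/mode and sorting by a blended per-million rate. This is a hypothetical sketch: the weighting and the `cheapest` helper are assumptions, since the server's actual ranking formula is not documented.

```python
# Two-model catalog mirroring the sample output above.
MODELS = [
    {"name": "gpt-4o", "provider": "openai", "mode": "chat",
     "input": 5.00, "output": 15.00},
    {"name": "gpt-4o-mini", "provider": "openai", "mode": "chat",
     "input": 0.15, "output": 0.60},
]

def cheapest(models, provider=None, mode=None, input_weight=0.75):
    """Filter by provider/mode, then sort by a blended $/1M rate.

    The 75/25 input/output weighting is an illustrative assumption.
    """
    pool = [m for m in models
            if (provider is None or m["provider"] == provider)
            and (mode is None or m["mode"] == mode)]
    blended = lambda m: input_weight * m["input"] + (1 - input_weight) * m["output"]
    return sorted(pool, key=blended)

ranked = cheapest(MODELS, provider="openai", mode="chat")
# gpt-4o-mini sorts first, as in the sample output
```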

refresh_prices

Force re-fetch pricing data from the LiteLLM registry (cache is refreshed automatically every 24h).

Docs

Full documentation at tokencost.dev

License

MIT
