tokencost-dev
Provides pricing information, context window specifications, and capabilities for LLM models across providers (e.g., GPT-4o, Claude Sonnet 4.5), including tools for calculating cost estimates based on input/output token counts.
Install in 30 seconds
Claude Code:
```bash
claude mcp add tokencost-dev -- npx -y tokencost-dev
```
Then ask: "How much would 1M input tokens cost on claude-sonnet-4-5?"
Cursor (.cursor/mcp.json):
```json
{
  "mcpServers": {
    "tokencost-dev": {
      "command": "npx",
      "args": ["-y", "tokencost-dev"]
    }
  }
}
```
No API keys. No accounts. No configuration files. Pricing data is fetched from the LiteLLM community registry and cached locally for 24 hours.
Tools
get_model_details
Look up pricing, context window, and capabilities for any model. Fuzzy matching means "sonnet 4.5" works just as well as "claude-sonnet-4-5-20250514".
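The fuzzy lookup can be pictured as a normalize-and-score pass over known model IDs. A minimal sketch, assuming a `difflib`-style similarity ratio (the server's actual matching logic may differ):

```python
import difflib
import re

def normalize(name: str) -> str:
    # Lowercase and strip separators so "sonnet 4.5" and
    # "claude-sonnet-4-5" compare on the same footing.
    return re.sub(r"[^a-z0-9]", "", name.lower())

def best_match(query: str, model_ids: list[str]) -> str:
    # Score each known model ID against the normalized query
    # and return the closest one.
    q = normalize(query)
    return max(
        model_ids,
        key=lambda m: difflib.SequenceMatcher(None, q, normalize(m)).ratio(),
    )
```

For example, `best_match("sonnet 4.5", ["claude-sonnet-4-5", "gpt-4o-mini"])` resolves to `"claude-sonnet-4-5"`.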
> "What are Claude Sonnet 4.5's pricing and capabilities?"
Model: claude-sonnet-4-5
Provider: anthropic | Mode: chat
Pricing (per 1M tokens):
Input: $3.00
Output: $15.00
Context Window:
Max Input: 200K
Max Output: 8K
Capabilities: vision, function_calling, parallel_function_calling

calculate_estimate
Estimate cost for a given number of input and output tokens.
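The underlying arithmetic is straightforward: each side is the token count divided by one million, times the per-1M rate. A sketch with prices hard-coded for illustration (the server reads them from the registry):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_per_1m: float, output_per_1m: float) -> float:
    # Cost per side = (token count / 1,000,000) * price per 1M tokens.
    input_cost = input_tokens / 1_000_000 * input_per_1m
    output_cost = output_tokens / 1_000_000 * output_per_1m
    return input_cost + output_cost

# claude-sonnet-4-5: $3.00 in / $15.00 out per 1M tokens
total = estimate_cost(1000, 500, 3.00, 15.00)  # 0.003 + 0.0075 = 0.0105
```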
> "How much will 1000 input + 500 output tokens cost on Claude Sonnet 4.5?"
Cost Estimate for claude-sonnet-4-5
Input: 1K tokens × $3.00/1M = $0.003000
Output: 500 tokens × $15.00/1M = $0.007500
─────────────────────────────
Total: $0.0105

compare_models
Find the most cost-effective models matching your requirements.
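The ranking can be approximated as a filter-then-sort over the registry. A sketch assuming a simple blended metric of input plus output price per 1M tokens (the server's actual ranking may weight these differently):

```python
from typing import NamedTuple

class Model(NamedTuple):
    id: str
    provider: str
    input_per_1m: float
    output_per_1m: float

def cheapest(models: list[Model], provider: str, n: int = 2) -> list[Model]:
    # Keep models from the requested provider, then rank by a
    # blended per-1M price (input + output) and take the top n.
    pool = [m for m in models if m.provider == provider]
    return sorted(pool, key=lambda m: m.input_per_1m + m.output_per_1m)[:n]
```

With the two OpenAI models shown above, `cheapest(models, "openai")` ranks `gpt-4o-mini` ($0.75 blended) ahead of `gpt-4o` ($20.00 blended).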
> "What are the cheapest OpenAI chat models?"
Top 2 most cost-effective models (provider: openai) (mode: chat):
1. gpt-4o-mini
Provider: openai | Mode: chat
Input: $0.15/1M | Output: $0.60/1M
Context: 128K in / 16K out
2. gpt-4o
Provider: openai | Mode: chat
Input: $5.00/1M | Output: $15.00/1M
Context: 128K in / 16K out

refresh_prices
Force re-fetch pricing data from the LiteLLM registry (cache is refreshed automatically every 24h).
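The cache behavior amounts to a timestamp check against a 24-hour TTL, with `refresh_prices` corresponding to a forced re-fetch. A hedged sketch; the cache path is illustrative and the registry URL is assumed to be LiteLLM's published price table:

```python
import json
import time
from pathlib import Path
from urllib.request import urlopen

TTL_SECONDS = 24 * 60 * 60  # automatic refresh window: 24h
CACHE = Path("prices.json")  # illustrative cache location
REGISTRY_URL = ("https://raw.githubusercontent.com/BerriAI/litellm/"
                "main/model_prices_and_context_window.json")

def is_fresh(path: Path, ttl: float = TTL_SECONDS) -> bool:
    # The cache is usable if the file exists and is younger than the TTL.
    return path.exists() and (time.time() - path.stat().st_mtime) < ttl

def load_prices(force: bool = False) -> dict:
    # force=True mirrors refresh_prices: re-fetch even if the cache is fresh.
    if force or not is_fresh(CACHE):
        CACHE.write_bytes(urlopen(REGISTRY_URL).read())
    return json.loads(CACHE.read_text())
```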
Docs
Full documentation at tokencost.dev
License
MIT