TokenCost MCP Server
An MCP (Model Context Protocol) server that provides real-time LLM token pricing data for 60+ AI models across 15 providers.
Query, compare, and estimate costs for models from OpenAI, Anthropic, Google, Meta, xAI, Mistral, DeepSeek, and more — directly from your AI assistant.
Built by TokenCost — the free LLM token cost calculator.
Tools
| Tool | Description |
| --- | --- |
| `tokencost_get_model_pricing` | Get pricing for a specific model, including input/output cost per 1M tokens, context window size, and max output tokens |
| `tokencost_compare_models` | Side-by-side pricing comparison of 2–10 models |
| `tokencost_estimate_cost` | Calculate the USD cost for given input and output token counts, broken down by input, output, and total |
| `tokencost_find_cheapest` | Find the cheapest models, with optional provider and minimum-context filters and sorting by input, output, or combined cost |
| `tokencost_list_models` | List all available models (IDs, names, providers), optionally filtered by provider |
| `tokencost_list_providers` | List all providers with model counts and pricing ranges |
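As a sketch of the arithmetic behind `tokencost_estimate_cost` (the function and the per-1M-token rates below are illustrative placeholders, not the server's actual implementation or real pricing):

```typescript
// Illustrative sketch of per-1M-token cost arithmetic; the rates here
// are placeholders, not real model pricing from the server.
function estimateCost(
  inputTokens: number,
  outputTokens: number,
  inputPricePer1M: number,  // USD per 1M input tokens
  outputPricePer1M: number, // USD per 1M output tokens
) {
  const inputCost = (inputTokens / 1_000_000) * inputPricePer1M;
  const outputCost = (outputTokens / 1_000_000) * outputPricePer1M;
  return { inputCost, outputCost, totalCost: inputCost + outputCost };
}

// 500K input + 100K output at $3 / $15 per 1M tokens:
// $1.50 input + $1.50 output = $3.00 total
console.log(estimateCost(500_000, 100_000, 3, 15));
```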
Quick Start
Claude Desktop / Cursor / Windsurf
Add to your MCP config:
```json
{
  "mcpServers": {
    "tokencost": {
      "command": "npx",
      "args": ["-y", "tokencost-mcp-server"]
    }
  }
}
```
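For Claude Desktop this file is claude_desktop_config.json (on macOS, under ~/Library/Application Support/Claude/); Cursor and Windsurf read the equivalent block from their own MCP settings files.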
From Source
```bash
git clone https://github.com/ankit-aglawe/tokencost-mcp-server
cd tokencost-mcp-server
npm install
npm run build
npm start
```
Example Usage
"How much would it cost to process 1M input tokens with GPT-5?"
→ Uses tokencost_estimate_cost with model="gpt-5", input_tokens=1000000, output_tokens=0
"Compare Claude Sonnet 4.6 vs GPT-5 vs Gemini 3 Pro pricing"
→ Uses tokencost_compare_models with ["claude-sonnet-4.6", "gpt-5", "gemini-3-pro"]
"What's the cheapest model with at least 200K context?"
→ Uses tokencost_find_cheapest with min_context=200000
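Outside of an AI assistant, the same tools can be called programmatically over stdio. Below is a minimal sketch using the official MCP TypeScript SDK (@modelcontextprotocol/sdk, a separate dependency, not part of this server); the tool name and argument names match the first example above:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio, the same way the MCP config above does.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "tokencost-mcp-server"],
});

const client = new Client({ name: "tokencost-example", version: "1.0.0" });
await client.connect(transport);

// Same call shape as the "1M input tokens with GPT-5" example above.
const result = await client.callTool({
  name: "tokencost_estimate_cost",
  arguments: { model: "gpt-5", input_tokens: 1_000_000, output_tokens: 0 },
});
console.log(result.content);

await client.close();
```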
Supported Providers
OpenAI, Anthropic, Google, xAI, Meta, Mistral, DeepSeek, Alibaba (Qwen), Amazon (Nova), NVIDIA, Cohere, Perplexity, Moonshot (Kimi), Zhipu (GLM), MiniMax
Pricing Data
Pricing is kept accurate and up to date by the TokenCost team. We track official provider announcements and update pricing as soon as changes are published — new models, price cuts, and deprecations are reflected within days.
If you notice outdated pricing or a missing model, open an issue and we'll get it updated.
License
MIT