
llm-token-tracker

package.json (1.3 kB)
{
  "name": "llm-token-tracker",
  "version": "2.3.4",
  "description": "Token usage tracker for OpenAI and Claude APIs with MCP support - Updated 2025 pricing",
  "type": "module",
  "main": "dist/index.js",
  "bin": {
    "llm-token-tracker": "./dist/mcp-server.js"
  },
  "files": [
    "dist/**/*",
    "README.md",
    "LICENSE"
  ],
  "scripts": {
    "build": "tsc",
    "dev": "tsc --watch",
    "prepare": "npm run build"
  },
  "keywords": [
    "openai",
    "claude",
    "anthropic",
    "token-tracking",
    "mcp",
    "cost-tracking"
  ],
  "author": "wn01011",
  "license": "MIT",
  "engines": {
    "node": ">=20.0.0"
  },
  "dependencies": {
    "@modelcontextprotocol/sdk": "^1.0.0",
    "tiktoken": "^1.0.10"
  },
  "devDependencies": {
    "@types/node": "^20.0.0",
    "typescript": "^5.0.0"
  },
  "peerDependencies": {
    "openai": "^4.0.0",
    "@anthropic-ai/sdk": "^0.20.0"
  },
  "peerDependenciesMeta": {
    "openai": { "optional": true },
    "@anthropic-ai/sdk": { "optional": true }
  },
  "repository": {
    "type": "git",
    "url": "https://github.com/wn01011/llm-token-tracker.git"
  },
  "bugs": {
    "url": "https://github.com/wn01011/llm-token-tracker/issues"
  },
  "homepage": "https://github.com/wn01011/llm-token-tracker#readme"
}
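The manifest's "bin" entry exposes the MCP server as an executable named `llm-token-tracker`, and both the OpenAI and Anthropic SDKs are declared as optional peer dependencies, so you install only the one(s) you use. A minimal install sketch based on those fields (the exact setup steps are assumptions, not taken from the package's README):

```shell
# Install the tracker; the "bin" field maps the `llm-token-tracker`
# command to ./dist/mcp-server.js
npm install llm-token-tracker

# Peer dependencies are marked optional — add only the SDK(s) you need
npm install openai            # for OpenAI usage tracking
npm install @anthropic-ai/sdk # for Claude usage tracking

# Or run the bundled MCP server directly, without a permanent install
npx llm-token-tracker
```

Per the "engines" field, Node.js 20 or newer is required.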

MCP directory API

We expose all the information about listed MCP servers via our MCP directory API. For example:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/wn01011/llm-token-tracker'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.