TOON MCP Server


MCP (Model Context Protocol) server for TOON (Token-Oriented Object Notation) encoding. Reduce LLM token usage by 50-70% when sending structured data.

What is TOON?

TOON is a compact data format optimized for LLM input. Instead of repeating field names for every object, it uses a header-based format:

JSON (1041 tokens):

[ {"id": 1, "name": "Product A", "price": 99.99}, {"id": 2, "name": "Product B", "price": 149.99} ]

TOON (389 tokens):

[id,name,price]
1,Product A,99.99
2,Product B,149.99

Result: about 62% fewer tokens, which translates directly into roughly 62% lower input-token cost for this payload.

Installation

Quick Start (npx - no install needed)

Add to your MCP settings:

Claude Desktop (~/Library/Application Support/Claude/claude_desktop_config.json):

{ "mcpServers": { "toon": { "command": "npx", "args": ["-y", "toon-mcp-server"] } } }

Claude Code (~/.claude/settings.json):

{ "mcpServers": { "toon": { "command": "npx", "args": ["-y", "toon-mcp-server"] } } }

Global Install

npm install -g toon-mcp-server

Then add to your MCP settings:

{ "mcpServers": { "toon": { "command": "toon-mcp" } } }

As Claude Code Skill

# Download the skill
curl -o ~/.claude/skills/toon.md https://raw.githubusercontent.com/elminson/toon-mcp/main/skills/toon.md

Then use /toon in Claude Code.

Available Tools

toon_encode

Convert data to TOON format.

Supported formats: JSON, CSV, TSV, XML, HTML tables, YAML

Input: [{"name":"Alice","age":30},{"name":"Bob","age":25}] Output: [name,age] Alice,30 Bob,25

toon_decode

Convert TOON back to JSON.
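
For example, decoding the output of the toon_encode example above reproduces the original array (illustrative; the exact JSON whitespace may differ):

Input:
[name,age]
Alice,30
Bob,25

Output:
[{"name":"Alice","age":30},{"name":"Bob","age":25}]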

toon_analyze

Analyze data and show potential token/cost savings.

toon_optimize_prompt

Find data sections in a prompt and convert them to TOON automatically.
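
A sketch of the intended behavior, using a made-up prompt (the exact output formatting may differ):

Before:
Summarize these orders: [{"id":1,"total":99.99},{"id":2,"total":149.99}]

After:
Summarize these orders:
[id,total]
1,99.99
2,149.99

The surrounding prose is left untouched; only the embedded data section is re-encoded.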

Usage Examples

In Claude Desktop/Code (with MCP)

Just ask Claude to use the tools:

  • "Encode this JSON to TOON: [...]"

  • "Analyze how much I'd save converting this data to TOON"

  • "Optimize this prompt for token efficiency"

Programmatic (Node.js)

const { ToonEncoder } = require('toon-mcp-server/src/toon-encoder');

// Encode
const data = [
  { id: 1, name: 'Test', price: 99.99 },
  { id: 2, name: 'Test 2', price: 149.99 },
];
const toon = ToonEncoder.encode(data);

// Get stats
const json = JSON.stringify(data);
const stats = ToonEncoder.getStats(json, toon);
console.log(stats.savings.percent); // "64.5%"

// Decode
const decoded = ToonEncoder.decode(toon);

Benchmarks

Tested with OpenAI GPT-4o-mini:

Dataset Size    JSON Tokens    TOON Tokens    Savings
5 items         383            192            49.9%
20 items        1,394          530            62%
50 items        3,412          1,204          64.7%
100 items       6,800          2,400          ~65%
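
Savings is simply (JSON tokens − TOON tokens) / JSON tokens; for the 50-item row that is (3,412 − 1,204) / 3,412 ≈ 64.7%.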

Cost Savings at Scale

Volume          GPT-4o-mini     GPT-4o           Claude Sonnet
1M requests     $489 saved      $8,158 saved     $9,789 saved
10M requests    $4,890 saved    $81,580 saved    $97,890 saved

When to Use TOON

Best for:

  • Arrays of objects with same structure (tables, lists, records)

  • API responses, database results

  • Large datasets sent to LLMs

  • Cost optimization at scale

⚠️ Less effective for:

  • Deeply nested, non-uniform data

  • Small payloads (<5 items)

  • Data with many unique field structures
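
When in doubt, measure rather than guess: encode a sample of the payload and check the reported savings. Here is a minimal sketch using the ToonEncoder API from the Programmatic example above; the uniformity check and the 30% threshold are illustrative assumptions, not part of the library:

const { ToonEncoder } = require('toon-mcp-server/src/toon-encoder');

// Heuristic pre-check: TOON pays off for arrays of similarly shaped objects.
function isGoodToonCandidate(data, minSavingsPercent = 30) {
  if (!Array.isArray(data) || data.length < 5) return false;

  // Require every item to expose the same set of keys.
  const shape = JSON.stringify(Object.keys(data[0]).sort());
  const uniform = data.every(
    (item) =>
      item &&
      typeof item === 'object' &&
      JSON.stringify(Object.keys(item).sort()) === shape
  );
  if (!uniform) return false;

  // Measure the actual savings instead of estimating.
  const json = JSON.stringify(data);
  const toon = ToonEncoder.encode(data);
  const stats = ToonEncoder.getStats(json, toon);
  return parseFloat(stats.savings.percent) >= minSavingsPercent;
}

If the check fails, sending the original JSON is usually the simpler choice.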

Contributing

Pull requests welcome! Please open an issue first to discuss changes.

License

MIT
