Artificial Analysis MCP Server
An MCP (Model Context Protocol) server that provides LLM model pricing, speed metrics, and benchmark scores from Artificial Analysis.
Features
Get real-time pricing for 300+ LLM models (input/output/blended rates)
Compare speed metrics (tokens/sec, time to first token)
Access benchmark scores (Intelligence Index, Coding Index, MMLU-Pro, GPQA, and more)
Filter by provider (OpenAI, Anthropic, Google, etc.)
Sort by any metric
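To illustrate the filter-and-sort behavior the features above describe, here is a minimal client-side sketch. The record shape, field names, and all numeric values are illustrative assumptions, not real Artificial Analysis data:

```typescript
// Hypothetical model records; field names and numbers are
// illustrative assumptions, not real benchmark data.
interface ModelRow {
  name: string;
  creator: string;
  price_blended: number; // USD per 1M tokens
  intelligence_index: number;
}

const rows: ModelRow[] = [
  { name: "gpt-4o", creator: "OpenAI", price_blended: 4.38, intelligence_index: 41 },
  { name: "claude-3-5-haiku", creator: "Anthropic", price_blended: 1.6, intelligence_index: 35 },
  { name: "o3-mini", creator: "OpenAI", price_blended: 1.93, intelligence_index: 63 },
];

// "Filter by provider", then "sort by any metric" (descending).
const openai = rows
  .filter((r) => r.creator === "OpenAI")
  .sort((a, b) => b.intelligence_index - a.intelligence_index);

console.log(openai.map((r) => r.name)); // ["o3-mini", "gpt-4o"]
```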
Installation
Claude Code
```bash
claude mcp add artificial-analysis -e AA_API_KEY=your-key -- npx -y artificial-analysis-mcp
```

Or install from GitHub:

```bash
claude /mcp add https://github.com/davidhariri/artificial-analysis-mcp
```

Manual Configuration
Add to your Claude settings (~/.claude/settings.json):
```json
{
  "mcpServers": {
    "artificial-analysis": {
      "command": "npx",
      "args": ["-y", "artificial-analysis-mcp"],
      "env": {
        "AA_API_KEY": "your-api-key"
      }
    }
  }
}
```

Configuration
| Environment Variable | Required | Description |
|----------------------|----------|-------------|
| `AA_API_KEY` | Yes | Your Artificial Analysis API key |
Get your API key at artificialanalysis.ai.
Tools
list_models
List all available LLM models with optional filtering and sorting.
Parameters:
| Name | Type | Required | Description |
|------|------|----------|-------------|
|  | string | No | Filter by model creator (e.g., "OpenAI", "Anthropic") |
|  | string | No | Sort field (see below) |
|  | string | No | "asc" or "desc" (default: "desc") |
|  | number | No | Maximum results to return |
Sort fields: price_input, price_output, price_blended, speed, ttft, intelligence_index, coding_index, math_index, mmlu_pro, gpqa, release_date
Example usage:
"List the top 5 fastest models"
"Show me Anthropic models sorted by price"
"What are the cheapest models with high intelligence scores?"
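Under the hood, an MCP client translates such a query into a `tools/call` request. The sketch below shows one plausible payload; the argument names (`creator`, `sort_by`, `order`, `limit`) are assumptions for illustration, since the exact names may differ:

```typescript
// Hypothetical tools/call payload for list_models. The argument
// names (creator, sort_by, order, limit) are illustrative assumptions.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "list_models",
    arguments: {
      creator: "Anthropic",
      sort_by: "price_blended",
      order: "asc",
      limit: 5,
    },
  },
};

console.log(JSON.stringify(request.params, null, 2));
```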
get_model
Get detailed information about a specific model.
Parameters:
| Name | Type | Required | Description |
|------|------|----------|-------------|
|  | string | Yes | Model name or slug (e.g., "gpt-4o", "claude-4-5-sonnet") |
Returns: Complete model details including pricing, speed metrics, and all benchmark scores.
Example usage:
"Get pricing for GPT-4o"
"What are Claude 4.5 Sonnet's benchmark scores?"
Model Data
Each model includes:
Pricing: Input/output/blended rates per 1M tokens (USD)
Speed: Output tokens per second, time to first token
Benchmarks: Intelligence Index, Coding Index, Math Index, MMLU-Pro, GPQA, LiveCodeBench, and more
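A blended rate is a weighted average of the input and output prices. The 3:1 input-to-output weighting below is an assumption for illustration; check Artificial Analysis for the exact ratio they use:

```typescript
// Blended price as a weighted average of input and output rates
// (USD per 1M tokens). The default 3:1 input:output weighting is
// an illustrative assumption.
function blendedPrice(
  inputPerM: number,
  outputPerM: number,
  inputWeight = 3,
  outputWeight = 1
): number {
  return (inputPerM * inputWeight + outputPerM * outputWeight) / (inputWeight + outputWeight);
}

console.log(blendedPrice(2.5, 10)); // (2.5 * 3 + 10 * 1) / 4 = 4.375
```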
Development
```bash
# Install dependencies
npm install

# Build
npm run build

# Run locally
AA_API_KEY=your-key node dist/index.js
```

License
MIT