Provides tools to search and retrieve detailed metadata for AI models, including pricing, context limits, and supported capabilities like tool calling and reasoning.
## Features

Models PLUS provides a comprehensive AI model catalog with modern tooling:

### Core Features
- **Unified REST API** - Advanced search and filtering for 100+ AI models
- **Model Context Protocol (MCP)** - Native MCP support with 4 powerful tools
- **Real-time Data** - Fresh data from the models.dev database
- **Lightning Fast** - Built with the Bun runtime and SST v3
### Developer Experience
- **Zero Config** - Biome + Ultracite for ultra-fast formatting and linting
- **TypeScript** - Full type safety with strict TypeScript configuration
- **Cloudflare Workers** - Global edge deployment with SST
### Rich Metadata
- **Comprehensive Model Info** - Pricing, limits, capabilities, modalities
- **Provider Details** - Environment variables, documentation, integrations
- **Advanced Filtering** - Search by cost, context length, features, and more
Public API: https://modelsplus.quivr.tech
## Quick Start

### Try the Public API
```bash
# List latest models
curl "https://modelsplus.quivr.tech/v1/models?limit=5"

# Find reasoning-capable models
curl "https://modelsplus.quivr.tech/v1/models?reasoning=true"

# Get specific model details
curl "https://modelsplus.quivr.tech/v1/models/openai:gpt-4o"
```

### Local Development
```bash
# Install dependencies
bun install

# Start development server
bun run dev

# Build for production
bun run build
```

## Installation
### 📋 Requirements

- **Bun 1.2.21** - Runtime and package manager
- **Node.js types** - For tooling compatibility (bundled via SST)
### Quick Install

```bash
# Install dependencies
bun install

# Generate JSON assets from vendor data
cd packages/api && bun run generate && bun run build
```

## Development
### Useful Scripts

- `bun run build` - Build all workspaces
- `bun run dev` - SST dev with the Cloudflare Worker running locally
- `bun run dev:api` - Direct Worker dev for the API only
- `bun run deploy` - Deploy via SST to Cloudflare Workers
- `bun run sync:upstream` - Sync the vendor subtree
### Development Setup

Generate JSON assets from the vendor TOML files:

```bash
cd packages/api
bun run generate
bun run build
```

Run the development servers:

```bash
# SST Dev (recommended)
bun run dev

# Direct Worker dev
cd packages/api && bun run dev
```

> Note: The SST config (`sst.config.ts`) auto-builds `@modelsplus/api` and exposes the Worker URL.
## API Guide

### Authentication

No authentication required. The API is publicly accessible.

### Base URL
`https://modelsplus.quivr.tech`

### Response Format

All API responses return JSON. Error responses include:

```json
{
  "error": "Error message",
  "status": 400
}
```

### Rate Limits

Currently no rate limiting is enforced, but please be respectful.
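Clients can turn the error shape shown under Response Format into exceptions instead of checking it ad hoc. A minimal sketch; the `raise_for_api_error` helper is illustrative, not part of the API:

```python
def raise_for_api_error(payload: dict) -> dict:
    """Raise if the payload matches the documented error shape
    ({"error": ..., "status": ...}); otherwise return it unchanged."""
    if isinstance(payload, dict) and "error" in payload and "status" in payload:
        raise RuntimeError(f"API error {payload['status']}: {payload['error']}")
    return payload

# A successful payload passes through untouched:
ok = raise_for_api_error({"id": "openai:gpt-4o", "name": "GPT-4o"})
```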
### Query Parameters

#### Models API (`/v1/models`)
| Parameter | Type | Description | Example |
| --- | --- | --- | --- |
| — | string | Search query (model name, provider, etc.) | — |
| — | string | Filter by provider | — |
| — | boolean | Filter by tool calling support | — |
| — | boolean | Filter by attachment support | — |
| `reasoning` | boolean | Filter by reasoning capabilities | `reasoning=true` |
| — | boolean | Filter by temperature support | — |
| — | boolean | Filter by open weights availability | — |
| — | number | Minimum input cost filter | — |
| — | number | Maximum input cost filter | — |
| — | number | Minimum output cost filter | — |
| — | number | Maximum output cost filter | — |
| — | number | Minimum context length | — |
| — | number | Maximum context length | — |
| — | number | Minimum output limit | — |
| — | number | Maximum output limit | — |
| `modalities` | string | Comma-separated modalities | `modalities=image` |
| — | string | Released after date (ISO) | — |
| — | string | Released before date (ISO) | — |
| — | string | Updated after date (ISO) | — |
| — | string | Updated before date (ISO) | — |
| — | string | Sort field | — |
| — | string | Sort order | — |
| `limit` | number | Maximum results (default: unlimited) | `limit=5` |
| — | number | Skip number of results | — |
| — | string | Comma-separated fields to return | — |
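A request URL combining several of these filters can be assembled with the standard library. This is a sketch that uses only parameter names appearing elsewhere on this page (`reasoning`, `modalities`, `limit`):

```python
from urllib.parse import urlencode

BASE = "https://modelsplus.quivr.tech/v1/models"

def build_models_url(**filters) -> str:
    """Build a /v1/models query URL from keyword filters."""
    # Booleans serialize as lowercase strings, matching the curl examples.
    params = {k: str(v).lower() if isinstance(v, bool) else v
              for k, v in filters.items()}
    return f"{BASE}?{urlencode(params)}"

url = build_models_url(reasoning=True, modalities="image", limit=5)
# → https://modelsplus.quivr.tech/v1/models?reasoning=true&modalities=image&limit=5
```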
#### Providers API (`/v1/providers`)
| Parameter | Type | Description | Example |
| --- | --- | --- | --- |
| — | string | Search query (provider name) | — |
| — | string | Filter by environment variable | — |
| — | string | Filter by npm package | — |
| — | number | Maximum results | — |
| — | number | Skip number of results | — |
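Filtered queries return model records shaped like the Model Object Schema below. As a sketch of putting the `cost` block to work, here is a request-price estimator; it assumes the cost fields are USD per 1,000 tokens, which matches the GPT-4o figures shown, so verify the units before relying on it:

```python
def estimate_cost(model: dict, input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD price of one request from a model record,
    assuming cost fields are USD per 1,000 tokens (verify against models.dev)."""
    cost = model["cost"]
    return (input_tokens / 1000) * cost["input"] + (output_tokens / 1000) * cost["output"]

gpt4o = {"cost": {"input": 0.0025, "output": 0.01}}
price = estimate_cost(gpt4o, input_tokens=2000, output_tokens=500)  # 0.005 + 0.005 = 0.01
```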
### Model Object Schema

```json
{
  "id": "openai:gpt-4o",
  "provider": "openai",
  "name": "GPT-4o",
  "release_date": "2024-05-13",
  "last_updated": "2024-08-06",
  "attachment": true,
  "reasoning": false,
  "temperature": true,
  "tool_call": true,
  "open_weights": false,
  "knowledge": "2023-10",
  "cost": {
    "input": 0.0025,
    "output": 0.01,
    "cache_read": 0.00125,
    "cache_write": 0.00625
  },
  "limit": {
    "context": 128000,
    "output": 16384
  },
  "modalities": {
    "input": ["text", "image"],
    "output": ["text"]
  }
}
```

### Provider Object Schema
```json
{
  "id": "openai",
  "name": "OpenAI",
  "env": ["OPENAI_API_KEY"],
  "npm": "openai",
  "api": "https://api.openai.com/v1",
  "doc": "https://platform.openai.com/docs"
}
```

## 🔗 API Endpoints
Base URL: https://modelsplus.quivr.tech
| Method | Endpoint | Description |
| --- | --- | --- |
| — | — | Health/status check |
| GET | `/mcp` | MCP discovery |
| GET | `/v1/models` | List/search models |
| — | — | Count models after filters |
| GET | `/v1/models/{id}` | Get specific model details |
| GET | `/v1/providers` | List/search providers |
| — | — | Count providers after filters |
| POST | `/mcp` | MCP over HTTP (JSON-RPC) |
| — | — | Alternate MCP endpoint |
## Code Examples

**JavaScript/TypeScript:**
```javascript
// Search models
const models = await fetch('https://modelsplus.quivr.tech/v1/models?reasoning=true&limit=5')
  .then(res => res.json());

// Get a specific model
const model = await fetch('https://modelsplus.quivr.tech/v1/models/openai:gpt-4o')
  .then(res => res.json());
```

**Python:**
```python
import requests

# Find vision-capable models
response = requests.get('https://modelsplus.quivr.tech/v1/models',
                        params={'modalities': 'image', 'limit': 5})
models = response.json()
```

## MCP Integration
Models PLUS provides native Model Context Protocol (MCP) support for seamless integration with AI assistants.
### Available Tools

- `search_models` - Advanced search and filtering for AI models
- `get_model` - Detailed information about a specific model
- `search_providers` - Search and filter AI providers
- `get_provider` - Detailed provider information
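Calls to these tools travel over the MCP endpoint as JSON-RPC 2.0 `tools/call` requests. A sketch of the envelope in Python; the `query` argument name is an assumption, so check the server's actual tool schemas via `tools/list` first:

```python
import json

def tools_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 tools/call request body for an MCP endpoint."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical argument shape for search_models:
body = tools_call("search_models", {"query": "gpt-4"})
```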
### Quick Setup

#### Claude Desktop

Add to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "models-plus": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/sdk", "server", "https://modelsplus.quivr.tech/mcp"]
    }
  }
}
```

#### Cursor
Configure MCP server with URL: https://modelsplus.quivr.tech/mcp
#### Other MCP Clients
For any MCP-compatible client, use: https://modelsplus.quivr.tech/mcp
### Usage Examples

Once integrated, use natural language:

- "Find all GPT-4 models from OpenAI"
- "Show me reasoning-capable models under $1 per million tokens"
- "What are the specs for Claude 3 Opus?"
- "Which providers support tool calling?"
### Direct HTTP API
```bash
# Discover capabilities
curl "https://modelsplus.quivr.tech/mcp"

# List available tools
curl -s "https://modelsplus.quivr.tech/mcp" \
  -X POST \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
```

## Data Source
Model and provider metadata sourced from models.dev TOML files. The build process (packages/api/src/generate.ts) converts these into optimized JSON artifacts for the API and MCP handlers.
## Deployment

Deploys via SST to Cloudflare Workers:

```bash
bun run deploy
```

The SST config creates a `sst.cloudflare.Worker` with global edge deployment.
## Contributing

We welcome contributions! Here's how to get started:

1. Fork the repository and create a feature branch
2. Install dependencies:
   ```bash
   bun install
   ```
3. Build and ensure tests pass:
   ```bash
   bun run build
   ```
4. Format and lint:
   ```bash
   npx ultracite format && npx ultracite lint
   ```
5. Test your changes thoroughly
6. Submit a pull request with a clear description
## Acknowledgments
Built on top of models.dev - a comprehensive open-source database of AI model specifications, pricing, and capabilities maintained by the SST team.