model-hub-mcp
An MCP (Model Context Protocol) server that fetches AI model information from OpenAI, Anthropic, and Google.
Features
Multi-provider Support: works with three providers - OpenAI, Anthropic, and Google AI
List Models: Retrieve a list of available models from each provider
Get Model Details: Fetch detailed information about specific models
Unified Retrieval: Batch fetch model information from all configured providers
Related MCP server: OpenAI and Claude MCP
Quick Start (npx)
Note: The package will be downloaded from npm on first run.
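Assuming the package is published to npm under the project's name (`model-hub-mcp`), it can be launched with npx:

```shell
# Downloads (on first run) and starts the MCP server; package name assumed from the project title
npx model-hub-mcp
```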
Installation
Global Installation
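Assuming the npm package name matches the project title:

```shell
# Installs the server globally so its binary is available on your PATH
npm install -g model-hub-mcp
```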
Local Installation
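For local development, clone the repository and install dependencies (the repository URL below is a placeholder):

```shell
# Clone the repository and install dependencies
git clone <repository-url>
cd model-hub-mcp
npm install
```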
Configuration
Copy `.env.example` to `.env`:
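```shell
cp .env.example .env
```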
Set API keys for each provider in the `.env` file:
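The variable names below are illustrative; check `.env.example` for the exact names this project expects:

```env
OPENAI_API_KEY=your-openai-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key
GEMINI_API_KEY=your-google-ai-api-key
```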
Note: You can leave API keys empty for providers you don't plan to use.
Build
Compile TypeScript code:
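Assuming a standard `build` script is defined in `package.json`:

```shell
# Compiles the TypeScript sources (typically into dist/)
npm run build
```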
Usage
This MCP server is not meant to be run directly. It should be configured in your MCP client configuration.
See the "MCP Client Configuration Examples" section below for setup instructions.
Available Tools
list_models
Retrieve a list of available models from a specific provider.
Parameters:
provider: "openai" | "anthropic" | "google"
get_model
Fetch detailed information about a specific model.
Parameters:
provider: "openai" | "anthropic" | "google"model_id: Model ID (e.g., "gpt-4", "claude-3-opus", "gemini-pro")
list_all_models
Batch fetch model information from all configured providers.
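Since `list_all_models` iterates over all configured providers, it takes no parameters; the `params` portion of an MCP `tools/call` request is simply:

```json
{
  "name": "list_all_models",
  "arguments": {}
}
```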
MCP Client Configuration Examples
Claude Code
You can easily add this MCP server to Claude Code using the following command:
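A typical invocation looks like the following; the server name `model-hub` and the npx package name are assumptions, so adjust them to match your setup:

```shell
# Hypothetical example: registers the server with Claude Code,
# forwarding API keys from your shell environment
claude mcp add model-hub \
  -e OPENAI_API_KEY=$OPENAI_API_KEY \
  -e ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY \
  -e GEMINI_API_KEY=$GEMINI_API_KEY \
  -- npx model-hub-mcp
```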
This command assumes you have environment variables set in your shell:
$GEMINI_API_KEY - Your Google AI API key
$OPENAI_API_KEY - Your OpenAI API key
$ANTHROPIC_API_KEY - Your Anthropic API key
Using npx
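For MCP clients configured via a JSON file (e.g. Claude Desktop's `claude_desktop_config.json`), an entry along these lines should work; the server key and package name are assumptions:

```json
{
  "mcpServers": {
    "model-hub": {
      "command": "npx",
      "args": ["model-hub-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "ANTHROPIC_API_KEY": "your-anthropic-api-key",
        "GEMINI_API_KEY": "your-google-ai-api-key"
      }
    }
  }
}
```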
License
MIT