Glama
by hoangdn3

search_models

Find and filter AI models on OpenRouter.ai by criteria like price, context length, provider, and capabilities including vision or function calling.

Instructions

Search and filter OpenRouter.ai models based on various criteria

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| query | No | Optional search query to filter by name, description, or provider | — |
| provider | No | Filter by specific provider (e.g., "anthropic", "openai", "cohere") | — |
| minContextLength | No | Minimum context length in tokens | — |
| maxContextLength | No | Maximum context length in tokens | — |
| maxPromptPrice | No | Maximum price per 1K tokens for prompts | — |
| maxCompletionPrice | No | Maximum price per 1K tokens for completions | — |
| capabilities | No | Filter by model capabilities | — |
| limit | No | Maximum number of results to return | 10 |
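An illustrative `search_models` call matching the schema above. The field names come from the table; the values are made up for the example:

```typescript
// Illustrative arguments for a search_models call: find vision-capable
// Anthropic models with at least 100K context, costing no more than
// $0.01 per 1K prompt tokens. Values are hypothetical.
const exampleArgs = {
  query: "claude",
  provider: "anthropic",
  minContextLength: 100000,
  maxPromptPrice: 0.01, // per 1K prompt tokens
  capabilities: { vision: true },
  limit: 5,
};

console.log(JSON.stringify(exampleArgs, null, 2));
```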

Implementation Reference

  • The main handler function that implements the core logic of the 'search_models' tool. It refreshes the model cache if necessary, performs the search using ModelCache.searchModels, formats results as JSON, and handles errors.
```typescript
export async function handleSearchModels(
  request: { params: { arguments: SearchModelsToolRequest } },
  apiClient: OpenRouterAPIClient,
  modelCache: ModelCache
) {
  const args = request.params.arguments;
  try {
    // Refresh the cache if needed
    if (!modelCache.isCacheValid()) {
      const models = await apiClient.getModels();
      modelCache.setModels(models);
    }

    // Search models based on criteria
    const results = modelCache.searchModels({
      query: args.query,
      provider: args.provider,
      minContextLength: args.minContextLength,
      maxContextLength: args.maxContextLength,
      maxPromptPrice: args.maxPromptPrice,
      maxCompletionPrice: args.maxCompletionPrice,
      capabilities: args.capabilities,
      limit: args.limit || 10,
    });

    return {
      content: [
        {
          type: 'text',
          text: JSON.stringify(results, null, 2),
        },
      ],
    };
  } catch (error) {
    if (error instanceof Error) {
      return {
        content: [
          {
            type: 'text',
            text: `Error searching models: ${error.message}`,
          },
        ],
        isError: true,
      };
    }
    throw error;
  }
}
```
  • TypeScript interface defining the expected input shape (SearchModelsToolRequest) for the search_models tool, used for type checking in the handler.
```typescript
export interface SearchModelsToolRequest {
  query?: string;
  provider?: string;
  minContextLength?: number | string;
  maxContextLength?: number | string;
  maxPromptPrice?: number | string;
  maxCompletionPrice?: number | string;
  capabilities?: {
    functions?: boolean;
    tools?: boolean;
    vision?: boolean;
    json_mode?: boolean;
  };
  limit?: number | string;
}
```
  • Registration of the 'search_models' tool in the ListTools response, specifying name, description, and full inputSchema matching the handler's expected arguments.
```typescript
// Search Models Tool
{
  name: 'search_models',
  description: 'Search and filter OpenRouter.ai models based on various criteria',
  inputSchema: {
    type: 'object',
    properties: {
      query: {
        type: 'string',
        description: 'Optional search query to filter by name, description, or provider',
      },
      provider: {
        type: 'string',
        description: 'Filter by specific provider (e.g., "anthropic", "openai", "cohere")',
      },
      minContextLength: {
        type: 'number',
        description: 'Minimum context length in tokens',
      },
      maxContextLength: {
        type: 'number',
        description: 'Maximum context length in tokens',
      },
      maxPromptPrice: {
        type: 'number',
        description: 'Maximum price per 1K tokens for prompts',
      },
      maxCompletionPrice: {
        type: 'number',
        description: 'Maximum price per 1K tokens for completions',
      },
      capabilities: {
        type: 'object',
        description: 'Filter by model capabilities',
        properties: {
          functions: {
            type: 'boolean',
            description: 'Requires function calling capability',
          },
          tools: {
            type: 'boolean',
            description: 'Requires tools capability',
          },
          vision: {
            type: 'boolean',
            description: 'Requires vision capability',
          },
          json_mode: {
            type: 'boolean',
            description: 'Requires JSON mode capability',
          },
        },
      },
      limit: {
        type: 'number',
        description: 'Maximum number of results to return (default: 10)',
        minimum: 1,
        maximum: 50,
      },
    },
  },
},
```
  • Dispatch logic in the CallToolRequest handler that routes 'search_models' tool calls to the handleSearchModels function with appropriate parameters and dependencies.
```typescript
case 'search_models':
  return handleSearchModels(
    { params: { arguments: request.params.arguments as SearchModelsToolRequest } },
    this.apiClient,
    this.modelCache
  );
```
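The handler above depends on a `ModelCache` whose implementation is not shown on this page. A minimal sketch of the behavior it relies on, assuming the model fields (`id`, `name`, `context_length`, `pricing`) follow the OpenRouter `/models` response and assuming a 5-minute TTL (capability filtering is omitted for brevity):

```typescript
// Sketch of a ModelCache with the searchModels behavior the handler relies on.
// Field names follow the OpenRouter /models response; the TTL is an assumption.
interface ModelInfo {
  id: string;
  name: string;
  context_length: number;
  pricing: { prompt: number; completion: number };
}

interface SearchCriteria {
  query?: string;
  provider?: string;
  minContextLength?: number;
  maxContextLength?: number;
  maxPromptPrice?: number;
  maxCompletionPrice?: number;
  limit: number;
}

class ModelCache {
  private models: ModelInfo[] = [];
  private fetchedAt = 0;
  private readonly ttlMs = 5 * 60 * 1000; // assumed 5-minute TTL

  isCacheValid(): boolean {
    return this.models.length > 0 && Date.now() - this.fetchedAt < this.ttlMs;
  }

  setModels(models: ModelInfo[]): void {
    this.models = models;
    this.fetchedAt = Date.now();
  }

  searchModels(c: SearchCriteria): ModelInfo[] {
    const q = c.query?.toLowerCase();
    return this.models
      .filter((m) => !q || m.id.toLowerCase().includes(q) || m.name.toLowerCase().includes(q))
      // OpenRouter model IDs are "provider/model", so a provider filter
      // can match on the ID prefix.
      .filter((m) => !c.provider || m.id.startsWith(`${c.provider}/`))
      .filter((m) => c.minContextLength === undefined || m.context_length >= c.minContextLength)
      .filter((m) => c.maxContextLength === undefined || m.context_length <= c.maxContextLength)
      .filter((m) => c.maxPromptPrice === undefined || m.pricing.prompt <= c.maxPromptPrice)
      .filter((m) => c.maxCompletionPrice === undefined || m.pricing.completion <= c.maxCompletionPrice)
      .slice(0, c.limit);
  }
}

// Usage sketch with made-up entries:
const cache = new ModelCache();
cache.setModels([
  { id: "anthropic/claude-3-haiku", name: "Claude 3 Haiku", context_length: 200000, pricing: { prompt: 0.00025, completion: 0.00125 } },
  { id: "openai/gpt-4o", name: "GPT-4o", context_length: 128000, pricing: { prompt: 0.005, completion: 0.015 } },
]);
const hits = cache.searchModels({ provider: "anthropic", limit: 10 });
console.log(hits.map((m) => m.id)); // → ["anthropic/claude-3-haiku"]
```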

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hoangdn3/mcp-ocr-fallback'
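The same lookup can be issued from TypeScript. The URL format matches the curl example above; the response shape is not documented on this page, so it is left untyped:

```typescript
// Build the MCP directory URL for a given owner/server pair.
// The path format matches the curl example above.
function serverInfoUrl(owner: string, server: string): string {
  return `https://glama.ai/api/mcp/v1/servers/${owner}/${server}`;
}

// Fetch the server record; the response shape is not documented here,
// so it is returned as unknown.
async function getServerInfo(owner: string, server: string): Promise<unknown> {
  const res = await fetch(serverInfoUrl(owner, server));
  if (!res.ok) throw new Error(`MCP directory request failed: ${res.status}`);
  return res.json();
}

console.log(serverInfoUrl("hoangdn3", "mcp-ocr-fallback"));
```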

If you have feedback or need assistance with the MCP directory API, please join our Discord server.