OpenRouter MCP Multimodal Server

by hoangdn3
search-models.d.ts (904 B)
import { ModelCache } from '../model-cache.js';
import { OpenRouterAPIClient } from '../openrouter-api.js';

export interface SearchModelsToolRequest {
  query?: string;
  provider?: string;
  minContextLength?: number | string;
  maxContextLength?: number | string;
  maxPromptPrice?: number | string;
  maxCompletionPrice?: number | string;
  capabilities?: {
    functions?: boolean;
    tools?: boolean;
    vision?: boolean;
    json_mode?: boolean;
  };
  limit?: number | string;
}

export declare function handleSearchModels(request: {
  params: {
    arguments: SearchModelsToolRequest;
  };
}, apiClient: OpenRouterAPIClient, modelCache: ModelCache): Promise<{
  content: {
    type: string;
    text: string;
  }[];
  isError?: undefined;
} | {
  content: {
    type: string;
    text: string;
  }[];
  isError: boolean;
}>;
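A minimal sketch of how this handler might be invoked from an ES module. Only the handleSearchModels signature and the SearchModelsToolRequest shape come from the declaration file above; the import paths, the OpenRouterAPIClient and ModelCache constructor arguments, and the environment variable name are assumptions for illustration.

import { handleSearchModels, SearchModelsToolRequest } from './tools/search-models.js';
import { OpenRouterAPIClient } from './openrouter-api.js';
import { ModelCache } from './model-cache.js';

// Hypothetical wiring: the real constructor arguments are not shown in the
// declaration file, so an API-key string and a no-argument cache are assumed.
const apiClient = new OpenRouterAPIClient(process.env.OPENROUTER_API_KEY ?? '');
const modelCache = new ModelCache();

// Search for vision-capable models with at least 32k context, capped at 5 results.
const args: SearchModelsToolRequest = {
  query: 'claude',
  minContextLength: 32000,
  capabilities: { vision: true },
  limit: 5,
};

const result = await handleSearchModels(
  { params: { arguments: args } },
  apiClient,
  modelCache,
);

if (result.isError) {
  console.error(result.content[0]?.text);
} else {
  console.log(result.content[0]?.text);
}

The request is wrapped in { params: { arguments: ... } } because that is the envelope the declared signature expects; the handler resolves to a content array of text items either way, with isError set only on the failure branch.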

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hoangdn3/mcp-ocr-fallback'
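The same lookup can be done from code. A minimal TypeScript sketch using the built-in fetch; the response is treated as untyped JSON because its schema is not documented here.

// Fetch this server's directory entry from the Glama MCP API.
const res = await fetch('https://glama.ai/api/mcp/v1/servers/hoangdn3/mcp-ocr-fallback');
if (!res.ok) {
  throw new Error(`MCP directory API request failed: ${res.status}`);
}
const server = await res.json();
console.log(server);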

If you have feedback or need assistance with the MCP directory API, please join our Discord server.