Gemini MCP Server

openai.py • 622 B
"""Registry loader for OpenAI model capabilities.""" from __future__ import annotations from ..shared import ProviderType from .base import CapabilityModelRegistry class OpenAIModelRegistry(CapabilityModelRegistry): """Capability registry backed by ``conf/openai_models.json``.""" def __init__(self, config_path: str | None = None) -> None: super().__init__( env_var_name="OPENAI_MODELS_CONFIG_PATH", default_filename="openai_models.json", provider=ProviderType.OPENAI, friendly_prefix="OpenAI ({model})", config_path=config_path, )

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/BeehiveInnovations/gemini-mcp-server'
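The same request can be made from Python; this is a sketch only, and the structure of the returned JSON is not documented here, so the response is simply printed as-is.

import json
import urllib.request

# Fetch this server's entry from the Glama MCP directory API.
url = "https://glama.ai/api/mcp/v1/servers/BeehiveInnovations/gemini-mcp-server"
with urllib.request.urlopen(url) as response:
    server_info = json.load(response)

print(server_info)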

If you have feedback or need assistance with the MCP directory API, please join our Discord server.