Gemini MCP Server

__init__.py (677 B)
"""Model provider abstractions for supporting multiple AI providers.""" from .azure_openai import AzureOpenAIProvider from .base import ModelProvider from .gemini import GeminiModelProvider from .openai import OpenAIModelProvider from .openai_compatible import OpenAICompatibleProvider from .openrouter import OpenRouterProvider from .registry import ModelProviderRegistry from .shared import ModelCapabilities, ModelResponse __all__ = [ "ModelProvider", "ModelResponse", "ModelCapabilities", "ModelProviderRegistry", "AzureOpenAIProvider", "GeminiModelProvider", "OpenAIModelProvider", "OpenAICompatibleProvider", "OpenRouterProvider", ]

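For orientation, a minimal usage sketch of the re-exports above. It assumes this __init__.py lives in a package importable as "providers" (inferred from the relative imports; the package name is not stated on this page):

# Hypothetical sketch: consuming the public names re-exported by the package.
# "providers" is an assumed package name, not confirmed by this file alone.
from providers import GeminiModelProvider, ModelProviderRegistry, OpenRouterProvider

# The __all__ list above controls what `from providers import *` exposes,
# so downstream code can import provider classes from one place.
print(ModelProviderRegistry.__name__)

Re-exporting the concrete providers and the registry from the package root keeps call sites decoupled from the individual provider modules.
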
MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/BeehiveInnovations/gemini-mcp-server'
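The same endpoint can be queried from Python. This is a minimal sketch using only the standard library; it assumes the endpoint returns JSON, and the exact response schema is not shown on this page:

# Fetch the MCP directory API entry for this server and pretty-print it.
import json
import urllib.request

URL = "https://glama.ai/api/mcp/v1/servers/BeehiveInnovations/gemini-mcp-server"

with urllib.request.urlopen(URL) as response:
    data = json.load(response)  # assumes a JSON body

print(json.dumps(data, indent=2))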

If you have feedback or need assistance with the MCP directory API, please join our Discord server.