
Epic Patient API MCP Server

by akiani
gemini.py (893 B)
"""Gemini LLM client implementation.""" from google import genai from .base import LLMClient class GeminiClient(LLMClient): """Gemini API client.""" def __init__(self, api_key: str, model: str = "gemini-2.0-flash-exp"): """Initialize Gemini client. Args: api_key: Gemini API key model: Model name to use """ self.client = genai.Client(api_key=api_key) self.model = model def generate(self, prompt: str) -> str: """Generate a response using Gemini. Args: prompt: The prompt to send to Gemini Returns: The generated text response Raises: Exception: If the Gemini API call fails """ response = self.client.models.generate_content( model=self.model, contents=prompt ) return response.text

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/akiani/mock-epic-mcp'
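The same endpoint can also be queried from Python. A minimal sketch using the requests library follows; the response schema is not documented on this page, so the payload is simply printed as-is:

# Fetch this server's directory entry and print the raw JSON payload.
# No response fields are assumed beyond the endpoint returning JSON.
import requests

url = "https://glama.ai/api/mcp/v1/servers/akiani/mock-epic-mcp"
response = requests.get(url, timeout=10)
response.raise_for_status()
print(response.json())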

If you have feedback or need assistance with the MCP directory API, please join our Discord server.