gemma.py (732 B)
"""EmbeddingGemma model implementation.""" from typing import Optional from embeddings.sentence_transformer import SentenceTransformerModel class GemmaEmbeddingModel(SentenceTransformerModel): """EmbeddingGemma model - specialized SentenceTransformer implementation.""" def __init__( self, cache_dir: Optional[str] = None, device: str = "auto" ): """Initialize GemmaEmbeddingModel. Args: cache_dir: Directory to cache the model device: Device to load model on ("auto", "cuda", "mps", "cpu") """ super().__init__( model_name="google/embeddinggemma-300m", cache_dir=cache_dir, device=device )
