FHIR MCP Server

by the-momentum
vector_store_schemas.py • 643 B
from typing import Any

from pydantic import BaseModel


class PineconeSearchResponse(BaseModel):
    chunk_text: str | None = None
    chunk_index: int | None = None
    fhir_document_id: str | None = None
    source_url: str | None = None
    score: float | None = None


class PineconeSearchRequest(BaseModel):
    embedded_query: list[float]


class Vector(BaseModel):
    id: str
    values: list[float]
    metadata: dict[str, Any]


class PineconeUpsertRequest(BaseModel):
    vector: list[Vector]
    namespace: str


class Embeddings(BaseModel):
    vectors: list[list[float]]


class PineconeError(BaseModel):
    error_message: str
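As a quick illustration of how these Pydantic schemas might be used (a minimal sketch, not taken from the repository; the example values are invented), an upsert request can be built from raw data and validated before it is sent to the vector store:

```python
from typing import Any

from pydantic import BaseModel, ValidationError


class Vector(BaseModel):
    id: str
    values: list[float]
    metadata: dict[str, Any]


class PineconeUpsertRequest(BaseModel):
    vector: list[Vector]
    namespace: str


# Build an upsert request; Pydantic validates field types on construction.
request = PineconeUpsertRequest(
    vector=[
        Vector(
            id="doc-1-chunk-0",  # hypothetical chunk ID
            values=[0.1, 0.2, 0.3],
            metadata={"fhir_document_id": "doc-1"},
        )
    ],
    namespace="fhir-documents",  # hypothetical namespace
)

# Malformed payloads raise ValidationError instead of failing later.
try:
    Vector(id="bad", values="not-a-list", metadata={})
except ValidationError:
    print("validation error caught")
```

Because the models are plain Pydantic classes, `request.model_dump()` yields a JSON-serializable dict ready for the Pinecone client.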

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/the-momentum/fhir-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.