teste.py (430 B)
# from llama_index.embeddings.huggingface import HuggingFaceEmbedding
# import llama_index
# print(llama_index.__version__)
# embed_model = HuggingFaceEmbedding(model_name="sentence-transformers/all-MiniLM-L6-v2")

from rag_code import QdrantVDB, EmbedData, Retriever

# Build a retriever over the "ml_faq_collection" Qdrant collection using the
# project's embedding helper, then run a test query.
# Query (Portuguese): "How do I avoid overfitting in machine learning models?"
retriever = Retriever(QdrantVDB("ml_faq_collection"), EmbedData())
print(retriever.search("Como evitar overfitting em modelos de machine learning?"))
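
teste.py imports QdrantVDB, EmbedData, and Retriever from the repository's rag_code module, which is not shown on this page. The sketch below is only an illustration of what such an interface could look like: the three class names come from the import above, while the sentence-transformers model, the local Qdrant URL, the embed/search method names, and the top_k default are assumptions rather than the project's actual code.

# Hypothetical sketch of the rag_code interfaces used above; everything beyond
# the three imported class names is an assumption.
from qdrant_client import QdrantClient
from sentence_transformers import SentenceTransformer


class EmbedData:
    """Embeds text with a sentence-transformers model (model choice assumed)."""

    def __init__(self, model_name="sentence-transformers/all-MiniLM-L6-v2"):
        self.model = SentenceTransformer(model_name)

    def embed(self, texts):
        # One vector per input text, returned as plain Python lists.
        return self.model.encode(texts).tolist()


class QdrantVDB:
    """Thin handle on a Qdrant collection (local instance assumed)."""

    def __init__(self, collection_name, url="http://localhost:6333"):
        self.collection_name = collection_name
        self.client = QdrantClient(url=url)


class Retriever:
    """Embeds a query and returns the closest points from the collection."""

    def __init__(self, vector_db, embed_data, top_k=5):
        self.vector_db = vector_db
        self.embed_data = embed_data
        self.top_k = top_k

    def search(self, query):
        query_vector = self.embed_data.embed([query])[0]
        return self.vector_db.client.search(
            collection_name=self.vector_db.collection_name,
            query_vector=query_vector,
            limit=self.top_k,
        )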

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/sandovalmedeiros/mcp_agentic_rag'
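
The same lookup can be made from Python; the snippet below is a minimal sketch using the requests library, assuming the endpoint returns JSON (the response schema is not documented here).

# Fetch this server's MCP directory entry (JSON response assumed).
import requests

response = requests.get(
    "https://glama.ai/api/mcp/v1/servers/sandovalmedeiros/mcp_agentic_rag"
)
response.raise_for_status()
print(response.json())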

If you have feedback or need assistance with the MCP directory API, please join our Discord server.