MCP-RAG Server

by apatoliya

Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| BUCKET_ID | Yes | Your bucket ID for document storage | |
| OPENAI_API_KEY | Yes | Your OpenAI API key | |
| GROUNDX_API_KEY | Yes | Your GroundX API key | |
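Since all three variables are required, it can help to verify them before launching the server. A minimal sketch (the helper name `missing_env` is illustrative, not part of the server; variable names are taken from the table above):

```python
import os

# Required environment variables, per the configuration table above.
REQUIRED_VARS = ["BUCKET_ID", "OPENAI_API_KEY", "GROUNDX_API_KEY"]

def missing_env(environ=os.environ) -> list:
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not environ.get(name)]

if __name__ == "__main__":
    missing = missing_env()
    if missing:
        print("Missing required environment variables:", ", ".join(missing))
    else:
        print("All required environment variables are set.")
```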

Capabilities

Features and capabilities supported by this server

| Capability | Details |
|------------|---------|
| tools | `{ "listChanged": false }` |
| prompts | `{ "listChanged": false }` |
| resources | `{ "subscribe": false, "listChanged": false }` |
| experimental | `{}` |

Tools

Functions exposed to the LLM to take actions

**process_search_query**
Process a search query using GroundX and OpenAI.
- Args:
  - `query`: the search query string.
  - `config`: optional `SearchConfig` object for customization.
- Returns: a `SearchResponse` object containing the query, score, and result.

**search_doc_for_rag_context**
Searches and retrieves relevant context from a knowledge base, based on the user's query.
- Args:
  - `query`: the search query supplied by the user.
- Returns: `str`: relevant text content that can be used by the LLM to answer the query.

**ingest_documents**
Ingest documents from a local file into the knowledge base.
- Args:
  - `local_file_path`: the path to the local file containing the documents to ingest.
- Returns: `str`: a message indicating the documents have been ingested.
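MCP clients invoke tools like these over JSON-RPC 2.0 using the `tools/call` method defined by the Model Context Protocol. A sketch of the request payload a client would send to `search_doc_for_rag_context` (the helper function and the query text are illustrative, not part of this server):

```python
import json

def build_tool_call(name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 `tools/call` request, as sent by MCP clients."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Example: ask the server for RAG context (query text is illustrative).
payload = build_tool_call("search_doc_for_rag_context", {"query": "What is GroundX?"})
```

The server replies with a result message containing the tool's return value, here the relevant text content for the LLM.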

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/apatoliya/mcp-rag'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.