Server Configuration

Describes the environment variables used to configure the server. All are optional, but at least one provider API key must be set.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| LLM_MODEL | No | Override the default LLM model. | |
| AWS_REGION | No | AWS region for Bedrock embeddings (amazon.titan-embed-text-v2:0). Requires IAM permissions. | |
| GEMINI_API_KEY | No | API key for Google Gemini. Used for both the LLM (gemini-2.0-flash) and embeddings (text-embedding-004) by default. | |
| OPENAI_API_KEY | No | API key for OpenAI. Used for both the LLM (gpt-4o-mini) and embeddings (text-embedding-3-small) by default. | |
| EMBEDDING_MODEL | No | Override the default embedding model. | |
| ANTHROPIC_API_KEY | No | API key for Anthropic. Used for the LLM (claude-sonnet-4-5-20250929) by default. | |
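
The variables above can be exported in the shell before launching the server. A minimal sketch assuming OpenAI as the provider; the key value is a placeholder and the override model names are examples, not defaults confirmed by this page:

```shell
# Pick one provider key; OPENAI_API_KEY enables gpt-4o-mini for chat
# and text-embedding-3-small for embeddings by default.
export OPENAI_API_KEY="sk-your-key-here"        # placeholder value

# Optional overrides from the table above (example model names).
export LLM_MODEL="gpt-4o"
export EMBEDDING_MODEL="text-embedding-3-small"
```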

Capabilities

Features and capabilities supported by this server

| Capability | Details |
|------------|---------|
| tools | `{ "listChanged": true }` |

Tools

Functions exposed to the LLM to take actions

ragchat_setup

Initialize a domain with a knowledge base from markdown content. Each ## section becomes a searchable document with vector embeddings. This is the first step — run this before testing or serving.
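
For example, a knowledge base passed to ragchat_setup might look like the following (the domain and all content are hypothetical); each `##` section would become one searchable document with its own embedding:

```markdown
# Acme Docs Knowledge Base

## Getting Started
Install the CLI and run `acme init` to create a project.

## Pricing
The free tier includes 100 requests per day; paid plans start at $10/month.

## Support
Email support@example.com or open an issue on GitHub.
```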

ragchat_test

Send a test message to a domain's chat. Uses RAG search + LLM to generate a response, same as production. Good for verifying the knowledge base works.

ragchat_serve

Start a local HTTP chat server for a domain. The server runs on localhost and handles POST /chat requests. Use ragchat_widget to get the embed code that connects to this server.

ragchat_widget

Generate an embeddable chat widget. Returns an HTML snippet that renders a floating chat bubble on any webpage. Connects to the chat server started with ragchat_serve.

ragchat_status

List all configured domains with document counts and config status. Shows what's been set up and what's ready to serve.
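
Once ragchat_serve is running, the POST /chat endpoint can be exercised directly. A sketch assuming port 8000 and a JSON body with `domain` and `message` fields; neither the port nor the payload schema is confirmed by this page, so check the ragchat_serve output for the real values:

```shell
# Hypothetical request to a local ragchat_serve instance.
# Falls back to an error object if no server is listening.
RESPONSE=$(curl -s -X POST "http://localhost:8000/chat" \
  -H 'Content-Type: application/json' \
  -d '{"domain": "docs", "message": "How do I get started?"}' \
  || echo '{"error": "server not running"}')
echo "$RESPONSE"
```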

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/gogabrielordonez/mcp-ragchat'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.