
Server Configuration

Describes the environment variables used to configure the server (all are optional).

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| OLLAMA_MODEL | No | The Ollama model to use for summarization (e.g., llama3.2:1b, mistral:7b, qwen2.5:0.5b) | llama3.2:1b |
| OLLAMA_TIMEOUT | No | Timeout in milliseconds for Ollama API requests | 30000 |
| OLLAMA_ENDPOINT | No | The endpoint URL for the Ollama LLM service used for content summarization | http://localhost:11434 |
| SCRAPER_API_KEY | No | Optional API key for ScraperAPI (paid fallback search provider) | |
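
As a rough illustration, the sketch below shows how a Node.js/TypeScript server could read these variables and fall back to the documented defaults. The variable names come from the table above; the config shape and parsing logic are assumptions, not the server's actual implementation.

```typescript
// Illustrative sketch only: reads the configuration variables listed above
// and applies their documented defaults. The real server may differ.
interface ServerConfig {
  ollamaModel: string;
  ollamaTimeoutMs: number;
  ollamaEndpoint: string;
  scraperApiKey?: string; // optional paid fallback (ScraperAPI)
}

function loadConfig(env: NodeJS.ProcessEnv = process.env): ServerConfig {
  return {
    ollamaModel: env.OLLAMA_MODEL ?? "llama3.2:1b",
    ollamaTimeoutMs: Number(env.OLLAMA_TIMEOUT ?? "30000"),
    ollamaEndpoint: env.OLLAMA_ENDPOINT ?? "http://localhost:11434",
    scraperApiKey: env.SCRAPER_API_KEY || undefined,
  };
}

const config = loadConfig();
console.log(`Summarizing with ${config.ollamaModel} via ${config.ollamaEndpoint}`);
```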

Tools

Functions exposed to the LLM to take actions

| Name | Description |
|------|-------------|
| rag | Web search with intelligent content extraction (similar to Apify RAG Web Browser) |
| fetchFullContent | Fetches the full content of a previous RAG result obtained with contentMode=preview |
| scrape | Extracts content intelligently from a specific URL |
| screenshot | Captures a screenshot of a web page |
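
For context, here is a minimal sketch of calling these tools from a TypeScript MCP client using the official @modelcontextprotocol/sdk. The launch command and the tool argument names (query, contentMode, url) are assumptions for illustration; consult the server's own documentation for the real input schemas.

```typescript
// Illustrative sketch: calling the server's tools from a TypeScript MCP client.
// The launch command and tool argument names are assumptions, not taken from this page.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "isis-mcp"], // hypothetical launch command
  });
  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Web search with intelligent content extraction, previews only.
  const results = await client.callTool({
    name: "rag",
    arguments: { query: "model context protocol", contentMode: "preview" },
  });
  console.log(results);

  // Capture a screenshot of a specific page.
  const shot = await client.callTool({
    name: "screenshot",
    arguments: { url: "https://example.com" },
  });
  console.log(shot);

  await client.close();
}

main().catch(console.error);
```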

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/alucardeht/isis-mcp'
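
The same lookup can be done programmatically; below is a minimal TypeScript sketch using fetch (Node 18+ or a browser). The response shape is not documented on this page, so the result is simply logged.

```typescript
// Illustrative sketch: querying the MCP directory API for this server's metadata.
async function getServerInfo(): Promise<unknown> {
  const res = await fetch(
    "https://glama.ai/api/mcp/v1/servers/alucardeht/isis-mcp",
  );
  if (!res.ok) {
    throw new Error(`Directory API request failed: ${res.status}`);
  }
  return res.json();
}

getServerInfo().then((info) => console.log(info)).catch(console.error);
```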

If you have feedback or need assistance with the MCP directory API, please join our Discord server.