Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `GITHUB_REPO` | No | GitHub repository name | `ouroboros` |
| `GITHUB_USER` | Yes | GitHub username | |
| `GITHUB_TOKEN` | Yes | GitHub personal access token with `repo` scope | |
| `TOTAL_BUDGET` | Yes | Spending limit in USD | |
| `OPENAI_API_KEY` | No | Enables the `web_search` tool | |
| `OUROBOROS_MODEL` | No | Primary LLM model (via OpenRouter) | `anthropic/claude-sonnet-4.6` |
| `ANTHROPIC_API_KEY` | No | Enables Claude Code CLI for code editing | |
| `OPENROUTER_API_KEY` | Yes | OpenRouter API key for LLM calls | |
| `TELEGRAM_BOT_TOKEN` | Yes | Telegram Bot API token | |
| `OUROBOROS_MAX_ROUNDS` | No | Maximum LLM rounds per task | `200` |
| `OUROBOROS_MODEL_CODE` | No | Model for code-editing tasks | `anthropic/claude-sonnet-4.6` |
| `OUROBOROS_MAX_WORKERS` | No | Maximum number of parallel worker processes | `5` |
| `OUROBOROS_MODEL_LIGHT` | No | Model for lightweight tasks (dedup, compaction) | `google/gemini-3-pro-preview` |
| `OUROBOROS_BG_BUDGET_PCT` | No | Percentage of total budget allocated to background consciousness | `10` |
| `OUROBOROS_WEBSEARCH_MODEL` | No | Model for web search (OpenAI Responses API) | `gpt-5` |
| `OUROBOROS_MODEL_FALLBACK_LIST` | No | Fallback model chain for empty responses | `google/gemini-2.5-pro-preview,openai/o3,anthropic/claude-sonnet-4.6` |
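
A minimal configuration sketch, assuming the variables are supplied as shell exports before launching the server. All values below are placeholders, not real credentials; only the variables marked Required are set, and the optional overrides simply restate the documented defaults:

```shell
# Required variables — replace each placeholder with your own value.
export GITHUB_USER='your-github-username'
export GITHUB_TOKEN='ghp_xxxxxxxxxxxxxxxx'     # personal access token with repo scope
export TOTAL_BUDGET='25'                        # spending limit in USD
export OPENROUTER_API_KEY='sk-or-xxxxxxxxxxxx'  # used for all LLM calls
export TELEGRAM_BOT_TOKEN='123456:ABC-xxxxxxxx'

# Optional overrides, shown with their documented defaults.
export GITHUB_REPO='ouroboros'
export OUROBOROS_MAX_ROUNDS='200'
export OUROBOROS_MAX_WORKERS='5'
export OUROBOROS_BG_BUDGET_PCT='10'
```

Unset optional variables fall back to the defaults in the table above, so in practice only the five required exports are needed to start.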

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions

No tools

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/rplryan/ouroboros'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.