
Agentic CI/CD MCP Orchestrator

Server Configuration

Environment variables used to configure the server. Only the variables marked as required must be set.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| RUN_ID | No | Workflow run ID to inspect. | |
| REPOSITORY | No | Target GitHub repository in 'owner/repo' format. | |
| BASE_BRANCH | No | The base branch for repair operations. | main |
| GITHUB_TOKEN | Yes | GitHub personal access token for repository inspection and PR creation. | |
| WORKFLOW_NAME | No | Filter latest failed run by workflow name (e.g., 'ci'). | |
| OPENAI_API_KEY | Yes | OpenAI API key used for the LLM diagnosis flow. | |
| PATCH_STRATEGY | No | Patch format expected from the model. | unified_diff |
| FORCE_AUTOFIX_ALL | No | If set to 'true', bypasses thresholds and forces the auto-fix path. | |
| LLM_PATCH_MAX_CHARS | No | Upper bound on patch payload size. | |
| MAX_REPAIR_ATTEMPTS | No | Number of patch generation/application retries. | 3 |
| RISK_AUTO_FIX_THRESHOLD | No | Score threshold below which fixes are applied autonomously. | |
| RISK_HUMAN_REVIEW_THRESHOLD | No | Score threshold below which human review is required. | |
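The defaults and the two risk thresholds above interact with FORCE_AUTOFIX_ALL. A minimal sketch of how a server like this might read the documented variables and route a proposed fix (the variable names and defaults for BASE_BRANCH, PATCH_STRATEGY, and MAX_REPAIR_ATTEMPTS come from the table; the function names, the routing logic, and the fallback threshold values 0.3 and 0.7 are illustrative assumptions, not this server's actual implementation):

```python
import os


def load_config(env=os.environ):
    """Read the documented variables, applying the documented defaults."""
    return {
        "repository": env.get("REPOSITORY"),
        "base_branch": env.get("BASE_BRANCH", "main"),
        "patch_strategy": env.get("PATCH_STRATEGY", "unified_diff"),
        "max_repair_attempts": int(env.get("MAX_REPAIR_ATTEMPTS", "3")),
        "force_autofix_all": env.get("FORCE_AUTOFIX_ALL", "").lower() == "true",
        # The table specifies no defaults for the thresholds; 0.3 and 0.7
        # are placeholder values for this sketch only.
        "auto_fix_threshold": float(env.get("RISK_AUTO_FIX_THRESHOLD", "0.3")),
        "human_review_threshold": float(env.get("RISK_HUMAN_REVIEW_THRESHOLD", "0.7")),
    }


def route_fix(risk_score, cfg):
    """Decide how a proposed fix is handled based on its risk score."""
    if cfg["force_autofix_all"]:
        return "auto_fix"      # FORCE_AUTOFIX_ALL='true' bypasses both thresholds
    if risk_score < cfg["auto_fix_threshold"]:
        return "auto_fix"      # low risk: applied autonomously
    if risk_score < cfg["human_review_threshold"]:
        return "human_review"  # medium risk: human review required
    return "reject"            # high risk: not applied
```

For example, with the placeholder thresholds above, a fix scored at 0.5 would be routed to human review, while setting FORCE_AUTOFIX_ALL='true' would auto-apply it regardless of score.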

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions the server exposes to the LLM for taking actions.


No tools

Prompts

Interactive templates invoked by user choice.


No prompts

Resources

Contextual data attached and managed by the client.


No resources


MCP directory API

We provide all the information about MCP servers via our MCP directory API. For example:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Siddharth-Basale/mcpserver'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.