
Server Configuration

Describes the environment variables used to configure the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| RUNDECK_URL | No | Rundeck server URL | http://localhost:4440 |
| RUNDECK_API_TOKEN | Yes | API token for authentication | (none) |
| RUNDECK_API_VERSION | No | API version number | 44 |
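
As a rough illustration, these variables could be resolved as in the sketch below. Only the variable names and defaults come from the table above; the actual loading code inside the server is an assumption.

```python
import os

# Sketch only: resolving the configuration variables from the environment.
# Names and defaults are from the table above; everything else is assumed.
rundeck_url = os.environ.get("RUNDECK_URL", "http://localhost:4440")
rundeck_api_version = os.environ.get("RUNDECK_API_VERSION", "44")

try:
    rundeck_api_token = os.environ["RUNDECK_API_TOKEN"]  # required, no default
except KeyError:
    raise SystemExit("RUNDECK_API_TOKEN must be set to authenticate with Rundeck")
```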

Tools

Functions exposed to the LLM to take actions

list_jobs

List jobs in a Rundeck project with optional filtering.

Returns a numbered Markdown table of jobs. Use the # column to reference jobs in subsequent commands (e.g., "run job 3").

Args:
    query: Query parameters for filtering jobs

Returns:
    Markdown table with numbered jobs

Examples:
    List all jobs in a project:
    >>> result = list_jobs(JobQuery(project="myproject"))

    Filter by group:
    >>> result = list_jobs(JobQuery(project="myproject", group_path="deploy/prod"))

    Search by name:
    >>> result = list_jobs(JobQuery(project="myproject", job_filter="backup"))
get_job

Get detailed information about a specific job.

Returns the full job definition, including all options displayed in a table showing required status, defaults, and allowed values.

Args:
    job_id: The job UUID

Returns:
    Formatted string with job details and options table

Examples:
    >>> result = get_job("abc-123-def")
    >>> print(result)
    '## Deploy Application...'
list_executions

List job executions with optional filtering.

Returns a list of executions matching the specified criteria. Filter by project, job_id, status, or time range. Results are ordered by start time (most recent first).

Args:
    query: Query parameters for filtering executions

Returns:
    List of Execution objects matching the query

Examples:
    List recent executions in a project:
    >>> result = list_executions(ExecutionQuery(project="myproject"))

    List failed executions for a specific job:
    >>> result = list_executions(ExecutionQuery(
    ...     job_id="abc-123-def",
    ...     status="failed"
    ... ))

    List executions from the last hour:
    >>> result = list_executions(ExecutionQuery(
    ...     project="myproject",
    ...     recent_filter="1h"
    ... ))
get_execution

Get detailed information about a specific execution.

Returns the full execution details, including status, timing, node results, and the arguments used.

Args:
    execution_id: The execution ID (integer)

Returns:
    Execution object with full details

Examples:
    >>> execution = get_execution(12345)
    >>> print(execution.status)
    'succeeded'
    >>> print(execution.duration_seconds)
    45.2
get_execution_output

Get the log output from a job execution.

Retrieves log entries from the execution. For running executions, use the 'offset' parameter to poll for new output. The 'completed' field indicates whether the execution has finished.

Args:
    execution_id: The execution ID (integer)
    last_lines: Return only the last N lines (overrides offset)
    max_lines: Maximum number of lines to return from offset
    offset: Byte offset to start reading from (for tailing)
    node: Filter output to a specific node

Returns:
    ExecutionOutput with log entries and metadata

Examples:
    Get all output:
    >>> output = get_execution_output(12345)
    >>> for entry in output.entries:
    ...     print(f"[{entry.level}] {entry.log}")

    Get last 50 lines:
    >>> output = get_execution_output(12345, last_lines=50)

    Tail a running execution:
    >>> output = get_execution_output(12345, offset=0)
    >>> while not output.completed:
    ...     output = get_execution_output(12345, offset=output.offset)
    ...     # process new entries
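
Taken together, a typical investigation moves from jobs to executions to logs. The sketch below chains the tools above into one read-only workflow; the tool names, query fields, and job UUID come from the documentation above, while the surrounding glue (the `.id` attribute on Execution objects, the printing) is illustrative only.

```python
# Illustrative workflow only, chaining the documented tools.
# JobQuery / ExecutionQuery fields and the job UUID are taken from the
# examples above; the `.id` attribute on Execution objects is an assumption.

print(list_jobs(JobQuery(project="myproject", job_filter="backup")))

job_details = get_job("abc-123-def")  # full definition plus options table

failed_runs = list_executions(ExecutionQuery(job_id="abc-123-def", status="failed"))
if failed_runs:
    execution = get_execution(failed_runs[0].id)  # most recent failure
    print(execution.status, execution.duration_seconds)

    output = get_execution_output(failed_runs[0].id, last_lines=50)
    for entry in output.entries:
        print(f"[{entry.level}] {entry.log}")
```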

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/justynroberts/rundeck-mcp'
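
The same request can be made from Python with the standard library, as in this minimal sketch (the response shape is whatever the API returns; only the URL comes from the example above).

```python
import json
import urllib.request

# Same request as the curl example above, using only the standard library.
url = "https://glama.ai/api/mcp/v1/servers/justynroberts/rundeck-mcp"
with urllib.request.urlopen(url) as response:
    server_info = json.load(response)

print(json.dumps(server_info, indent=2))
```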

If you have feedback or need assistance with the MCP directory API, please join our Discord server.