
Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `GITHUB_TOKEN` | Yes | GitHub personal access token for CI polling | (none) |
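A hypothetical MCP client configuration supplying this variable is sketched below. The `command` and package name are assumptions for illustration, not taken from this page; substitute the actual launch command for your setup.

```json
{
  "mcpServers": {
    "loopsense": {
      "command": "npx",
      "args": ["-y", "loopsense"],
      "env": {
        "GITHUB_TOKEN": "ghp_your_token_here"
      }
    }
  }
}
```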

Capabilities

Features and capabilities supported by this server

| Capability | Details |
| --- | --- |
| tools | `{}` |
| resources | `{}` |

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| --- | --- |
| `watch_ci` | Watch a GitHub Actions workflow run and emit events on status changes |
| `watch_process` | Spawn and monitor a local process, capturing stdout/stderr and exit code |
| `watch_file` | Watch a file or directory for changes using chokidar |
| `watch_url` | Poll an HTTP endpoint and emit events when status or body changes |
| `watch_webhook` | Start an HTTP server to receive incoming webhooks |
| `check_consequences` | Get events associated with an agent action, or all recent events |
| `list_watches` | List all active watchers |
| `cancel_watch` | Stop and remove a watcher |
| `poll_events` | Get new events since a timestamp (push-notification fallback) |
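MCP clients invoke these tools via JSON-RPC `tools/call` requests. A minimal sketch of such a request for `watch_ci` follows; the argument names (`owner`, `repo`, `run_id`) are illustrative assumptions, so consult the tool's advertised input schema for the real parameters.

```python
import json

# Hypothetical JSON-RPC "tools/call" request for the watch_ci tool.
# The argument names (owner, repo, run_id) are assumptions for
# illustration; the server's inputSchema defines the real ones.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "watch_ci",
        "arguments": {
            "owner": "jarvisassistantux",
            "repo": "loopsense",
            "run_id": 1234567890,
        },
    },
}

# Serialize the request as it would appear on the wire.
print(json.dumps(request, indent=2))
```

The same envelope applies to the other tools; only `params.name` and `params.arguments` change.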

Prompts

Interactive templates invoked by user choice

No prompts.

Resources

Contextual data attached and managed by the client

| Name | Description |
| --- | --- |
| Active Watches | All currently active watchers |
| Recent Timeline | Last 100 events across all watches |


MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/jarvisassistantux/loopsense'
```
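The endpoint path follows the pattern shown in the curl example above: a base URL plus `owner/name`. A small helper to build such URLs, assuming only that pattern:

```python
from urllib.parse import urljoin

# Base path taken from the curl example above.
BASE = "https://glama.ai/api/mcp/v1/servers/"

def server_url(owner: str, name: str) -> str:
    """Build the directory API URL for a given owner/name pair."""
    return urljoin(BASE, f"{owner}/{name}")

print(server_url("jarvisassistantux", "loopsense"))
# → https://glama.ai/api/mcp/v1/servers/jarvisassistantux/loopsense
```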

If you have feedback or need assistance with the MCP directory API, please join our Discord server.