
AI Development Pipeline MCP

by theburgerllc

Server Configuration

The following environment variables are required to run the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| VERCEL_TOKEN | Yes | Your Vercel authentication token | |
| AIRTABLE_API_KEY | Yes | Your Airtable API key | |
| AIRTABLE_BASE_ID | Yes | Your Airtable base ID | |
| ANALYTICS_SECRET | Yes | Your analytics secret key | |
| VERCEL_PROJECT_ID | Yes | Your Vercel project ID | |
| AIRTABLE_TABLE_NAME | Yes | Your Airtable table name | |
| NEXT_PUBLIC_APP_URL | Yes | The public URL of your application | |
| SQUARE_ACCESS_TOKEN | Yes | Your Square access token | |
| SQUARE_APPLICATION_ID | Yes | Your Square application ID | |
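
A minimal sketch of supplying these variables when connecting to the server from an MCP client over stdio. The launch command, entry point, and client name below are assumptions for illustration; this page does not document how the server is started.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Assumed launch command and entry point -- adjust to the server's actual build output.
const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"],
  env: {
    VERCEL_TOKEN: process.env.VERCEL_TOKEN ?? "",
    VERCEL_PROJECT_ID: process.env.VERCEL_PROJECT_ID ?? "",
    AIRTABLE_API_KEY: process.env.AIRTABLE_API_KEY ?? "",
    AIRTABLE_BASE_ID: process.env.AIRTABLE_BASE_ID ?? "",
    AIRTABLE_TABLE_NAME: process.env.AIRTABLE_TABLE_NAME ?? "",
    ANALYTICS_SECRET: process.env.ANALYTICS_SECRET ?? "",
    NEXT_PUBLIC_APP_URL: process.env.NEXT_PUBLIC_APP_URL ?? "",
    SQUARE_ACCESS_TOKEN: process.env.SQUARE_ACCESS_TOKEN ?? "",
    SQUARE_APPLICATION_ID: process.env.SQUARE_APPLICATION_ID ?? "",
  },
});

// Hypothetical client identity; any name/version pair works.
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);
```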

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| --- | --- |
| read_project_file | Read a local file from the VS Code workspace (restricted to workspace directory) |
| write_project_file | Write to a local file in the VS Code workspace (restricted to workspace directory) |
| run_shell_command | Run a whitelisted shell command in the workspace (npm, yarn, git, node, npx, tsc, eslint, prettier) |
| check_file_exists | Check if a local file exists (restricted to workspace directory) |
| list_directory_files | List files in a workspace directory (restricted to workspace directory) |
| run_augment_prompt | Send a prompt to the local Augment coding agent |
| run_project_tests | Run project tests (npm test, yarn test, etc.) |
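
As a rough illustration, a client connected as in the earlier configuration sketch could list and call these tools as shown below. The `path` argument name is an assumption; the tools' input schemas are not shown on this page, so list the tools first to confirm the expected parameters.

```typescript
// Discover the tools and their input schemas.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name)); // e.g. ["read_project_file", ...]

// Call one of the listed tools; "path" is an assumed parameter name.
const result = await client.callTool({
  name: "read_project_file",
  arguments: { path: "package.json" },
});
console.log(result);
```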

MCP directory API

We provide all the information about MCP servers via our MCP API.

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/theburgerllc/ai-development-pipeline-mcp'
```
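
The same request can be issued programmatically. A minimal sketch using the built-in fetch API (Node 18+); the response schema is not documented on this page, so the JSON is printed as-is.

```typescript
const res = await fetch(
  "https://glama.ai/api/mcp/v1/servers/theburgerllc/ai-development-pipeline-mcp",
);
if (!res.ok) throw new Error(`Request failed: ${res.status}`);
console.log(await res.json());
```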

If you have feedback or need assistance with the MCP directory API, please join our Discord server.