
AI Development Pipeline MCP

by theburgerllc

Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `VERCEL_TOKEN` | Yes | Your Vercel authentication token | |
| `VERCEL_PROJECT_ID` | Yes | Your Vercel project ID | |
| `AIRTABLE_API_KEY` | Yes | Your Airtable API key | |
| `AIRTABLE_BASE_ID` | Yes | Your Airtable base ID | |
| `AIRTABLE_TABLE_NAME` | Yes | Your Airtable table name | |
| `ANALYTICS_SECRET` | Yes | Your analytics secret key | |
| `NEXT_PUBLIC_APP_URL` | Yes | The public URL of your application | |
| `SQUARE_ACCESS_TOKEN` | Yes | Your Square access token | |
| `SQUARE_APPLICATION_ID` | Yes | Your Square application ID | |
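For local development, these variables are typically supplied through an `.env` file. A minimal sketch (every value below is a placeholder, not a real credential, and the table name `Leads` is an assumption):

```
# .env (placeholder values only; substitute your own credentials)
VERCEL_TOKEN=your-vercel-token
VERCEL_PROJECT_ID=your-vercel-project-id
AIRTABLE_API_KEY=your-airtable-api-key
AIRTABLE_BASE_ID=your-airtable-base-id
AIRTABLE_TABLE_NAME=Leads
ANALYTICS_SECRET=change-me
NEXT_PUBLIC_APP_URL=https://example.com
SQUARE_ACCESS_TOKEN=your-square-access-token
SQUARE_APPLICATION_ID=your-square-application-id
```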

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions the server exposes to the LLM so it can take actions

| Name | Description |
| --- | --- |
| `read_project_file` | Read a local file from the VS Code workspace (restricted to workspace directory) |
| `write_project_file` | Write to a local file in the VS Code workspace (restricted to workspace directory) |
| `run_shell_command` | Run a whitelisted shell command in the workspace (npm, yarn, git, node, npx, tsc, eslint, prettier) |
| `check_file_exists` | Check if a local file exists (restricted to workspace directory) |
| `list_directory_files` | List files in a workspace directory (restricted to workspace directory) |
| `run_augment_prompt` | Send a prompt to the local Augment coding agent |
| `run_project_tests` | Run project tests (npm test, yarn test, etc.) |
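MCP tools are invoked through a standard JSON-RPC 2.0 `tools/call` request. A hedged sketch of calling `run_shell_command` (the argument name `command` is an assumption; confirm the actual input schema with a `tools/list` request first):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_shell_command",
    "arguments": {
      "command": "npm test"
    }
  }
}
```

Because the whitelist covers only npm, yarn, git, node, npx, tsc, eslint, and prettier, a request for any other command should be rejected by the server.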

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources


MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/theburgerllc/ai-development-pipeline-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.