
RunPod MCP Server

by runpod
MIT License
Platforms:
  • Apple
  • Linux

create-pod

Deploys a GPU-enabled container pod on RunPod infrastructure. You specify the Docker image, GPU requirements, storage, and network settings for the compute workload you want to run.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| cloudType | No | SECURE or COMMUNITY cloud | |
| containerDiskInGb | No | Container disk size in GB | |
| dataCenterIds | No | List of data centers | |
| env | No | Environment variables | |
| gpuCount | No | Number of GPUs | |
| gpuTypeIds | No | List of acceptable GPU types | |
| imageName | Yes | Docker image to use | |
| name | No | Name for the pod | |
| ports | No | Ports to expose (e.g., '8888/http', '22/tcp') | |
| volumeInGb | No | Volume size in GB | |
| volumeMountPath | No | Path to mount the volume | |

Input Schema (JSON Schema)

{ "properties": { "cloudType": { "description": "SECURE or COMMUNITY cloud", "enum": [ "SECURE", "COMMUNITY" ], "type": "string" }, "containerDiskInGb": { "description": "Container disk size in GB", "type": "number" }, "dataCenterIds": { "description": "List of data centers", "items": { "type": "string" }, "type": "array" }, "env": { "additionalProperties": { "type": "string" }, "description": "Environment variables", "type": "object" }, "gpuCount": { "description": "Number of GPUs", "type": "number" }, "gpuTypeIds": { "description": "List of acceptable GPU types", "items": { "type": "string" }, "type": "array" }, "imageName": { "description": "Docker image to use", "type": "string" }, "name": { "description": "Name for the pod", "type": "string" }, "ports": { "description": "Ports to expose (e.g., '8888/http', '22/tcp')", "items": { "type": "string" }, "type": "array" }, "volumeInGb": { "description": "Volume size in GB", "type": "number" }, "volumeMountPath": { "description": "Path to mount the volume", "type": "string" } }, "required": [ "imageName" ], "type": "object" }

MCP directory API

We provide all information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/runpod/runpod-mcp'
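
The same request can be made from TypeScript; this sketch assumes Node 18+ (or any runtime with a global fetch) and hits the endpoint shown in the curl example above.

// Fetch the directory entry for the RunPod MCP server.
const res = await fetch("https://glama.ai/api/mcp/v1/servers/runpod/runpod-mcp");
if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
const server = await res.json();
console.log(server);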

If you have feedback or need assistance with the MCP directory API, please join our Discord server.