OpenAI MCP Server

server.json (2.25 kB)
{ "$schema": "https://registry.nimbletools.ai/schemas/2025-09-22/nimbletools-server.schema.json", "name": "ai.nimbletools/openai", "version": "1.0.0", "description": "OpenAI API: chat, embeddings, DALL-E, TTS, Whisper, vision, and moderation", "status": "active", "repository": { "url": "https://github.com/NimbleBrainInc/mcp-openai", "source": "github", "branch": "main" }, "websiteUrl": "https://platform.openai.com/", "packages": [ { "registryType": "oci", "registryBaseUrl": "https://docker.io", "identifier": "nimbletools/mcp-openai", "version": "1.0.0", "transport": { "type": "streamable-http", "url": "https://mcp.nimbletools.ai/mcp" }, "environmentVariables": [ { "name": "OPENAI_API_KEY", "description": "OpenAI API key for accessing GPT models (get key at platform.openai.com/api-keys)", "isRequired": true, "isSecret": true, "example": "sk-..." } ] } ], "_meta": { "ai.nimbletools.mcp/v1": { "container": { "healthCheck": { "path": "/health", "port": 8000 } }, "capabilities": { "tools": true, "resources": false, "prompts": false }, "resources": { "limits": { "memory": "512Mi", "cpu": "500m" }, "requests": { "memory": "256Mi", "cpu": "100m" } }, "deployment": { "protocol": "http", "port": 8000, "mcpPath": "/mcp" }, "display": { "name": "OpenAI", "category": "ai-ml", "tags": [ "openai", "gpt", "gpt-4", "chat", "embeddings", "dall-e", "whisper", "tts", "vision", "requires-api-key" ], "branding": { "logoUrl": "https://static.nimbletools.ai/logos/openai.png", "iconUrl": "https://static.nimbletools.ai/icons/openai.png" }, "documentation": { "readmeUrl": "https://raw.githubusercontent.com/NimbleBrainInc/mcp-openai/main/README.md" } } } } }

MCP directory API

We expose all information about listed MCP servers through the MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/NimbleBrainInc/mcp-openai'
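The same lookup can be done programmatically. The response schema is not documented in this section, so the sketch below simply fetches the record and prints the raw JSON.

# Fetch this server's directory entry and pretty-print whatever comes back.
import json
import urllib.request

url = "https://glama.ai/api/mcp/v1/servers/NimbleBrainInc/mcp-openai"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

print(json.dumps(data, indent=2))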

If you have feedback or need assistance with the MCP directory API, please join our Discord server.