
MCP Server with OpenAI Integration

by code-wgl

Server Configuration

Environment variables used to configure the server; only OPENAI_API_KEY is required.

| Name              | Required | Description                                           | Default |
|-------------------|----------|-------------------------------------------------------|---------|
| MCP_PORT          | No       | Reserved for future transports                        | 7337    |
| LOG_LEVEL         | No       | Log level: fatal → trace                              | info    |
| UI_DEMO_PORT      | No       | Optional port for the browser UI demo                 | 4399    |
| OPENAI_API_KEY    | Yes      | API key for OpenAI                                    |         |
| MCP_SERVER_NAME   | No       | Name advertised to MCP clients                        |         |
| OPENAI_BASE_URL   | No       | Override base URL for Azure/OpenAI proxies            |         |
| MCP_TOOL_MODULES  | No       | Comma-separated absolute paths to extra tool modules  |         |
| OPENAI_TIMEOUT_MS | No       | Timeout (ms) applied to OpenAI API calls              | 20000   |
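
As a rough illustration, the sketch below shows how a Node/TypeScript process might read these variables and apply the defaults from the table. The `ServerConfig` shape and `loadConfig` helper are hypothetical; only the variable names and default values come from the table above, and the actual server may load its configuration differently.

```typescript
// config.ts — illustrative sketch only; not the server's actual loader.

export interface ServerConfig {
  mcpPort: number;
  logLevel: string;
  uiDemoPort: number;
  openaiApiKey: string;
  mcpServerName?: string;
  openaiBaseUrl?: string;
  mcpToolModules: string[];
  openaiTimeoutMs: number;
}

export function loadConfig(env: NodeJS.ProcessEnv = process.env): ServerConfig {
  const apiKey = env.OPENAI_API_KEY;
  if (!apiKey) {
    // OPENAI_API_KEY is the only required variable in the table above.
    throw new Error("OPENAI_API_KEY is required");
  }
  return {
    mcpPort: Number(env.MCP_PORT ?? 7337),
    logLevel: env.LOG_LEVEL ?? "info",
    uiDemoPort: Number(env.UI_DEMO_PORT ?? 4399),
    openaiApiKey: apiKey,
    mcpServerName: env.MCP_SERVER_NAME,
    openaiBaseUrl: env.OPENAI_BASE_URL,
    // MCP_TOOL_MODULES is a comma-separated list of absolute paths.
    mcpToolModules: (env.MCP_TOOL_MODULES ?? "")
      .split(",")
      .map((p) => p.trim())
      .filter(Boolean),
    openaiTimeoutMs: Number(env.OPENAI_TIMEOUT_MS ?? 20000),
  };
}
```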

Schema

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

Tools

Functions exposed to the LLM to take actions


No tools

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/code-wgl/McpServer'
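
For a programmatic lookup, a TypeScript equivalent of the curl call above might look like the sketch below (assuming Node 18+ with a global `fetch`). It only retrieves this server's entry and prints the JSON body; the response schema is not documented on this page, so no fields are assumed.

```typescript
// Illustrative sketch: fetch this server's entry from the Glama MCP directory API.
async function fetchServerEntry(): Promise<void> {
  const res = await fetch(
    "https://glama.ai/api/mcp/v1/servers/code-wgl/McpServer",
  );
  if (!res.ok) {
    throw new Error(`Directory API request failed: ${res.status}`);
  }
  const data: unknown = await res.json();
  // The response shape is not documented here, so just print it.
  console.log(JSON.stringify(data, null, 2));
}

fetchServerEntry().catch(console.error);
```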

If you have feedback or need assistance with the MCP directory API, please join our Discord server.