
Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| PORT | No | The port the server will run on. | 3001 |
| HEADLESS | No | Enables headless mode for browser automation. | true |
| OPENAI_MODEL | No | The primary OpenAI model to use. | gpt-4.1 |
| OPENAI_API_KEY | Yes | OpenAI API key required for AI-powered WordPress tasks. | |
| REQUIRE_API_KEY | No | Whether to require the master API key for requests (recommended for production). | true |
| CORS_ALLOW_ORIGIN | No | The allowed CORS origin. | * |
| OPENAI_MAX_TOKENS | No | The maximum number of tokens for OpenAI responses. | 4096 |
| OPENAI_NANO_MODEL | No | The model used for nano tasks. | gpt-4.1-nano |
| CORS_ALLOW_HEADERS | No | The allowed CORS headers. | Content-Type,Authorization,Accept,Origin,X-Requested-With,X-Api-Key |
| CORS_ALLOW_METHODS | No | The allowed CORS HTTP methods. | GET,HEAD,PUT,PATCH,POST,DELETE,OPTIONS |
| OPENAI_BASIC_MODEL | No | The model used for basic tasks. | gpt-4.1-mini |
| OPENAI_TEMPERATURE | No | The sampling temperature for OpenAI. | 0.7 |
| TANUKIMCP_MASTER_KEY | Yes | The master API key for the TanukiMCP server. | |
| OPENAI_ADVANCED_MODEL | No | The model used for advanced tasks. | gpt-4.1 |
| OPENAI_MAX_CONTEXT_TOKENS | No | The maximum number of context tokens for OpenAI. | 128000 |
| ANALYTICS_DETAILED_LOGGING | No | Whether to enable detailed analytics logging. | false |
| ANALYTICS_SAVE_INTERVAL_MS | No | The interval, in milliseconds, between analytics saves. | 300000 |
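As a minimal sketch, the two required variables (plus a few common overrides) could be exported before launching the server. The placeholder values and the exact startup command are assumptions — use your real keys and however the server is normally started:

```shell
# Required: without these the server cannot authenticate requests
# or reach OpenAI.
export OPENAI_API_KEY='sk-your-key-here'       # placeholder value
export TANUKIMCP_MASTER_KEY='change-me'        # master key clients must present

# Optional overrides (defaults shown in the table above).
export PORT=3001
export HEADLESS=true
export REQUIRE_API_KEY=true                    # recommended for production
export OPENAI_MODEL='gpt-4.1'
export OPENAI_TEMPERATURE=0.7

# Then start the server as usual (command depends on your installation).
```

Environment variables not set explicitly fall back to the defaults listed in the table.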

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions


No tools

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources


MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/AppleJax2/wordpress-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.