
Llama 4 Maverick MCP Server

by YobieBen

Server Configuration

Describes the environment variables used to configure the server. All of them are optional and fall back to the defaults shown below.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| SEED | No | Random seed for reproducible results | 42 |
| DEBUG | No | Enable debug mode | false |
| TOP_K | No | Top-k sampling parameter (1-100) | 40 |
| TOP_P | No | Top-p sampling parameter (0.0-1.0) | 0.9 |
| CACHE_TTL | No | Cache time-to-live in seconds | 3600 |
| PYTHONPATH | No | Python path for module imports | (none) |
| TEMPERATURE | No | Model temperature setting (0.0-2.0) | 0.7 |
| ENABLE_VISION | No | Enable vision processing capabilities | false |
| LLAMA_API_KEY | No | Optional API key for Ollama authentication | (none) |
| LLAMA_API_URL | No | URL for the Ollama API server | http://localhost:11434 |
| MCP_LOG_LEVEL | No | Logging level for the MCP server | INFO |
| CACHE_MAX_SIZE | No | Maximum cache size | 1000 |
| REPEAT_PENALTY | No | Repetition penalty for text generation | 1.1 |
| MCP_SERVER_HOST | No | Host address for the MCP server | localhost |
| MCP_SERVER_PORT | No | Port number for the MCP server | 3000 |
| VERBOSE_LOGGING | No | Enable verbose logging | false |
| ENABLE_STREAMING | No | Enable streaming support for real-time token generation | true |
| LLAMA_MODEL_NAME | No | Name of the Llama model to use | llama3:latest |
| ALLOW_FILE_WRITES | No | Allow file write operations | true |
| ENABLE_WEB_SEARCH | No | Enable web search functionality | true |
| MAX_CONTEXT_LENGTH | No | Maximum context length for the model | 128000 |
| REQUEST_TIMEOUT_MS | No | Request timeout in milliseconds | 30000 |
| ENABLE_CODE_EXECUTION | No | Enable code execution (security risk) | false |
| FILE_SYSTEM_BASE_PATH | No | Base path for file system operations | (none) |
| ENABLE_FUNCTION_CALLING | No | Enable function calling capabilities | true |
| MAX_CONCURRENT_REQUESTS | No | Maximum number of concurrent requests | 10 |
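
These settings are plain process environment variables. As a rough sketch of how a launcher might consume them (not the project's actual startup code), the snippet below reads a subset of the variables with the defaults documented in the table; the `env` helper and the `config` dictionary are illustrative names, not part of the server's API.

```python
import os

def env(name: str, default: str) -> str:
    """Read a configuration value from the environment, falling back to the documented default."""
    return os.getenv(name, default)

# Defaults mirror the configuration table above; the structure itself is illustrative.
config = {
    "LLAMA_API_URL": env("LLAMA_API_URL", "http://localhost:11434"),
    "LLAMA_MODEL_NAME": env("LLAMA_MODEL_NAME", "llama3:latest"),
    "MCP_SERVER_HOST": env("MCP_SERVER_HOST", "localhost"),
    "MCP_SERVER_PORT": int(env("MCP_SERVER_PORT", "3000")),
    "TEMPERATURE": float(env("TEMPERATURE", "0.7")),
    "TOP_K": int(env("TOP_K", "40")),
    "TOP_P": float(env("TOP_P", "0.9")),
    "MAX_CONTEXT_LENGTH": int(env("MAX_CONTEXT_LENGTH", "128000")),
    "ENABLE_STREAMING": env("ENABLE_STREAMING", "true").lower() == "true",
    "ENABLE_CODE_EXECUTION": env("ENABLE_CODE_EXECUTION", "false").lower() == "true",
}
```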

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

No tools

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

```
curl -X GET 'https://glama.ai/api/mcp/v1/servers/YobieBen/llama4-maverick-mcp-python'
```
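
The same lookup can be made from Python. A minimal sketch using the requests library, assuming the endpoint above returns JSON:

```python
import requests

# Query the Glama MCP directory API for this server's metadata.
response = requests.get(
    "https://glama.ai/api/mcp/v1/servers/YobieBen/llama4-maverick-mcp-python",
    timeout=30,
)
response.raise_for_status()
print(response.json())
```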

If you have feedback or need assistance with the MCP directory API, please join our Discord server.