
Llama Maverick Hub MCP Server

by YobieBen

Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| GITHUB_TOKEN | No | Your GitHub token | |
| LLAMA_HUB_NAME | No | Hub name | llama-maverick-hub |
| LLAMA_HUB_PORT | No | Hub port | 8080 |
| STRIPE_API_KEY | No | Your Stripe API key | |
| LLAMA_HUB_API_KEYS | No | API keys for security (comma-separated) | |
| LLAMA_HUB_LOG_LEVEL | No | Hub log level | info |
| LLAMA_HUB_ENABLE_AUTH | No | Enable authentication | false |
| LLAMA_HUB_LLAMA_MODEL | No | Llama model | llama3.2 |
| LLAMA_HUB_LLAMA_BASE_URL | No | Llama base URL | http://localhost:11434 |
| LLAMA_HUB_SERVICE_GITHUB_ENABLED | No | Enable the GitHub service | true |
| LLAMA_HUB_SERVICE_STRIPE_COMMAND | No | Stripe service command | npx |
| LLAMA_HUB_SERVICE_STRIPE_ENABLED | No | Enable the Stripe service | true |
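Since every variable is optional, a minimal setup only overrides what differs from the defaults. A sketch, assuming a POSIX shell (the specific values below are illustrative, not required):

```shell
# Override selected defaults from the table above; unset variables fall
# back to their documented defaults (e.g. port 8080, log level "info").
export LLAMA_HUB_PORT=8080
export LLAMA_HUB_LOG_LEVEL=debug
export LLAMA_HUB_LLAMA_MODEL=llama3.2
export LLAMA_HUB_LLAMA_BASE_URL=http://localhost:11434

# Enabling auth only takes effect with at least one API key configured.
export LLAMA_HUB_ENABLE_AUTH=true
export LLAMA_HUB_API_KEYS="key-one,key-two"

echo "hub will listen on port $LLAMA_HUB_PORT"
```

Note that `LLAMA_HUB_API_KEYS` is comma-separated, so multiple keys go in a single quoted value.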

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

No tools

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/YobieBen/llama-maverick-hub-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.