
Self-hosted LLM MCP Server

Server Configuration

Describes the environment variables used to configure the server; only the Supabase variables are required.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| LLM_MODEL | No | The LLM model to use | llama2 |
| LOG_LEVEL | No | Logging level (debug, info, warn, error) | info |
| LOG_FORMAT | No | Log format (text or json) | json |
| LLM_TIMEOUT | No | Timeout for LLM requests in milliseconds | 30000 |
| LLM_BASE_URL | No | Base URL for the self-hosted LLM service | http://localhost:11434 |
| SUPABASE_URL | Yes | Your Supabase project URL | (none) |
| MCP_SERVER_HOST | No | Host for the MCP server | localhost |
| MCP_SERVER_PORT | No | Port for the MCP server | 3000 |
| SUPABASE_ANON_KEY | Yes | Your Supabase anonymous key | (none) |
| SUPABASE_SERVICE_ROLE_KEY | Yes | Your Supabase service role key | (none) |
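As a minimal sketch, the variables above could be set in the shell before starting the server. The Supabase values below are hypothetical placeholders (substitute your own project credentials); the optional variables are shown with the defaults from the table, so exporting them is only needed to override those defaults:

```shell
# Required: Supabase project credentials (placeholder values, replace with your own)
export SUPABASE_URL="https://your-project.supabase.co"
export SUPABASE_ANON_KEY="your-anon-key"
export SUPABASE_SERVICE_ROLE_KEY="your-service-role-key"

# Optional: shown here with their documented defaults
export LLM_BASE_URL="http://localhost:11434"
export LLM_MODEL="llama2"
export LLM_TIMEOUT="30000"
export MCP_SERVER_HOST="localhost"
export MCP_SERVER_PORT="3000"
export LOG_LEVEL="info"
export LOG_FORMAT="json"
```

An `.env` file with the same key=value pairs would work equally well if the server loads one at startup.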

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM so it can take actions

No tools

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Krishnahuex28/MCP'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.