Server Configuration

Describes the environment variables used to configure the server. All variables are optional; where no default is shown, the setting is unset unless provided.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| NLM_MCP_BASE_URL | No | Public URL for the server | |
| NLM_MCP_DATA_DIR | No | Runtime data directory | ~/.local/share/nlm-mcp |
| NLM_MCP_AUTH_MODE | No | Authentication mode: none, token, or github-oauth | none |
| NLM_MCP_HTTP_HOST | No | HTTP bind host | 0.0.0.0 |
| NLM_MCP_HTTP_PATH | No | MCP endpoint path | /mcp |
| NLM_MCP_HTTP_PORT | No | HTTP bind port | 8080 |
| NLM_MCP_LOG_LEVEL | No | Log level | INFO |
| NLM_MCP_TRANSPORT | No | Transport mode: stdio or http | stdio |
| NLM_MCP_LOG_FORMAT | No | Log format: json or console | json |
| NLM_MCP_BEARER_TOKEN | No | Bearer token for token authentication | |
| NLM_MCP_GITHUB_CLIENT_ID | No | GitHub OAuth client ID | |
| NLM_MCP_OAUTH_ALLOWED_USERS | No | Comma-separated list of allowed GitHub usernames | |
| NLM_MCP_GITHUB_CLIENT_SECRET | No | GitHub OAuth client secret | |
| NLM_MCP_NOTEBOOKLM_AUTH_FILE | No | Path to NotebookLM auth file | ~/.config/nlm-mcp/notebooklm_auth.json |
| NLM_MCP_NOTEBOOKLM_AUTH_JSON | No | Inline NotebookLM auth JSON (secret) | |
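As a minimal sketch, the variables above could be combined to run the server over HTTP with bearer-token authentication. The `nlm-mcp` launch command below is a placeholder (the actual entry point is not documented here); substitute however you start the server.

```shell
# Run over HTTP with bearer-token auth (sketch; launch command is a placeholder).
export NLM_MCP_TRANSPORT=http
export NLM_MCP_HTTP_HOST=0.0.0.0
export NLM_MCP_HTTP_PORT=8080
export NLM_MCP_HTTP_PATH=/mcp
export NLM_MCP_AUTH_MODE=token
export NLM_MCP_BEARER_TOKEN="$(openssl rand -hex 32)"  # any strong secret
export NLM_MCP_LOG_FORMAT=console

nlm-mcp  # placeholder; the server would then listen at http://0.0.0.0:8080/mcp
```

Clients would then send `Authorization: Bearer <token>` on each request; with `NLM_MCP_AUTH_MODE=github-oauth`, the `NLM_MCP_GITHUB_CLIENT_ID`, `NLM_MCP_GITHUB_CLIENT_SECRET`, and `NLM_MCP_OAUTH_ALLOWED_USERS` variables apply instead.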

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions.

No tools

Prompts

Interactive templates invoked by user choice.

No prompts

Resources

Contextual data attached and managed by the client.

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/oaslananka/notebooklm-mcp-pro'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.