Server Configuration

Describes the environment variables used to configure and run the server.

Name | Required | Description | Default
NODE_ENV | No | Environment mode (e.g., 'production', 'development') | (none)
LOG_LEVEL | No | Logging level (e.g., 'error', 'info', 'debug') | 'error' in production or when NODE_ENV is unset; 'info' in development
LOGSEQ_API_URL | Yes | The URL of the Logseq HTTP API server | http://127.0.0.1:12315
LOGSEQ_TIMEOUT | No | Request timeout in milliseconds | 30000
LOGSEQ_API_TOKEN | Yes | Your Logseq API token (generated in Logseq: Settings → Features → HTTP APIs server → API → Authorization tokens → Add new token) | (none)
LOGSEQ_MAX_RETRIES | No | Maximum number of retry attempts for failed requests | 5
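
A minimal launch sketch, assuming the server is started from a shell: the `logseq-mcp-server` command below is a placeholder for whatever entry point you actually use, and the optional values shown are the documented defaults.

export LOGSEQ_API_TOKEN='your-token-here'        # required: token created in Logseq under Settings → Features → HTTP APIs server
export LOGSEQ_API_URL='http://127.0.0.1:12315'   # required: Logseq HTTP API endpoint (documented default shown)
export LOGSEQ_TIMEOUT=30000                      # optional: request timeout in milliseconds
export LOGSEQ_MAX_RETRIES=5                      # optional: retry attempts for failed requests
export LOG_LEVEL=info                            # optional: 'error', 'info', or 'debug'
logseq-mcp-server                                # placeholder launch command; substitute your actual entry point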

Tools

Functions exposed to the LLM to take actions

No tools

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/eugeneyvt/logseq-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.