lotus-wisdom-mcp
by linxule

Server Configuration

Describes the environment variables used to configure the server.

Name          Required   Description                          Default
LOTUS_DEBUG   No         Enable debug mode for development    false
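
For example, an MCP client can set this variable in the environment of the server process it spawns. The sketch below uses the MCP TypeScript SDK's stdio transport; the launch command and package name are assumptions for illustration and may not match how this server is actually published.

import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server with debug mode enabled.
// NOTE: "npx lotus-wisdom-mcp" is a hypothetical launch command; use the
// command given in the server's own README.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["lotus-wisdom-mcp"],
  env: { LOTUS_DEBUG: "true" }, // omit, or set to "false" (the default), to disable debug mode
});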

Tools

Functions exposed to the LLM so it can take actions.

lotuswisdom

Contemplative reasoning tool. Use for complex problems needing multi-perspective understanding, contradictions requiring integration, or questions holding their own wisdom.

Workflow: Always start with tag='begin' (returns framework). Then continue with contemplation tags. Do NOT output wisdom until status='WISDOM_READY'.

Tags: begin (first call; receives the framework), then:
  open / engage / express (process)
  examine / reflect / verify / refine / complete (meta-cognitive)
  recognize / transform / integrate / transcend / embody (non-dual)
  upaya / expedient / direct / gradual / sudden (skillful means)
  meditate (pause)
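
To make the workflow concrete, here is a minimal client-side sketch in TypeScript using the official MCP SDK. Argument names other than tag are not documented on this page, so any additional required fields are omitted and would need to be filled in from the tool's input schema.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport); // transport from the configuration sketch above

// Step 1: always start with tag='begin' to receive the framework.
const framework = await client.callTool({
  name: "lotuswisdom",
  arguments: { tag: "begin" }, // other required arguments, if any, are omitted here
});

// Subsequent calls continue with contemplation tags (open, engage, express, ...),
// and the wisdom is only output once a response reports status='WISDOM_READY'.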

lotuswisdom_summary

Get a summary of the current contemplative journey
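
A matching call sketch for the summary tool, using the same client as above and assuming it takes no arguments (check the tool's input schema to confirm):

// Retrieve a summary of the contemplative journey so far.
const summary = await client.callTool({
  name: "lotuswisdom_summary",
  arguments: {},
});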

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/linxule/lotus-wisdom-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.