Glama

Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
| ---- | -------- | ----------- | ------- |
| OPENAI_API_KEY | No | OpenAI API key (`sk-...`). Used for `gpt-*` / `openai-*` / `o1-*` / `o3-*` / `o4-*` models. BYOK; Lucairn does not store it. | |
| LUCAIRN_API_KEY | No | Lucairn API key starting with `lcr_live_`. Get one free at https://lucairn.eu/account/signup | |
| ANTHROPIC_API_KEY | No | Anthropic API key (`sk-ant-...`). Used for `claude-*` / `anthropic-*` models. BYOK; Lucairn does not store it. | |
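Since all three variables are optional, a client may want to check which providers are usable before starting the server. A minimal sketch (the `configured_providers` helper is illustrative, not part of the SDK):

```python
import os

# Check which BYOK keys are present in the environment.
# Variable names are exactly those documented in the table above;
# all three are optional, so any subset may be configured.
def configured_providers(env=os.environ):
    keys = {
        "openai": "OPENAI_API_KEY",
        "lucairn": "LUCAIRN_API_KEY",
        "anthropic": "ANTHROPIC_API_KEY",
    }
    return {name: var in env for name, var in keys.items()}
```

Requests routed to a provider whose key is missing will fail upstream, so surfacing this early gives a clearer error.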

Capabilities

Features and capabilities supported by this server

| Capability | Details |
| ---------- | ------- |
| tools | `{}` |

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| ---- | ----------- |
| chat_via_lucairn | Send a chat request through the Lucairn privacy gateway with cross-provider BYOK (Anthropic + OpenAI). PII is detected and replaced with placeholders before reaching the upstream LLM. The gateway picks the upstream provider based on the `model` parameter: `claude-*` / `anthropic-*` use ANTHROPIC_API_KEY; `gpt-*` / `openai-*` / `o1-*` / `o3-*` / `o4-*` use OPENAI_API_KEY. Wire format follows the Anthropic Messages API. Developer-tier responses contain raw placeholders; Pro and Enterprise tiers can enable automatic re-linking back to the original values. |
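The prefix-based routing described above can be sketched as follows (the `route_model` function and the example model name are illustrative; the mapping itself is taken from the description):

```python
# Sketch of the documented routing rule: the gateway selects the upstream
# provider key from the model name's prefix.
def route_model(model: str) -> str:
    if model.startswith(("claude-", "anthropic-")):
        return "ANTHROPIC_API_KEY"
    if model.startswith(("gpt-", "openai-", "o1-", "o3-", "o4-")):
        return "OPENAI_API_KEY"
    raise ValueError(f"unrecognized model prefix: {model}")

# Request bodies follow the Anthropic Messages API wire format, e.g.:
request = {
    "model": "claude-3-5-sonnet-latest",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Summarize this email ..."}],
}
```

Because the wire format is the Anthropic Messages API shape for both providers, callers do not change the request structure when switching between `claude-*` and `gpt-*` models, only the `model` value.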

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/Declade/lucairn-sdks'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.