
Server Configuration

Environment variables recognized by the server. All are optional.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `KIT_MCP_KIT_ROOT` | No | Path to a custom kit folder. Overrides the bundled kit. | |
| `ANTHROPIC_API_KEY` | No | Anthropic API key for the LLM-driven reflect feature. | |
| `KIT_REFLECT_MODEL` | No | Override the model used for reflect (default depends on implementation). | |
| `KIT_REFLECT_MAX_TOKENS` | No | Override the maximum number of tokens for reflect. | |
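Since all of these variables are optional, a typical setup exports only the ones you need before launching the server. This is a minimal sketch; the launch command, API key, and model id below are placeholders, not values confirmed by this page:

```shell
# Point the server at a custom kit instead of the bundled one.
export KIT_MCP_KIT_ROOT="$HOME/my-kit"

# Only needed if you use the LLM-driven reflect feature.
export ANTHROPIC_API_KEY="sk-ant-placeholder"   # placeholder key

# Optional reflect overrides (model id here is hypothetical).
export KIT_REFLECT_MODEL="claude-3-5-haiku-latest"
export KIT_REFLECT_MAX_TOKENS=2048

# kit-mcp   # placeholder launch command; the actual entrypoint may differ
```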

Capabilities

Features and capabilities supported by this server

| Capability | Details |
| --- | --- |
| tools | `{}` |

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| --- | --- |
| `kit` | Browse the personal kit: agents, commands, skills. |
| `sync` | Project the kit into an IDE-specific layout (markdown references by default). |
| `reverse-sync` | Detect and apply edits made directly in an IDE back to the canonical `kit/`. |
| `gates` | List, fetch, or execute reusable workflow gates (regression, confidence, etc.). |
| `forensics` | Failure dataset and replays; close the learning loop on failed agent runs. |
| `install` | Register this kit-mcp server in an IDE's MCP config (Claude/Cursor/Codex/Gemini/Windsurf). |
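The `install` tool writes a server entry into an IDE's MCP configuration. The exact file and shape vary by IDE; the fragment below is a sketch of the common `mcpServers` format used by several of the listed clients, with a hypothetical command name and path:

```json
{
  "mcpServers": {
    "kit-mcp": {
      "command": "kit-mcp",
      "env": {
        "KIT_MCP_KIT_ROOT": "/path/to/kit"
      }
    }
  }
}
```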

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/luanpdd/kit-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.