Glama

Server Configuration

Describes the environment variables required to run the server.

No arguments; this server requires no environment variables.

Capabilities

Features and capabilities supported by this server

| Capability | Details |
| --- | --- |
| tools | `{ "listChanged": false }` |
| prompts | `{ "listChanged": false }` |
| resources | `{ "subscribe": false, "listChanged": false }` |
| experimental | `{}` |
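The capabilities above are what an MCP client receives from this server during the `initialize` handshake. A minimal sketch of checking those flags before relying on a feature (the `supports` helper is illustrative, not part of any SDK):

```python
import json

# Capabilities payload as advertised by the server (copied from the table above).
raw = """
{
  "tools": {"listChanged": false},
  "prompts": {"listChanged": false},
  "resources": {"subscribe": false, "listChanged": false},
  "experimental": {}
}
"""

caps = json.loads(raw)

def supports(caps: dict, path: str) -> bool:
    """Walk a dotted path like 'resources.subscribe'; missing keys mean unsupported."""
    node = caps
    for key in path.split("."):
        if not isinstance(node, dict) or key not in node:
            return False
        node = node[key]
    return bool(node)

print(supports(caps, "resources.subscribe"))  # resource subscriptions unavailable
print(supports(caps, "tools.listChanged"))    # tool list is static
```

Since every `listChanged` flag is `false`, a client can cache the tool and prompt lists for the lifetime of the session.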

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| --- | --- |
| `cluster_of_file` | Get architectural cluster data for a file path. |
| `boundary_edges` | Get fuse boundary edges touching a file or cluster. |
| `high_coupling_seams` | Get the strongest cross-cluster seams for a file/cluster or the whole repo. |
| `impact_neighbors` | Get the likely architectural blast radius from a symbol/file. |
| `search_symbols` | Search symbols via the index DB if available, with a graph fallback. |
| `architecture_summary` | Get a compact architecture summary for context injection. |
| `classify_prompt` | Classify a prompt into scope and retrieval requirements. |
| `guided_arch_context` | Get staged architecture context with scope-first escalation gates. |
| `task_fingerprint` | Translate natural-language task text into COV tokens. |
| `behavioral_twins` | Return the top behavioral twin candidates ranked by COV overlap. |
| `twin_context` | Return task COV + top twins + seam + rubric for implementation guidance. |
| `reload_artifacts` | Reload graph and fuse artifacts from disk. |
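An MCP client invokes any of the tools above with a standard JSON-RPC `tools/call` request. A hedged sketch for `cluster_of_file`; the argument name `path` is an assumption here, so check the tool's actual input schema (via `tools/list`) before relying on it:

```python
import json

# Sketch of the JSON-RPC request an MCP client sends to invoke cluster_of_file.
# The "jsonrpc"/"method"/"params" envelope follows the MCP specification;
# the argument key "path" is a hypothetical illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "cluster_of_file",
        "arguments": {"path": "src/server.py"},  # hypothetical argument key
    },
}

print(json.dumps(request, indent=2))
```

The server replies with a `result.content` array; in practice you would send this over the transport (stdio or HTTP) managed by your MCP client library rather than constructing it by hand.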

Prompts

Interactive templates invoked by user choice

No prompts.

Resources

Contextual data attached and managed by the client

No resources.

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ahmedxuhri/bigindexer'
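The same query from Python, using only the standard library. This sketch builds the request without sending it; uncomment the last line to perform the actual fetch:

```python
from urllib import request

# Python equivalent of the curl command above: fetch this server's entry
# from the Glama MCP directory API.
url = "https://glama.ai/api/mcp/v1/servers/ahmedxuhri/bigindexer"
req = request.Request(url, method="GET")

print(req.full_url)       # the endpoint that would be queried
print(req.get_method())   # GET
# body = request.urlopen(req).read()  # uncomment to perform the fetch
```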

If you have feedback or need assistance with the MCP directory API, please join our Discord server.