
llm-context

by cyberchitta
exc-base.md (546 B)
---
description: Base excerpt mode mappings and default configurations
excerpt-modes:
  "*.py": code-outliner
  "*.js": code-outliner
  "*.ts": code-outliner
  "*.jsx": code-outliner
  "*.tsx": code-outliner
  "*.java": code-outliner
  "*.cpp": code-outliner
  "*.c": code-outliner
  "*.cs": code-outliner
  "*.go": code-outliner
  "*.rs": code-outliner
  "*.rb": code-outliner
  "*.php": code-outliner
  "*.ex": code-outliner
  "*.elm": code-outliner
  "*.svelte": sfc
excerpt-config:
  sfc:
    with-style: false
    with-template: false
---
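This frontmatter maps filename globs to excerpt modes: most source files use code-outliner, while Svelte single-file components use the sfc mode with styles and templates excluded. Resolving a mode for a given file is just glob matching with a fallback. The sketch below illustrates that lookup only; it does not reflect llm-context's internal API, and the resolve_excerpt_mode function and the "none" default are hypothetical names used for illustration:

```python
from fnmatch import fnmatch
from pathlib import PurePath

# A subset of the excerpt-modes mapping from exc-base.md above;
# the remaining globs follow the same pattern.
EXCERPT_MODES = {
    "*.py": "code-outliner",
    "*.ts": "code-outliner",
    "*.rs": "code-outliner",
    "*.go": "code-outliner",
    "*.svelte": "sfc",
}

# Per-mode options, mirroring the excerpt-config block.
EXCERPT_CONFIG = {
    "sfc": {"with-style": False, "with-template": False},
}


def resolve_excerpt_mode(path: str, default: str = "none") -> str:
    """Return the mode whose glob matches the file name, else a fallback.

    The `default` fallback is an assumption for this sketch; exc-base.md
    does not define what happens when no pattern matches.
    """
    name = PurePath(path).name
    for pattern, mode in EXCERPT_MODES.items():
        if fnmatch(name, pattern):
            return mode
    return default


if __name__ == "__main__":
    print(resolve_excerpt_mode("src/App.svelte"))  # sfc
    print(resolve_excerpt_mode("lib/utils.py"))    # code-outliner
    print(resolve_excerpt_mode("README.md"))       # none (fallback)
```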

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/cyberchitta/llm-context.py'
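The same request can be made programmatically; this is a minimal Python equivalent of the curl call above, assuming only that the endpoint returns JSON (the response schema is not shown on this page):

```python
import json
import urllib.request

# Same endpoint as the curl command above.
URL = "https://glama.ai/api/mcp/v1/servers/cyberchitta/llm-context.py"

with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)

# Pretty-print whatever the API returns.
print(json.dumps(data, indent=2))
```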

If you have feedback or need assistance with the MCP directory API, please join our Discord server.