
llm-context

by cyberchitta
exceptions.py (339 B)
class LLMContextError(Exception):
    def __init__(self, message: str, error_type: str):
        self.message = message
        self.error_type = error_type
        super().__init__(self.message)


class RuleResolutionError(LLMContextError):
    def __init__(self, message: str):
        super().__init__(message, "RULE_RESOLUTION_ERROR")
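
These two classes form the package's error hierarchy: LLMContextError pairs a human-readable message with a machine-readable error_type, and RuleResolutionError fixes that type to "RULE_RESOLUTION_ERROR". A minimal sketch of how calling code might raise and handle these exceptions; the resolve_rule function and rule names below are hypothetical illustrations, not part of llm-context:

# Hypothetical caller: raise RuleResolutionError when a named rule is unknown,
# and catch via the LLMContextError base class to read the error_type tag.
def resolve_rule(name: str, known_rules: dict) -> str:
    if name not in known_rules:
        raise RuleResolutionError(f"rule '{name}' not found")
    return known_rules[name]

try:
    resolve_rule("lc-missing", {"lc-code": "..."})
except LLMContextError as e:
    print(f"[{e.error_type}] {e.message}")  # [RULE_RESOLUTION_ERROR] rule 'lc-missing' not found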

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/cyberchitta/llm-context.py'
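
If you prefer to query the directory from Python instead of curl, here is a minimal sketch using only the standard library against the same endpoint; the exact fields in the JSON response are not documented here, so treat the output as whatever metadata the API returns:

# Minimal sketch: fetch this server's directory entry from the Glama MCP API.
# The response is assumed to be JSON; its exact schema is an assumption.
import json
import urllib.request

url = "https://glama.ai/api/mcp/v1/servers/cyberchitta/llm-context.py"
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read().decode("utf-8"))
print(json.dumps(data, indent=2))  # inspect the returned server metadata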

If you have feedback or need assistance with the MCP directory API, please join our Discord server.