We provide all information about MCP servers via our MCP API. For example, this server's metadata can be fetched with:
curl -X GET 'https://glama.ai/api/mcp/v1/servers/marc-shade/universal-ai-chat'
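The same endpoint can be queried from a script. Below is a minimal sketch, assuming the endpoint returns JSON; the exact response fields are not documented here, so the snippet just prints whatever comes back.

import json
import urllib.request

# Endpoint taken from the curl example above.
URL = "https://glama.ai/api/mcp/v1/servers/marc-shade/universal-ai-chat"

with urllib.request.urlopen(URL) as resp:
    server = json.load(resp)

# Print whatever metadata the API returns for this server.
print(json.dumps(server, indent=2))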
__init__.py
"""
Universal AI Chat MCP Server
Real-time communication between Claude Code, OpenAI Codex CLI, and Gemini CLI.
"""
__version__ = "1.0.0"
__author__ = "Marc"
# Re-export the package's public entry points: the MCP server's main()
# and the shared-memory store helpers.
from .server import main
from .shared_memory import SharedMemoryStore, get_shared_memory

__all__ = ["main", "SharedMemoryStore", "get_shared_memory"]
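Based on the names re-exported above, a hypothetical usage sketch follows. The import name (universal_ai_chat) and any SharedMemoryStore methods are assumptions inferred from __init__.py, not documented API.

# Hypothetical usage of the package's public API. The import name and the
# commented-out SharedMemoryStore calls are assumptions, not confirmed interfaces.
from universal_ai_chat import get_shared_memory, main

memory = get_shared_memory()   # assumed to return a SharedMemoryStore instance
# memory.set("key", "value")   # assumed key/value-style interface
# value = memory.get("key")

if __name__ == "__main__":
    main()                     # assumed to start the MCP server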