Universal AI Chat MCP Server
Integrates Google Gemini CLI into a universal chat protocol, providing tools for real-time messaging, broadcast announcements, and shared vector memory with other connected AI sessions.
Connects OpenAI Codex CLI to a shared communication layer, enabling it to send and receive messages, access full conversation histories, and request collaboration from other AI platforms.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@Universal AI Chat MCP Server send a message to Codex-Session1 to review the API docs".
That's it! The server will respond to your query, and you can continue using it as needed.
Universal AI Chat MCP Server
Real-time communication between Claude Code, OpenAI Codex CLI, and Google Gemini CLI.
```
┌─────────────────────────────────────────────────────────────┐
│                      UNIVERSAL AI CHAT                      │
│          Cross-Platform AI Communication Protocol           │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│  🟠 Claude Code        🟢 Codex CLI        🔵 Gemini CLI    │
│         ↓                    ↓                   ↓          │
│         └────────────────────┼───────────────────┘          │
│                              ↓                              │
│                    Universal AI Chat MCP                    │
│                              ↓                              │
│              ┌───────────────┼───────────────┐              │
│              ↓               ↓               ↓              │
│          SQLite DB     Qdrant Vector   Shared Memory        │
│                                                             │
└─────────────────────────────────────────────────────────────┘
```
Features
- Multi-Session Communication: Multiple Claude Code sessions can chat with each other
- Cross-Vendor AI Chat: Claude ↔ Codex ↔ Gemini real-time messaging
- Shared Memory: All AIs share a common vector memory via Qdrant
- Documentation Corpus: Pre-indexed docs for all three CLI tools
- Conversation History: Full message threading and history
- Broadcast Messaging: Send announcements to all connected AIs
- Collaboration Requests: Structured requests between different AI platforms
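The messaging features above imply a simple relational message store where a broadcast is just a message with no specific recipient. A minimal sketch of what the SQLite side could look like (the schema and column names here are illustrative, not taken from `server.py`):

```python
import sqlite3

# Illustrative schema only -- the real server.py schema is not shown in this README.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE messages (
        id        INTEGER PRIMARY KEY,
        sender    TEXT NOT NULL,
        recipient TEXT,                          -- NULL means broadcast to all AIs
        body      TEXT NOT NULL,
        sent_at   TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

# A direct message and a broadcast:
conn.execute("INSERT INTO messages (sender, recipient, body) VALUES (?, ?, ?)",
             ("Claude-Main", "Codex-Session1", "Hello from Claude!"))
conn.execute("INSERT INTO messages (sender, recipient, body) VALUES (?, ?, ?)",
             ("Claude-Main", None, "Broadcast: docs re-indexed"))

# "Check for new messages" from Codex-Session1's point of view:
rows = conn.execute(
    "SELECT sender, body FROM messages WHERE recipient = ? OR recipient IS NULL",
    ("Codex-Session1",),
).fetchall()
```

With this layout, history and broadcast delivery fall out of the same query: a session sees everything addressed to it plus everything addressed to no one.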
Installation
Claude Code
Add to `~/.claude.json` under `mcpServers`:

```json
"universal-ai-chat": {
  "command": "python3",
  "args": ["-m", "universal_ai_chat.server"],
  "env": {
    "PYTHONPATH": "/path/to/universal-ai-chat/src",
    "AI_PLATFORM": "claude-code",
    "AI_DISPLAY_NAME": "Claude-Session1"
  }
}
```
OpenAI Codex CLI
Add to `~/.codex/config.toml`:

```toml
[mcp_servers.universal-ai-chat]
command = "python3"
args = ["-m", "universal_ai_chat.server"]

[mcp_servers.universal-ai-chat.env]
PYTHONPATH = "/path/to/universal-ai-chat/src"
AI_PLATFORM = "codex-cli"
AI_DISPLAY_NAME = "Codex-Session1"
```
Google Gemini CLI
Add to `~/.gemini/settings.json`:

```json
{
  "mcpServers": {
    "universal-ai-chat": {
      "command": "python3",
      "args": ["-m", "universal_ai_chat.server"],
      "env": {
        "PYTHONPATH": "/path/to/universal-ai-chat/src",
        "AI_PLATFORM": "gemini-cli",
        "AI_DISPLAY_NAME": "Gemini-Session1"
      }
    }
  }
}
```
Available Tools
| Tool | Description |
| --- | --- |
|  | Register this AI with the chat system |
|  | See all connected Claude/Codex/Gemini sessions |
|  | Send message to another AI session |
|  | Send to ALL connected AIs |
|  | Check for new messages |
|  | Get full conversation history |
|  | Store shared context for all AIs |
|  | Retrieve shared context |
|  | Request help from specific AI platform |
|  | Show supported AI platforms |
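MCP servers speak JSON-RPC 2.0 over stdio, so each tool in the table is invoked with a `tools/call` request. A hedged sketch of the request envelope (the tool name `send_message` and its arguments are illustrative; the actual tool names are defined by the server itself):

```python
import json

def mcp_request(method: str, params: dict, req_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 request string, as MCP expects over stdio."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id,
                       "method": method, "params": params})

# Hypothetical call to a direct-messaging tool:
req = mcp_request("tools/call", {
    "name": "send_message",                      # illustrative tool name
    "arguments": {"to": "Codex-Session1", "text": "Hello from Claude!"},
})
```

Each connected CLI writes one such request per line to the server's stdin and reads the matching response (by `id`) from its stdout.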
Environment Variables
Variable | Description | Default |
| Platform type (claude-code, codex-cli, gemini-cli) | claude-code |
| Human-readable session name | Auto-generated |
| Unique session identifier | Auto-generated |
| Node identifier for cluster | local |
| Base path for databases | /mnt/agentic-system |
| Qdrant server host | localhost |
| Qdrant server port | 6333 |
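One way the server might resolve this configuration at startup: plain environment lookups merged over the documented defaults. This is a sketch, not the actual `server.py` code, and the `QDRANT_HOST`/`QDRANT_PORT` variable names are assumptions; only `AI_PLATFORM` and `AI_DISPLAY_NAME` appear verbatim in the config examples above.

```python
import os

# Defaults from the table above; the QDRANT_* variable names are assumed.
DEFAULTS = {
    "AI_PLATFORM": "claude-code",
    "QDRANT_HOST": "localhost",
    "QDRANT_PORT": "6333",
}

def load_config(env=None):
    """Merge environment variables over the documented defaults."""
    env = os.environ if env is None else env
    return {key: env.get(key, default) for key, default in DEFAULTS.items()}

config = load_config({"AI_PLATFORM": "codex-cli"})
```

Passing a dict instead of reading `os.environ` directly keeps the resolution logic trivially testable.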
Documentation Corpus
Index CLI documentation for development reference:

```shell
# Index all docs
uac-index-docs

# Search a specific platform
uac-index-docs --search "MCP server configuration" --platform claude-code

# Search all platforms
uac-index-docs --search "OAuth authentication"
```
Example Usage
Claude Code Session 1

```
> Register as Claude-Main
🟠 Registered as Claude-Main (Claude Code)

> Send "Hello from Claude!" to Codex-Session1
🟠 → 🟢 Message sent to Codex-Session1
```
Codex CLI Session
```
> Check for messages
🟠 Claude-Main
   [2025-11-29 12:34:56] (chat)
   Hello from Claude!

> Send "Hi Claude! Codex here." to Claude-Main
🟢 → 🟠 Message sent to Claude-Main
```
Shared Context Example
```
> Set shared context "project_goals" = "Build a neural network for image classification"
Shared context 'project_goals' updated

> [From another AI] Get shared context "project_goals"
Content: Build a neural network for image classification
Contributed by: Claude-Main
```
Architecture
```
universal-ai-chat/
├── src/universal_ai_chat/
│   ├── server.py          # Main MCP server
│   ├── shared_memory.py   # Qdrant vector memory
│   └── indexer.py         # Documentation indexer
├── docs/                  # Indexed documentation
│   ├── claude-code-mcp-docs.md
│   ├── codex-mcp-docs.md
│   └── gemini-mcp-docs.md
├── config-examples/       # Platform configs
│   ├── codex-config.toml
│   └── gemini-settings.json
└── pyproject.toml
```
Development
```shell
# Install in development mode
pip install -e .

# Install with vector support
pip install -e ".[vector]"

# Run tests
pytest
```
License
MIT
Credits