Ultra MCP
🚀 Ultra MCP - A Model Context Protocol server that exposes OpenAI and Gemini AI models through a single MCP interface for use with Claude Code and Cursor.
Inspiration
This project is inspired by:
- Agent2Agent (A2A) by Google - Thank you Google for pioneering agent-to-agent communication protocols
- Zen MCP - The AI orchestration server that enables Claude to collaborate with multiple AI models
Why Ultra MCP?
While inspired by zen-mcp-server, Ultra MCP offers several key advantages:
🚀 Easier to Use
- No cloning required - Just run `npx ultra-mcp` to get started
- NPM package - Install globally with `npm install -g ultra-mcp`
- Interactive setup - Guided configuration with `npx ultra-mcp config`
- Zero friction - From zero to AI-powered coding in under a minute
📊 Built-in Usage Analytics
- Local SQLite database - All usage data stored locally using libSQL
- Automatic tracking - Every LLM request is tracked with token counts and costs
- Usage statistics - View your AI usage with `npx ultra-mcp db:stats`
- Privacy first - Your data never leaves your machine
🌐 Modern Web Dashboard
- Beautiful UI - React dashboard with Tailwind CSS
- Real-time stats - View usage trends, costs by provider, and model distribution
- Easy access - Just run `npx ultra-mcp dashboard`
- Configuration UI - Manage API keys and model priorities from the web
🔧 Additional Benefits
- Simplified tools - Maximum 4 parameters per tool (vs zen's 10-15)
- Smart defaults - Optimal model selection out of the box
- TypeScript first - Full type safety and better developer experience
- Regular updates - Active development with new features weekly
Features
- 🤖 Multi-Model Support: Integrate OpenAI (O3), Google Gemini (2.5 Pro), and Azure AI models
- 🔌 MCP Protocol: Standard Model Context Protocol interface
- 🧠 Deep Reasoning Tools: Access O3 models for complex problem-solving
- 🔍 Investigation & Research: Built-in tools for thorough investigation and research
- 🌐 Google Search Integration: Gemini 2.5 Pro with real-time web search
- ⚡ Real-time Streaming: Live model responses via Vercel AI SDK
- 🔧 Zero Config: Interactive setup with smart defaults
- 🔑 Secure Configuration: Local API key storage with the `conf` library
- 🧪 TypeScript: Full type safety and modern development experience
Quick Start
Installation
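Run Ultra MCP directly with npx, or install it globally from npm:

```bash
# Run without installing
npx ultra-mcp

# Or install globally
npm install -g ultra-mcp
```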
Configuration
Set up your API keys interactively:
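```bash
npx ultra-mcp config
```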
This will:
- Show current configuration status
- Allow you to set/update API keys for OpenAI, Google Gemini, and Azure
- Store configuration securely on your system
- Auto-load API keys when the server starts
Running the Server
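Based on the quick-start commands above, the server is started with the default `npx ultra-mcp` invocation; once registered, MCP clients such as Claude Code or Cursor launch it for you (stdio transport assumed):

```bash
npx ultra-mcp
```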
CLI Commands
Ultra MCP provides several powerful commands:
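All of them follow the `npx ultra-mcp <command>` pattern used above. The quick reference below is a sketch assembled from the subcommand names in this section; exact flags may differ:

```bash
npx ultra-mcp config      # interactive API key configuration
npx ultra-mcp dashboard   # launch the web dashboard
npx ultra-mcp install     # install Ultra MCP into Claude Code
npx ultra-mcp doctor      # health check and API connection test
npx ultra-mcp chat        # chat with AI models from the terminal
npx ultra-mcp db:show     # database location and basic stats
npx ultra-mcp db:stats    # 30-day usage statistics
npx ultra-mcp db:view     # explore the database with Drizzle Studio
```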
`config` - Interactive Configuration
Configure API keys interactively with a user-friendly menu system.
`dashboard` - Web Dashboard
Launch the web dashboard to view usage statistics, manage configurations, and monitor AI costs.
`install` - Install for Claude Code
Automatically install Ultra MCP as an MCP server for Claude Code.
`doctor` - Health Check
Check installation health and test API connections.
`chat` - Interactive Chat
Chat interactively with AI models from the command line.
Database Commands
`db:show` - Show Database Info
Display database file location and basic statistics.
`db:stats` - Usage Statistics
Show detailed usage statistics for the last 30 days including costs by provider.
`db:view` - Database Viewer
Launch Drizzle Studio to explore the usage database interactively.
Integration with Claude Code
Automatic Installation (Recommended)
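Assuming the `install` subcommand follows the same `npx ultra-mcp <command>` pattern as the rest of the CLI:

```bash
npx ultra-mcp install
```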
This command will:
- Detect Claude Code installation
- Add Ultra MCP as an MCP server
- Configure for user or project scope
- Verify API key configuration
Manual Installation
Add to your Claude Code settings:
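A typical MCP server entry looks like the sketch below; the exact file location and keys depend on your Claude Code version:

```json
{
  "mcpServers": {
    "ultra-mcp": {
      "command": "npx",
      "args": ["-y", "ultra-mcp"]
    }
  }
}
```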
Integration with Cursor
Add to your Cursor MCP settings:
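Cursor uses the same `mcpServers` shape, typically in `.cursor/mcp.json` (again, a sketch rather than a verified config):

```json
{
  "mcpServers": {
    "ultra-mcp": {
      "command": "npx",
      "args": ["-y", "ultra-mcp"]
    }
  }
}
```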
MCP Tools
Ultra MCP provides powerful AI tools accessible through Claude Code and Cursor:
🧠 Deep Reasoning (`deep-reasoning`)
Leverage advanced AI models for complex problem-solving and analysis.
- Default: O3-mini for OpenAI/Azure, Gemini 2.5 Pro with Google Search
- Use Cases: Complex algorithms, architectural decisions, deep analysis
🔍 Investigate (`investigate`)
Thoroughly investigate topics with configurable depth levels.
- Depth Levels: shallow, medium, deep
- Google Search: Enabled by default for Gemini
- Use Cases: Research topics, explore concepts, gather insights
📚 Research (`research`)
Conduct comprehensive research with multiple output formats.
- Output Formats: summary, detailed, academic
- Use Cases: Literature reviews, technology comparisons, documentation
📋 List Models (`list-ai-models`)
View all available AI models and their configuration status.
Example Usage
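Once Ultra MCP is registered in Claude Code or Cursor, the tools are invoked in natural language. The prompts below are illustrative examples, not fixed syntax:

```
Use deep-reasoning to compare two designs for our caching layer.
Use investigate with depth "deep" to research SQLite vs. Postgres for local analytics.
Use research with output format "summary" to survey current MCP server frameworks.
Use list-ai-models to check which providers are configured.
```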
Development
Architecture
Ultra MCP acts as a bridge between multiple AI model providers and MCP clients:
- MCP Protocol Layer: Implements Model Context Protocol for Claude Code/Cursor communication
- Model Providers: Integrates OpenAI, Google (Gemini), and Azure AI via Vercel AI SDK
- Unified Interface: Single MCP interface to access multiple AI models
- Configuration Management: Secure local storage with schema validation
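To make the provider layer concrete, here is a minimal TypeScript sketch of how a request could be routed through the Vercel AI SDK. It is illustrative only, not the project's actual code; the function and variable names are assumptions:

```typescript
import { streamText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";
import { createGoogleGenerativeAI } from "@ai-sdk/google";

// Providers are created once from the locally stored API keys.
const openai = createOpenAI({ apiKey: process.env.OPENAI_API_KEY });
const google = createGoogleGenerativeAI({ apiKey: process.env.GOOGLE_API_KEY });

// A tool handler picks a model for the requested provider and streams the response.
async function runPrompt(provider: "openai" | "gemini", prompt: string): Promise<string> {
  const model = provider === "openai" ? openai("o3-mini") : google("gemini-2.5-pro");
  const result = streamText({ model, prompt });

  let text = "";
  for await (const chunk of result.textStream) {
    text += chunk; // the real server streams these chunks back to the MCP client
  }
  return text;
}
```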
Key Components
- `src/cli.ts` - CLI entry point with commander
- `src/server.ts` - MCP server implementation
- `src/config/` - Configuration management with schema validation
- `src/handlers/` - MCP protocol handlers
- `src/providers/` - Model provider implementations
- `src/utils/` - Shared utilities for streaming and error handling
Configuration Storage
Ultra MCP stores configuration in your system's default config directory:
- macOS: `~/Library/Preferences/ultra-mcp-nodejs/`
- Linux: `~/.config/ultra-mcp/`
- Windows: `%APPDATA%\ultra-mcp-nodejs\`
Environment Variables
You can also set API keys via environment variables:
- `OPENAI_API_KEY`
- `GOOGLE_API_KEY`
- `AZURE_API_KEY`
- `AZURE_ENDPOINT`
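For example (placeholder values; the `doctor` invocation assumes the standard subcommand pattern):

```bash
export OPENAI_API_KEY="sk-..."
export GOOGLE_API_KEY="..."
export AZURE_API_KEY="..."
export AZURE_ENDPOINT="https://<your-resource>.openai.azure.com"

# Verify the keys are picked up
npx ultra-mcp doctor
```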
Note: Configuration file takes precedence over environment variables.
Roadmap
Phase 1: Zero Config Setup
- Interactive mode for seamless first-time setup
- Auto-detection of available API keys
- Smart defaults and configuration recommendations
- One-command installation and setup
Phase 2: Integration Helpers
- Helper commands to integrate Ultra MCP into Claude Code
- Cursor IDE integration utilities
- Auto-generation of MCP server configuration files
- Integration validation and troubleshooting tools
Phase 3: Cost Dashboard & Analytics
- Web UI dashboard using React, shadcn/ui, and Tremor
- SQLite database for usage tracking via Drizzle ORM
- Real-time cost monitoring and budget alerts
- Usage analytics and model performance insights
- Export capabilities for billing and reporting
Phase 4: Workflow Optimization
- Use Ultra MCP to 100x your current LLM coding workflows
- Advanced prompt templates and automation
- Multi-model orchestration and fallback strategies
- Workflow optimization recommendations
- Performance monitoring and optimization tools
Contributing
- Fork the repository
- Create a feature branch: `git checkout -b feature-name`
- Make your changes and add tests
- Run tests: `npm test`
- Commit changes: `git commit -m "Add feature"`
- Push to the branch: `git push origin feature-name`
- Submit a pull request
Testing
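Run the test suite with:

```bash
npm test
```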
License
MIT License - see LICENSE file for details.
Acknowledgments
- Google for the Agent2Agent (A2A) Protocol inspiring agent interoperability
- BeehiveInnovations for Zen MCP demonstrating AI model orchestration
- Anthropic for the Model Context Protocol
- Vercel for the excellent AI SDK
About the Author
👋 Mike Chong - Building tools to amplify human potential through AI.
As one of the earliest users of GitHub Copilot (personally invited by Nat Friedman, former GitHub CEO), I've witnessed firsthand how AI-assisted development can transform the way we build software. My journey as a former engineer on Outlook iOS/Android taught me the importance of creating tools that genuinely improve people's daily lives.
Ultra MCP represents my vision of democratizing access to the best AI models, making cutting-edge AI capabilities accessible to every developer through a unified, simple interface. I believe that by removing barriers between developers and AI models, we can accelerate innovation and create a better world for everyone.
"The future belongs to those who can seamlessly orchestrate human creativity with AI capabilities."
Why Ultra MCP is Different from Zen MCP Server
While both projects aim to enhance AI development workflows, Ultra MCP brings unique advantages:
- Written in TypeScript - Full type safety, better IDE support, and more maintainable codebase compared to Python-based alternatives
- Built-in Usage Analytics - A lightweight SQLite database powered by libSQL automatically tracks LLM usage and costs. Using AI to orchestrate AI without knowing your bill isn't great, IMHO.
These features make Ultra MCP particularly suited for developers who want robust tooling with built-in cost visibility for responsible AI usage.