A generic Model Context Protocol (MCP) framework for building AI-powered applications, providing standardized ways to create MCP servers and clients that integrate LLMs, with support for Ollama and Supabase.
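
For orientation, here is a minimal client sketch using the official `mcp` Python SDK, which illustrates the kind of standardized client wiring such a framework builds on; the server command and the `summarize` tool name are hypothetical placeholders, not this framework's actual API.

```python
# Minimal MCP client sketch using the official Python SDK.
# "example_server.py" and the "summarize" tool are hypothetical placeholders.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(command="python", args=["example_server.py"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()  # discover the server's tools
            print([t.name for t in tools.tools])
            result = await session.call_tool("summarize", arguments={"text": "hello"})
            print(result)

if __name__ == "__main__":
    asyncio.run(main())
```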
A server that enables seamless integration between local Ollama LLM instances and MCP-compatible applications, providing advanced task decomposition, evaluation, and workflow management capabilities.
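
As a rough sketch of the integration pattern (not this project's actual implementation), an MCP tool can forward prompts to a local Ollama instance; the default REST endpoint at `http://localhost:11434` and the model name below are assumptions.

```python
# Sketch: an MCP tool that forwards a prompt to a local Ollama instance.
# Endpoint and model name are assumptions, not this server's real config.
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("ollama-bridge")

@mcp.tool()
def generate(prompt: str, model: str = "llama3.2") -> str:
    """Run a single non-streaming completion against a local Ollama server."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    # Serve over stdio so MCP-compatible applications can connect.
    mcp.run()
```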
Enables AI agents to interact with multiple LLM providers (OpenAI, Anthropic, Google, DeepSeek) through a standardized interface, making it easy to switch between models or use multiple models in the same application.
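
The value of such a standardized interface is that switching providers becomes a one-argument change. The sketch below illustrates the idea with hypothetical function and argument names, using the well-known `openai` and `anthropic` client libraries; it is not this project's actual abstraction.

```python
# Sketch of a provider-agnostic chat call; names are illustrative only.
from openai import OpenAI
import anthropic

def chat(provider: str, model: str, prompt: str) -> str:
    """Send one user message to the chosen provider and return its reply."""
    if provider == "openai":
        client = OpenAI()  # reads OPENAI_API_KEY from the environment
        r = client.chat.completions.create(
            model=model, messages=[{"role": "user", "content": prompt}]
        )
        return r.choices[0].message.content
    if provider == "anthropic":
        client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY
        r = client.messages.create(
            model=model, max_tokens=1024,
            messages=[{"role": "user", "content": prompt}],
        )
        return r.content[0].text
    raise ValueError(f"unsupported provider: {provider}")

# Switching models is then a one-argument change:
# chat("openai", "gpt-4o-mini", "hello")
# chat("anthropic", "claude-3-5-sonnet-latest", "hello")
```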
A Model Context Protocol server that loads multiple OpenAPI specifications and exposes them to LLM-powered IDE integrations, enabling AI to understand and work with your APIs directly in development tools like Cursor.
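
One way such a server can surface an OpenAPI document to an IDE-side LLM is as an MCP resource listing the spec's operations. The sketch below assumes a local `openapi.yaml` file and an illustrative resource URI; it is not this project's actual layout.

```python
# Sketch: load an OpenAPI document and expose its operations as an MCP
# resource. File path, resource URI, and output shape are assumptions.
import json
import yaml
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("openapi-explorer")

with open("openapi.yaml") as f:
    spec = yaml.safe_load(f)

@mcp.resource("openapi://operations")
def list_operations() -> str:
    """Return method, path, and summary for every operation in the spec."""
    ops = []
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            if method.lower() not in {"get", "post", "put", "patch", "delete"}:
                continue
            ops.append({
                "method": method.upper(),
                "path": path,
                "summary": op.get("summary", ""),
            })
    return json.dumps(ops, indent=2)

if __name__ == "__main__":
    mcp.run()
```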
A simple server that acts as a Master Control Program (MCP) for unified interaction with OpenAI and Anthropic (Claude) AI models through a single API endpoint.
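
The single-endpoint idea can be pictured as one HTTP route that dispatches to either provider, mirroring the multi-provider dispatch shown above. The route path and payload shape below are hypothetical, not this server's actual API.

```python
# Sketch: one HTTP endpoint routing chat requests to OpenAI or Anthropic.
# Route path and request fields are illustrative assumptions.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI
import anthropic

app = FastAPI()

class ChatRequest(BaseModel):
    provider: str  # "openai" or "anthropic"
    model: str
    prompt: str

@app.post("/chat")
def chat(req: ChatRequest) -> dict:
    if req.provider == "openai":
        r = OpenAI().chat.completions.create(
            model=req.model,
            messages=[{"role": "user", "content": req.prompt}],
        )
        return {"reply": r.choices[0].message.content}
    r = anthropic.Anthropic().messages.create(
        model=req.model, max_tokens=1024,
        messages=[{"role": "user", "content": req.prompt}],
    )
    return {"reply": r.content[0].text}
```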