Cloudinary MCP Servers
Model Context Protocol (MCP) is a new, standardized protocol for managing context between large language models (LLMs) and external systems. This repository provides comprehensive MCP servers for Cloudinary's media management platform, enabling you to use natural language to upload, transform, analyze, and organize your media assets directly from AI applications like Cursor and Claude.
With these MCP servers, you can seamlessly manage your entire media workflow through conversational AI - from uploading and transforming images and videos, to configuring automated processing pipelines, analyzing content with AI-powered tools, and organizing assets with structured metadata. Whether you're building media-rich applications, managing large asset libraries, or automating content workflows, these servers provide direct access to Cloudinary's full suite of media optimization and management capabilities.
The following MCP servers are available for Cloudinary:
| Server Name | Description | GitHub Repository | Install |
|---|---|---|---|
| Asset Management | Upload, manage, and transform your media assets with advanced search and organization capabilities | @cloudinary/asset-management | |
| Environment Config | Configure and manage your Cloudinary environment settings, upload presets, and transformations | @cloudinary/environment-config | |
| Structured Metadata | Create, manage, and query structured metadata fields for enhanced asset organization and searchability | @cloudinary/structured-metadata | |
| Analysis | Leverage AI-powered content analysis, moderation, and auto-tagging capabilities for your media assets | @cloudinary/analysis | |
| MediaFlows | Build and manage low-code workflow automations for images and videos with AI-powered assistance | MediaFlows MCP | |
Table of Contents
- Documentation
- Configuration
- Using Cloudinary's MCP servers with OpenAI Responses API
- Authentication
- Features by Server
- Need access to more Cloudinary tools?
- Troubleshooting
- Paid Features
- License
Documentation
For detailed guides, tutorials, and comprehensive documentation on using Cloudinary's MCP servers:
- Cloudinary MCP and LLM Tool Documentation - Complete guide to integrating Cloudinary with AI/LLM applications
- MediaFlows MCP Documentation - Setup instructions and guidelines for using the MediaFlows MCP server
Configuration
Quick Install with Cursor Deeplinks
For Cursor users, you can install MCP servers with one click using Cursor deeplinks:
- Asset Management
- Environment Config
- Structured Metadata
- Analysis
- MediaFlows
Note: You'll need to update the environment variables (`CLOUDINARY_CLOUD_NAME`, `CLOUDINARY_API_KEY`, `CLOUDINARY_API_SECRET`) with your actual credentials after installation.
Manual Configuration
You can also run the MCP servers using the individual npm packages. There are several ways to configure authentication:
Option 1: Using individual environment variables
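For illustration, here is a minimal sketch of what such an entry might look like in an MCP client configuration file (for example, Cursor's `mcp.json` or Claude Desktop's `claude_desktop_config.json`). The placeholder values are assumptions; replace them with your own credentials and check your client's documentation for the exact file location and top-level key:

```json
{
  "mcpServers": {
    "cloudinary-asset-management": {
      "command": "npx",
      "args": ["-y", "@cloudinary/asset-management"],
      "env": {
        "CLOUDINARY_CLOUD_NAME": "<your_cloud_name>",
        "CLOUDINARY_API_KEY": "<your_api_key>",
        "CLOUDINARY_API_SECRET": "<your_api_secret>"
      }
    }
  }
}
```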
Option 2: Using command line arguments
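A sketch of the same entry passing credentials as command-line arguments instead. The flag names shown here are illustrative assumptions, not confirmed package options; check the package's help output for the arguments it actually accepts:

```json
{
  "mcpServers": {
    "cloudinary-asset-management": {
      "command": "npx",
      "args": [
        "-y",
        "@cloudinary/asset-management",
        "--cloud-name", "<your_cloud_name>",
        "--api-key", "<your_api_key>",
        "--api-secret", "<your_api_secret>"
      ]
    }
  }
}
```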
Option 3: Using CLOUDINARY_URL environment variable
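And a sketch using the single `CLOUDINARY_URL` variable, which bundles all three credentials in Cloudinary's documented `cloudinary://<api_key>:<api_secret>@<cloud_name>` format:

```json
{
  "mcpServers": {
    "cloudinary-asset-management": {
      "command": "npx",
      "args": ["-y", "@cloudinary/asset-management"],
      "env": {
        "CLOUDINARY_URL": "cloudinary://<your_api_key>:<your_api_secret>@<your_cloud_name>"
      }
    }
  }
}
```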
Apply the same configuration pattern to all other servers by replacing `@cloudinary/asset-management` with the respective package name:
- `@cloudinary/environment-config`
- `@cloudinary/structured-metadata`
- `@cloudinary/analysis`
MediaFlows MCP Server Configuration
For MediaFlows, use the following configuration:
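The exact MediaFlows configuration is given in the MediaFlows MCP documentation linked above. As a rough sketch only: MediaFlows is hosted as a remote MCP server rather than an npm package, so a typical client entry proxies it through a remote-MCP bridge such as `mcp-remote`. The endpoint URL and header name below are placeholders, not confirmed values; take the real ones from the MediaFlows docs:

```json
{
  "mcpServers": {
    "mediaflows": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://<mediaflows-mcp-endpoint>",
        "--header",
        "<auth-header-name>: <your_api_key_or_token>"
      ]
    }
  }
}
```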
Using Cloudinary's MCP servers with OpenAI Responses API
OpenAI's Responses API allows you to integrate MCP servers directly into your OpenAI API calls, enabling AI models to access Cloudinary's media management capabilities in real-time.
Setup Overview
- Install the MCP server: Use Cursor deeplinks or manual configuration
- Configure authentication: Provide your Cloudinary credentials
- Add server to your OpenAI API request: Include MCP server configuration in your API call
- Use Cloudinary tools: The AI model can now call Cloudinary functions during the conversation
Single Server Configuration
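The original request example is not reproduced here, but as a hedged sketch, a Responses API call (a POST to `https://api.openai.com/v1/responses`) that attaches one Cloudinary MCP server could send a body along these lines. The `type`, `server_label`, `server_url`, `headers`, and `require_approval` fields follow OpenAI's documented remote MCP tool schema; the Cloudinary server URL and header shown are placeholders to be replaced with values from your own setup or Cloudinary's MCP documentation:

```json
{
  "model": "gpt-4.1",
  "input": "List my five most recently uploaded images.",
  "tools": [
    {
      "type": "mcp",
      "server_label": "cloudinary-asset-management",
      "server_url": "https://<your-cloudinary-mcp-endpoint>",
      "headers": {
        "Authorization": "Bearer <your_token_or_credentials>"
      },
      "require_approval": "never"
    }
  ]
}
```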
Multiple Server Configuration
You can use multiple Cloudinary MCP servers in a single API call:
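Sketched the same way, using multiple servers simply means adding more entries to the `tools` array (the URLs are again placeholders):

```json
{
  "model": "gpt-4.1",
  "input": "Tag my newest uploads and fill in their structured metadata.",
  "tools": [
    {
      "type": "mcp",
      "server_label": "cloudinary-asset-management",
      "server_url": "https://<your-asset-management-mcp-endpoint>",
      "require_approval": "never"
    },
    {
      "type": "mcp",
      "server_label": "cloudinary-structured-metadata",
      "server_url": "https://<your-structured-metadata-mcp-endpoint>",
      "require_approval": "never"
    }
  ]
}
```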
Authentication
When running MCP servers locally, authentication can be configured in several ways:
- Option 1 (recommended): individual environment variables. Set `CLOUDINARY_CLOUD_NAME`, `CLOUDINARY_API_KEY`, and `CLOUDINARY_API_SECRET` in the server's environment.
- Option 2: the `CLOUDINARY_URL` environment variable, a single value in the form `cloudinary://<api_key>:<api_secret>@<cloud_name>`.
- Option 3: command line arguments. Pass credentials directly as arguments (see the configuration examples above).
You can find your Cloudinary credentials in your Cloudinary Console Dashboard under Settings > Security.
Features by Server
Asset Management Server
- Upload and manage media assets (images, videos, raw files)
- Search and organize assets with advanced filtering capabilities
- Handle asset operations and transformations
- Manage folders, tags, and asset relationships
- Generate archives and download links
Environment Config Server
- Configure upload presets and transformation settings
- Manage streaming profiles and webhook notifications
- Set up upload mappings
Structured Metadata Server
- Create and manage structured metadata fields
- Configure conditional metadata rules and validation
- Organize and search metadata configurations
- Handle metadata field relationships and ordering
Analysis Server
- AI-powered content analysis including tagging, moderation, and captioning
- Object detection and recognition with multiple AI models
- Image quality analysis and watermark detection
- Content moderation and safety analysis
- Fashion, text, and anatomy detection capabilities
MediaFlows Server
- Build and manage workflow automations using natural language
- Query existing PowerFlow automations in your environment
- Create conditional logic based on metadata, tags, and asset properties
- Automate asset moderation, approval, and notification workflows
- Debug and understand existing automation configurations
Need access to more Cloudinary tools?
We're continuing to add more functionality to these MCP servers. If you'd like to leave feedback, file a bug, or request a feature, please open an issue on this repository.
Troubleshooting
"Claude's response was interrupted..."
If you see this message, Claude likely hit its context-length limit and stopped mid-reply. This happens most often with servers that trigger many chained tool calls, such as the Asset Management server when listing large numbers of assets.
To reduce the chance of running into this issue:
- Be specific and keep your queries concise.
- If a single request would call multiple tools, break it into several smaller requests so the tool responses stay short.
- Use filtering parameters to limit the scope of asset searches and listings.
Authentication Issues
Ensure your Cloudinary credentials are correctly configured and have the necessary permissions for the operations you're trying to perform.
Paid Features
Some features may require a paid Cloudinary plan. Ensure your Cloudinary account has the necessary subscription level for the features you intend to use, such as:
- Advanced AI analysis features
- High-volume API usage
- Custom metadata fields
- Advanced transformation capabilities
License
Licensed under the MIT License. See LICENSE file for details.
AsecurityAlicenseAqualityThe Hostinger MCP server enables seamless integration of Hostinger’s API with AI tools. This server exposes Hostinger API endpoints as callable tools, allowing AI models to fetch live data or perform real-time actions on hosting infrastructure.Last updated -6913217JavaScriptMIT License- -securityFlicense-qualityA universal, production-ready MCP server that provides AI assistants like Claude with direct access to Confluence Cloud functionality for creating, reading, updating, and managing content through multiple transport protocols.Last updated -1Python