Cloudinary MCP Servers

Model Context Protocol (MCP) is a new, standardized protocol for managing context between large language models (LLMs) and external systems. This repository provides comprehensive MCP servers for Cloudinary's media management platform, enabling you to use natural language to upload, transform, analyze, and organize your media assets directly from AI applications like Cursor and Claude.

With these MCP servers, you can seamlessly manage your entire media workflow through conversational AI - from uploading and transforming images and videos, to configuring automated processing pipelines, analyzing content with AI-powered tools, and organizing assets with structured metadata. Whether you're building media-rich applications, managing large asset libraries, or automating content workflows, these servers provide direct access to Cloudinary's full suite of media optimization and management capabilities.

The following MCP servers are available for Cloudinary:

  • Asset Management (@cloudinary/asset-management): Upload, manage, and transform your media assets with advanced search and organization capabilities.
  • Environment Config (@cloudinary/environment-config): Configure and manage your Cloudinary environment settings, upload presets, and transformations.
  • Structured Metadata (@cloudinary/structured-metadata): Create, manage, and query structured metadata fields for enhanced asset organization and searchability.
  • Analysis (@cloudinary/analysis): Leverage AI-powered content analysis, moderation, and auto-tagging capabilities for your media assets.
  • MediaFlows (MediaFlows MCP): Build and manage low-code workflow automations for images and videos with AI-powered assistance.

Documentation

For detailed guides, tutorials, and comprehensive documentation on using Cloudinary's MCP servers, see the Cloudinary documentation.

Configuration

If you use Cursor, you can install each MCP server with one click using Cursor deeplinks:

  • Asset Management
  • Environment Config
  • Structured Metadata
  • Analysis
  • MediaFlows

Note: You'll need to update the environment variables (CLOUDINARY_CLOUD_NAME, CLOUDINARY_API_KEY, CLOUDINARY_API_SECRET) with your actual credentials after installation.

Manual Configuration

You can also run the MCP servers using the individual npm packages. There are several ways to configure authentication:

Option 1: Using individual environment variables
{ "mcpServers": { "cloudinary-asset-mgmt": { "command": "npx", "args": ["-y", "--package", "@cloudinary/asset-management", "--", "mcp", "start"], "env": { "CLOUDINARY_CLOUD_NAME": "cloud_name", "CLOUDINARY_API_KEY": "api_key", "CLOUDINARY_API_SECRET": "api_secret" } } } }
Option 2: Using command line arguments
{ "mcpServers": { "cloudinary-asset-mgmt": { "command": "npx", "args": [ "-y", "--package", "@cloudinary/asset-management", "--", "mcp", "start", "--cloud-name", "cloud_name", "--api-key", "api_key", "--api-secret", "api_secret" ] } } }
Option 3: Using CLOUDINARY_URL environment variable
{ "mcpServers": { "cloudinary-asset-mgmt": { "command": "npx", "args": ["-y", "--package", "@cloudinary/asset-management", "--", "mcp", "start"], "env": { "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name" } } } }

Apply the same configuration pattern to all other servers by replacing @cloudinary/asset-management with the respective package names:

  • @cloudinary/environment-config
  • @cloudinary/structured-metadata
  • @cloudinary/analysis
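
For example, a single configuration that runs the asset management and analysis servers side by side could look like the following (the mcpServers keys are arbitrary labels, and the credentials are placeholders):

{
  "mcpServers": {
    "cloudinary-asset-mgmt": {
      "command": "npx",
      "args": ["-y", "--package", "@cloudinary/asset-management", "--", "mcp", "start"],
      "env": { "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name" }
    },
    "cloudinary-analysis": {
      "command": "npx",
      "args": ["-y", "--package", "@cloudinary/analysis", "--", "mcp", "start"],
      "env": { "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name" }
    }
  }
}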
MediaFlows MCP Server Configuration

For MediaFlows, use the following configuration:

{ "mcpServers": { "mediaflows": { "url": "https://mediaflows.mcp.cloudinary.com/v2/mcp", "headers": { "cld-cloud-name": "cloud_name", "cld-api-key": "api_key", "cld-secret": "api_secret" } } } }

Using Cloudinary's MCP servers with OpenAI Responses API

OpenAI's Responses API allows you to integrate MCP servers directly into your OpenAI API calls, enabling AI models to access Cloudinary's media management capabilities in real-time.

Setup Overview

  1. Install the MCP server: Use Cursor deeplinks or manual configuration
  2. Configure authentication: Provide your Cloudinary credentials
  3. Add server to your OpenAI API request: Include MCP server configuration in your API call
  4. Use Cloudinary tools: The AI model can now call Cloudinary functions during the conversation

Single Server Configuration

const response = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
    { role: "user", content: "Analyze the content of my uploaded images" }
  ],
  tools: [
    {
      type: "mcp_server",
      mcp_server: {
        name: "cloudinary-analysis",
        command: "npx",
        args: ["-y", "--package", "@cloudinary/analysis", "--", "mcp", "start"],
        env: { "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name" }
      }
    }
  ]
});

Multiple Server Configuration

You can use multiple Cloudinary MCP servers in a single API call:

const response = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
    {
      role: "user",
      content: "Upload these images, analyze their content, and create structured metadata"
    }
  ],
  tools: [
    {
      type: "mcp_server",
      mcp_server: {
        name: "cloudinary-asset-mgmt",
        command: "npx",
        args: ["-y", "--package", "@cloudinary/asset-management", "--", "mcp", "start"],
        env: { "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name" }
      }
    },
    {
      type: "mcp_server",
      mcp_server: {
        name: "cloudinary-analysis",
        command: "npx",
        args: ["-y", "--package", "@cloudinary/analysis", "--", "mcp", "start"],
        env: { "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name" }
      }
    },
    {
      type: "mcp_server",
      mcp_server: {
        name: "cloudinary-smd",
        command: "npx",
        args: ["-y", "--package", "@cloudinary/structured-metadata", "--", "mcp", "start"],
        env: { "CLOUDINARY_URL": "cloudinary://api_key:api_secret@cloud_name" }
      }
    }
  ]
});

Authentication

When running MCP servers locally, authentication can be configured in several ways:

Option 1: Individual environment variables

export CLOUDINARY_CLOUD_NAME="cloud_name"
export CLOUDINARY_API_KEY="api_key"
export CLOUDINARY_API_SECRET="api_secret"

Option 2: CLOUDINARY_URL environment variable

export CLOUDINARY_URL="cloudinary://api_key:api_secret@cloud_name"
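
With CLOUDINARY_URL exported, the same npx invocation used in the JSON configurations above can be run directly from the shell and will pick the credentials up from the environment:

npx -y --package @cloudinary/asset-management -- mcp start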

Option 3: Command line arguments

Pass credentials directly as arguments (see configuration examples above)
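
For example, to start the asset management server with credentials passed as flags (mirroring the args shown in Option 2 of the manual configuration above; values are placeholders):

npx -y --package @cloudinary/asset-management -- mcp start \
  --cloud-name cloud_name \
  --api-key api_key \
  --api-secret api_secret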

You can find your Cloudinary credentials in your Cloudinary Console Dashboard under Settings > Security.
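
To sanity-check a local setup end to end, you can also connect to one of the servers with the MCP TypeScript SDK and list the tools it exposes. This is a minimal sketch, assuming the @modelcontextprotocol/sdk client package (its import paths and option names are not Cloudinary-specific) and placeholder credentials:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the asset management server the same way the JSON configurations do.
// The parent environment is passed through so npx can resolve PATH, with the
// Cloudinary credentials added on top (placeholder values here).
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "--package", "@cloudinary/asset-management", "--", "mcp", "start"],
  env: {
    ...(process.env as Record<string, string>),
    CLOUDINARY_URL: "cloudinary://api_key:api_secret@cloud_name",
  },
});

const client = new Client({ name: "cloudinary-mcp-check", version: "1.0.0" });
await client.connect(transport);

// If the credentials are configured correctly, this prints the names of the
// tools the server exposes.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));
await client.close();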

Features by Server

Asset Management Server

  • Upload and manage media assets (images, videos, raw files)
  • Search and organize assets with advanced filtering capabilities
  • Handle asset operations and transformations
  • Manage folders, tags, and asset relationships
  • Generate archives and download links

Environment Config Server

  • Configure upload presets and transformation settings
  • Manage streaming profiles and webhook notifications
  • Set up upload mappings

Structured Metadata Server

  • Create and manage structured metadata fields
  • Configure conditional metadata rules and validation
  • Organize and search metadata configurations
  • Handle metadata field relationships and ordering

Analysis Server

  • AI-powered content analysis including tagging, moderation, and captioning
  • Object detection and recognition with multiple AI models
  • Image quality analysis and watermark detection
  • Content moderation and safety analysis
  • Fashion, text, and anatomy detection capabilities

MediaFlows Server

  • Build and manage workflow automations using natural language
  • Query existing PowerFlow automations in your environment
  • Create conditional logic based on metadata, tags, and asset properties
  • Automate asset moderation, approval, and notification workflows
  • Debug and understand existing automation configurations

Need access to more Cloudinary tools?

We're continuing to add more functionality to these MCP servers. If you'd like to leave feedback, file a bug, or request a feature, please open an issue on this repository.

Troubleshooting

"Claude's response was interrupted..."

If you see this message, Claude likely hit its context-length limit and stopped mid-reply. This happens most often with servers that trigger many chained tool calls, such as the asset management server when it returns large asset listings.

To reduce the chance of running into this issue:

  • Be specific and keep your queries concise.
  • If a single request calls multiple tools, break it into several smaller requests to keep each response short.
  • Use filtering parameters to limit the scope of asset searches and listings.

Authentication Issues

Ensure your Cloudinary credentials are correctly configured and have the necessary permissions for the operations you're trying to perform.

Some features may require a paid Cloudinary plan. Ensure your Cloudinary account has the necessary subscription level for the features you intend to use, such as:

  • Advanced AI analysis features
  • High-volume API usage
  • Custom metadata fields
  • Advanced transformation capabilities

License

Licensed under the MIT License. See LICENSE file for details.

Related MCP Servers

  • A
    security
    A
    license
    A
    quality
    A Model Context Protocol server that exposes Cloudinary Upload & Admin API methods as tools by AI assistants. This integration allows AI systems to trigger and interact with your Cloudinary cloud.
    Last updated -
    5
    506
    JavaScript
    MIT License
  • A
    security
    F
    license
    A
    quality
    An MCP server implementation that enables AI assistants to interact with and manage Sakura Cloud infrastructure, including servers, disks, networks, and containerized applications.
    Last updated -
    46
    2
    JavaScript
    • Apple
    • Linux
  • A
    security
    A
    license
    A
    quality
    The Hostinger MCP server enables seamless integration of Hostinger’s API with AI tools. This server exposes Hostinger API endpoints as callable tools, allowing AI models to fetch live data or perform real-time actions on hosting infrastructure.
    Last updated -
    69
    132
    17
    JavaScript
    MIT License
    • Apple
    • Linux
  • -
    security
    F
    license
    -
    quality
    A universal, production-ready MCP server that provides AI assistants like Claude with direct access to Confluence Cloud functionality for creating, reading, updating, and managing content through multiple transport protocols.
    Last updated -
    1
    Python
    • Linux
    • Apple

View all related MCP servers

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/cloudinary/mcp-servers'

If you have feedback or need assistance with the MCP directory API, please join our Discord server