
Optimizely DXP MCP Server

by JaxonDigital

db_export

Export databases from Optimizely DXP environments for backup or migration purposes. Supports automatic monitoring and downloading of exports when complete.

Instructions

šŸ’¾ Start database export from specified environment. ASYNC: 10-60min depending on database size. Set autoMonitor=true to automatically poll status every 30s. Set autoDownload=true to automatically download when export completes. Returns exportId for tracking. Required: environment, database (epicms or epicommerce). Use db_export_status() to check progress. Agent workflow: start export → monitor status → download when complete.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| environment | No | Environment to export from: prod/production, staging/preproduction, int/integration | auto-select based on permissions |
| database | No | Database name: epicms or epicommerce | epicms |
| previewOnly | No | Preview export without executing - shows what would happen, includes capability check | |
| forceNew | No | Force new export - skip existing local backup check | |
| useExisting | No | Use existing local backup if available (returns immediately) | |
| autoDownload | No | Automatically download export when complete | |
| monitor | No | Automatically monitor export progress until complete (polls every 30s) | |
| downloadPath | No | Directory to save downloaded export | configured download path |
| background | No | Download in background vs wait for completion | true |
| skipConfirmation | No | Skip download confirmation prompts | |
| retentionHours | No | How long Azure retains export in hours | 168 (7 days) |
| project | No | Project name | current project from environment |
| projectName | No | Alternative to project parameter | |
| databaseName | No | Legacy: use database parameter instead | |
| projectId | No | Project UUID (if providing inline credentials) | |
| apiKey | No | API key (if providing inline credentials) | |
| apiSecret | No | API secret (if providing inline credentials) | |
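For orientation, a typical set of call arguments can be sketched as follows. The `DbExportArgs` interface and the literal values below are illustrative only; the field names mirror the input schema above, but the interface itself is defined here just for the sketch.

```typescript
// Illustrative argument object for a db_export call; field names mirror the
// input schema above, but the interface is local to this sketch.
interface DbExportArgs {
    environment?: string;
    database?: string;
    monitor?: boolean;
    autoDownload?: boolean;
    retentionHours?: number;
    downloadPath?: string;
}

const args: DbExportArgs = {
    environment: "Production",
    database: "epicms",      // or "epicommerce"
    monitor: true,           // poll export status every 30s
    autoDownload: true,      // download automatically when the export completes
    retentionHours: 168,     // Azure retains the export for 7 days
};

console.log(JSON.stringify(args));
```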

Implementation Reference

  • Primary handler function executing the db_export tool. Initiates database exports via DXP REST API, handles preview mode, existing backups check, background monitoring, and structured responses with exportId.
    static async handleExportDatabase(args: ExportDatabaseArgs): Promise<any> {
        // DXP-81: Support new 'database' parameter (replaces 'databaseName')
        const databaseName = args.database || args.databaseName;
    
        if (!args.apiKey || !args.apiSecret || !args.projectId) {
            return ResponseBuilder.invalidParams('Missing required parameters: apiKey, apiSecret, projectId');
        }
    
        // DXP-81: Preview mode with capability check
        if (args.previewOnly) {
            return this.handleCheckCapabilities(args);
        }
    
        const projectConfig: ProjectConfig = {
            name: args.projectName || 'Unknown',
            projectId: args.projectId,
            apiKey: args.apiKey,
            apiSecret: args.apiSecret
        };
    
        try {
            // DXP-183 Bug #3: Check for existing backups if useExisting is true
            // Track if we checked for existing backups but found none
            let useExistingChecked = false;
            let useExistingSearchPath = '';
    
            if (args.useExisting) {
                // Use provided downloadPath or default to current directory
                const searchPath = args.downloadPath || process.cwd();
                useExistingSearchPath = searchPath;
                useExistingChecked = true;
    
                const existingBackups = await this.checkForExistingBackups(
                    searchPath,
                    projectConfig.name,
                    args.environment || 'Production',
                    databaseName || 'epicms'
                );
    
                if (existingBackups.length > 0) {
                    const backup = existingBackups[0]; // Most recent
                    return ResponseBuilder.success(
                        `āœ… Found existing backup (useExisting=true):\n\n` +
                        `šŸ“¦ **File**: ${backup.fileName}\n` +
                        `šŸ“ **Location**: ${backup.filePath}\n` +
                        `šŸ“Š **Size**: ${this.formatBytes(backup.fileSize)}\n` +
                        `ā±ļø  **Age**: ${backup.formattedAge}\n\n` +
                        `šŸ’” Use this file path for database restore operations.`
                    );
                } else {
                    // No existing backup found - will inform user in response
                    OutputLogger.info(`āš ļø  No existing backup found in ${searchPath}. Starting new export...`);
                }
            }
    
            // Start export
            const result = await this.internalStartExport(args);
    
            // Check if result is structured response
            if (result && typeof result === 'object' && 'data' in result && 'message' in result) {
                // DXP-183 Bug #3: Add useExisting feedback to message
                if (useExistingChecked) {
                    const useExistingNote = `\n\nšŸ“‹ **Note**: useExisting=true was specified but no local backup was found in \`${useExistingSearchPath}\`.\n` +
                                           `Starting new export. Future calls with useExisting=true will find this backup after download completes.`;
                    result.message = result.message + useExistingNote;
                }
    
                // If autoMonitor is enabled, start background monitoring
                if (args.autoMonitor && result.data && result.data.exportId) {
                    this.startBackgroundMonitoring(
                        result.data.exportId,
                        projectConfig,
                        result.data.environment || args.environment || 'Production',
                        result.data.databaseName || databaseName || 'epicms',
                        args.downloadPath
                    );
    
                    // Update message to indicate monitoring started
                    result.message = result.message + '\n\nāœ… Background monitoring started. Use check_export_status to view progress.';
                }
    
                return ResponseBuilder.successWithStructuredData(result.data, result.message);
            }
    
            return ResponseBuilder.success(result);
        } catch (error: any) {
            console.error('Export database error:', error);
            return ResponseBuilder.internalError('Failed to export database', error.message);
        }
    }
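The first guard in the handler rejects calls missing inline credentials. To make that behavior concrete in isolation, the sketch below mirrors the check as a stand-alone function; `validateCredentials` is a hypothetical name for this illustration, not a function exported by the server.

```typescript
// Hypothetical stand-alone mirror of the handler's first guard: all three
// inline credentials must be present before an export is attempted.
function validateCredentials(args: {
    apiKey?: string;
    apiSecret?: string;
    projectId?: string;
}): string | null {
    if (!args.apiKey || !args.apiSecret || !args.projectId) {
        return "Missing required parameters: apiKey, apiSecret, projectId";
    }
    return null; // credentials present
}

console.log(validateCredentials({ apiKey: "key" }));
// → "Missing required parameters: apiKey, apiSecret, projectId"
console.log(validateCredentials({ apiKey: "key", apiSecret: "secret", projectId: "proj" }));
// → null
```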
  • TypeScript interface defining input parameters for the db_export tool, including optional API credentials, project details, environment, database selection, download options, preview mode, and monitoring flags.
    interface ExportDatabaseArgs {
        apiKey?: string;
        apiSecret?: string;
        projectId?: string;
        projectName?: string;
        environment?: string;
        databaseName?: string;
        database?: string; // DXP-81: New parameter replacing databaseName
        downloadPath?: string;
        retentionHours?: number;
        useExisting?: boolean;
        autoMonitor?: boolean;
        autoDownload?: boolean;
        skipConfirmation?: boolean;
        previewOnly?: boolean;
        waitBeforeCheck?: number;
        monitor?: boolean;
        incremental?: boolean;
    }
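Because the DXP-81 migration leaves two fields that can name the database, the precedence the handlers apply is worth spelling out. This sketch isolates that fallback chain; `resolveDatabaseName` is not a real method name in the codebase.

```typescript
// Sketch of the DXP-81 fallback chain used in the handlers: the new `database`
// field wins over the legacy `databaseName`, with 'epicms' as the final default.
function resolveDatabaseName(args: { database?: string; databaseName?: string }): string {
    return args.database || args.databaseName || "epicms";
}

console.log(resolveDatabaseName({ database: "epicms", databaseName: "epicommerce" })); // "epicms"
console.log(resolveDatabaseName({ databaseName: "epicommerce" })); // "epicommerce"
console.log(resolveDatabaseName({})); // "epicms"
```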
  • Core helper method that performs the actual DXP API call to start the database export, parses response, manages state, and handles queuing for concurrent exports.
    static async internalStartExport(args: ExportDatabaseArgs): Promise<StatusResult> {
        const databaseName = args.database || args.databaseName || 'epicms';
        const environment = args.environment || 'Production';
        const retentionHours = args.retentionHours || 24;
    
        const projectConfig: ProjectConfig = {
            name: args.projectName || 'Unknown',
            projectId: args.projectId!,
            apiKey: args.apiKey!,
            apiSecret: args.apiSecret!
        };
    
        // Check for active exports that might conflict
        const activeExport = await this.detectAndOfferRecovery(projectConfig);
        if (activeExport) {
            // There's an active export - check if it's for the same database
            const isSameDatabase = activeExport.environment === environment &&
                                    activeExport.databaseName === databaseName;
    
            if (isSameDatabase) {
                // Same database - offer to resume monitoring
                return {
                    data: {
                        status: 'InProgress',
                        exportId: activeExport.exportId,
                        environment: activeExport.environment,
                        databaseName: activeExport.databaseName,
                        message: 'Resuming existing export'
                    },
                    message: `Found existing export in progress for ${environment} ${databaseName}.\n` +
                             `Export ID: ${activeExport.exportId}\n\n` +
                             `Use check_export_status with this exportId to monitor progress.`
                };
            } else {
                // Different database - resolve conflict
                const resolution = await this.resolveExportConflict(
                    environment,
                    databaseName,
                    projectConfig,
                    args.downloadPath
                );
    
                if (resolution.action === 'queue') {
                    return {
                        data: {
                            status: 'Queued',
                            exportId: resolution.queuedExportId || 'pending',
                            environment,
                            databaseName,
                            message: 'Export queued - will start when current export completes'
                        },
                        message: resolution.message || 'Export queued behind active export'
                    };
                } else if (resolution.action === 'cancel') {
                    return {
                        data: {
                            status: 'Cancelled',
                            environment,
                            databaseName,
                            message: 'Export cancelled by user'
                        },
                        message: 'Export request cancelled'
                    };
                }
            }
        }
    
        // Start new export via REST API (DXP-101: No PowerShell)
        try {
            const result = await DXPRestClient.startDatabaseExport(
                projectConfig.projectId,
                projectConfig.apiKey!,
                projectConfig.apiSecret!,
                environment,
                databaseName,
                retentionHours
            );
    
            // Parse export ID from result
            const exportId = this.extractExportId(result);
    
            // Save export state
            const exportInfo: ExportInfo = {
                exportId,
                projectId: projectConfig.projectId,
                projectName: projectConfig.name,
                environment,
                databaseName,
                status: 'InProgress',
                startedAt: new Date().toISOString(),
                downloadPath: args.downloadPath,
                autoMonitor: args.autoMonitor,
                autoDownload: args.autoDownload,
                incremental: args.incremental
            };
    
            await this.saveCurrentExportState(exportInfo);
    
            // DXP-155: Emit export started event
            try {
                ExportResourceHandler.emitStarted(exportId, {
                    project: projectConfig.name,
                    environment,
                    databaseName,
                    retentionHours
                });
            } catch (eventError: any) {
                console.error(`Failed to emit export started event: ${eventError.message}`);
                // Don't fail the operation if event emission fails
            }
    
            return {
                data: {
                    status: 'InProgress',
                    exportId,
                    environment,
                    databaseName,
                    message: 'Export started successfully'
                },
                message: `Database export started successfully\n` +
                         `Export ID: ${exportId}\n` +
                         `Environment: ${environment}\n` +
                         `Database: ${databaseName}\n` +
                         `Retention: ${retentionHours} hours\n\n` +
                         `Use check_export_status to monitor progress.`
            };
        } catch (error: any) {
            throw new Error(`Failed to start export: ${error.message}`);
        }
    }
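Once `internalStartExport` returns an exportId, a caller following the documented 30-second cadence could poll along these lines. The `checkExportStatus` callback is an injected stand-in for the real db_export_status tool, and the demo stub (with a shortened interval) is purely illustrative.

```typescript
// Generic polling loop at the documented 30s cadence; the status checker is
// injected so this sketch stays independent of the real db_export_status tool.
type ExportStatus = "InProgress" | "Succeeded" | "Failed";

async function waitForExport(
    checkExportStatus: (exportId: string) => Promise<ExportStatus>,
    exportId: string,
    intervalMs = 30_000,
): Promise<ExportStatus> {
    for (;;) {
        const status = await checkExportStatus(exportId);
        if (status !== "InProgress") return status;
        await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
}

// Demo: a stub that reports InProgress once, then Succeeded (interval shortened).
let polls = 0;
const stubStatus = async (_id: string): Promise<ExportStatus> =>
    ++polls < 2 ? "InProgress" : "Succeeded";

waitForExport(stubStatus, "export-123", 10).then((status) => console.log(status));
// logs "Succeeded" after one short poll cycle
```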
  • Default export of the DatabaseSimpleTools class, making its static handler methods available for registration in the MCP tools registry.
    export default DatabaseSimpleTools;
  • Central tools index re-exports DatabaseSimpleTools, facilitating its import and registration as MCP tools (db_export handler) in the main server.
    import DatabaseSimpleTools from './database-simple-tools';
    
    export {
        DeploymentTools,
        StorageTools,
        ContentTools,
        SimpleTools,
        DatabaseSimpleTools
    };

Behavior: 5/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations provided, the description carries the full burden of behavioral disclosure and excels at this. It clearly describes the async nature (10-60min duration), monitoring behavior (polls every 30s), auto-download capability, return value (exportId), and the complete workflow. This provides comprehensive behavioral context beyond what parameters alone would indicate.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 4/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is efficiently structured with emoji, clear sections, and front-loaded critical information. Every sentence adds value: async timing, monitoring behavior, return value, requirements, and workflow. It could be slightly more concise by combining some workflow details, but overall it's well-organized and information-dense.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 5/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

For a complex tool with 17 parameters, no annotations, and no output schema, the description provides excellent contextual completeness. It covers the async nature, timing estimates, monitoring behavior, return value, required parameters, database options, companion tools, and the complete agent workflow. This gives the agent sufficient understanding despite the parameter complexity.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 4/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

With 100% schema description coverage, the baseline is 3, but the description adds meaningful context about key parameters: it explains the purpose of autoMonitor and autoDownload parameters, clarifies the required environment and database parameters, and provides workflow context that helps understand parameter interactions. However, it doesn't cover all 17 parameters individually.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the specific action ('Start database export') with the target resource ('from specified environment'), distinguishing it from sibling tools like db_export_status and db_export_download. It explicitly mentions the required parameters (environment, database) and the async nature of the operation.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 5/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description provides explicit guidance on when to use this tool vs alternatives: it names the companion tool db_export_status() for checking progress and outlines the complete agent workflow (start export → monitor status → download). It also specifies required parameters and database options.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
