aws_s3

Manage AWS S3 buckets and objects directly from Ansible. Perform actions like creating, deleting, listing buckets, and uploading or downloading objects with specified configurations.

Instructions

Manage AWS S3 buckets and objects

Input Schema

Name          Required  Default  Description
acl           No        (none)   ACL applied when creating a bucket or uploading an object
action        Yes       (none)   Operation to perform: list_buckets, create_bucket, delete_bucket, list_objects, upload, or download
bucket        No        (none)   Name of the S3 bucket to operate on
contentType   No        (none)   Content type set on uploaded objects
localPath     No        (none)   Local file path (source for upload, destination for download)
metadata      No        (none)   Key/value metadata attached to uploaded objects
objectKey     No        (none)   Key of the object to upload or download
region        Yes       (none)   AWS region in which to perform the operation
tags          No        (none)   Key/value tags applied to the bucket or uploaded object

Input Schema (JSON Schema)

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "additionalProperties": false,
  "properties": {
    "acl": { "type": "string" },
    "action": {
      "enum": ["list_buckets", "create_bucket", "delete_bucket", "list_objects", "upload", "download"],
      "type": "string"
    },
    "bucket": { "type": "string" },
    "contentType": { "type": "string" },
    "localPath": { "type": "string" },
    "metadata": {
      "additionalProperties": { "type": "string" },
      "type": "object"
    },
    "objectKey": { "type": "string" },
    "region": { "minLength": 1, "type": "string" },
    "tags": {
      "additionalProperties": { "type": "string" },
      "type": "object"
    }
  },
  "required": ["action", "region"],
  "type": "object"
}
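
For example, a set of arguments for an upload call that satisfies this schema could look like the following. The bucket name, object key, and file path are placeholder values, and only action and region are required:

    // Placeholder values; only action and region are required by the schema.
    const uploadArgs = {
      action: 'upload',
      region: 'us-east-1',
      bucket: 'my-example-bucket',
      objectKey: 'backups/site.tar.gz',
      localPath: '/tmp/site.tar.gz',
      acl: 'private',
      tags: { environment: 'dev' },
      contentType: 'application/gzip'
    };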

Implementation Reference

  • Main handler function for the aws_s3 tool. Generates dynamic Ansible playbooks for S3 operations (list_buckets, create_bucket, delete_bucket, list_objects, upload, download) and executes them using executeAwsPlaybook. (A hypothetical sketch of the formatYamlParams helper referenced here follows this list.)
    export async function s3Operations(args: S3Options): Promise<string> {
      await verifyAwsCredentials();

      const { action, region, bucket, objectKey, localPath, acl, tags, metadata, contentType } = args;

      let playbookContent = `---
    - name: AWS S3 ${action} operation
      hosts: localhost
      connection: local
      gather_facts: no
      tasks:`;

      switch (action) {
        case 'list_buckets':
          playbookContent += `
        - name: List S3 buckets
          amazon.aws.s3_bucket_info:
            region: "${region}"
          register: s3_buckets

        - name: Display buckets
          debug:
            var: s3_buckets.buckets`;
          break;

        case 'create_bucket':
          playbookContent += `
        - name: Create S3 bucket
          amazon.aws.s3_bucket:
            region: "${region}"
            name: "${bucket}"
            state: present
    ${formatYamlParams({ tags, acl })}
          register: s3_create

        - name: Display creation result
          debug:
            var: s3_create`;
          break;

        case 'delete_bucket':
          playbookContent += `
        - name: Delete S3 bucket
          amazon.aws.s3_bucket:
            region: "${region}"
            name: "${bucket}"
            state: absent
            force: true
          register: s3_delete

        - name: Display deletion result
          debug:
            var: s3_delete`;
          break;

        case 'list_objects':
          playbookContent += `
        - name: List S3 objects
          amazon.aws.s3_object:
            region: "${region}"
            bucket: "${bucket}"
            mode: list
          register: s3_objects

        - name: Display objects
          debug:
            var: s3_objects.keys`;
          break;

        case 'upload':
          playbookContent += `
        - name: Upload file to S3
          amazon.aws.s3_object:
            region: "${region}"
            bucket: "${bucket}"
            object: "${objectKey}"
            src: "${localPath}"
            mode: put
    ${formatYamlParams({ acl, tags, metadata, content_type: contentType })}
          register: s3_upload

        - name: Display upload result
          debug:
            var: s3_upload`;
          break;

        case 'download':
          playbookContent += `
        - name: Download file from S3
          amazon.aws.s3_object:
            region: "${region}"
            bucket: "${bucket}"
            object: "${objectKey}"
            dest: "${localPath}"
            mode: get
          register: s3_download

        - name: Display download result
          debug:
            var: s3_download`;
          break;

        default:
          throw new AnsibleError(`Unsupported S3 action: ${action}`);
      }

      // Execute the generated playbook
      return executeAwsPlaybook(`s3-${action}`, playbookContent);
    }
  • Zod schema definition for aws_s3 tool input validation, including action enum and parameters like region, bucket, objectKey, etc.
    export const S3Schema = z.object({
      action: S3ActionEnum,
      region: z.string().min(1, 'AWS region is required'),
      bucket: z.string().optional(),
      objectKey: z.string().optional(),
      localPath: z.string().optional(),
      acl: z.string().optional(),
      tags: z.record(z.string()).optional(),
      metadata: z.record(z.string()).optional(),
      contentType: z.string().optional()
    });

    export type S3Options = z.infer<typeof S3Schema>;
  • Registration of the aws_s3 tool in the toolDefinitions map, linking description, schema from aws.S3Schema, and handler aws.s3Operations.
    aws_s3: {
      description: 'Manage AWS S3 buckets and objects',
      schema: aws.S3Schema,
      handler: aws.s3Operations,
    },
  • Enum definition for S3 actions used in the aws_s3 schema.
    export const S3ActionEnum = z.enum([
      'list_buckets',
      'create_bucket',
      'delete_bucket',
      'list_objects',
      'upload',
      'download'
    ]);

    export type S3Action = z.infer<typeof S3ActionEnum>;
  • Helper function executeAwsPlaybook used by all AWS handlers including s3Operations to execute the generated Ansible playbook in a temp directory.
    async function executeAwsPlaybook(
      operationName: string,
      playbookContent: string,
      extraParams: string = '',
      tempFiles: { filename: string, content: string }[] = [] // For additional files like templates, policies
    ): Promise<string> {
      let tempDir: string | undefined;
      try {
        // Create a unique temporary directory
        tempDir = await createTempDirectory(`ansible-aws-${operationName}`);

        // Write the main playbook file
        const playbookPath = await writeTempFile(tempDir, 'playbook.yml', playbookContent);

        // Write any additional temporary files
        for (const file of tempFiles) {
          await writeTempFile(tempDir, file.filename, file.content);
        }

        // Build the command
        const command = `ansible-playbook ${playbookPath} ${extraParams}`;
        console.error(`Executing: ${command}`);

        // Execute the playbook asynchronously
        const { stdout, stderr } = await execAsync(command);

        // Return stdout, or a success message if stdout is empty
        return stdout || `${operationName} completed successfully (no output).`;
      } catch (error: any) {
        // Handle execution errors
        const errorMessage = error.stderr || error.message || 'Unknown error';
        throw new AnsibleExecutionError(`Ansible execution failed for ${operationName}: ${errorMessage}`, error.stderr);
      } finally {
        // Ensure cleanup happens even if errors occur
        if (tempDir) {
          await cleanupTempDirectory(tempDir);
        }
      }
    }
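
The s3Operations handler above interpolates formatYamlParams(...) into the generated playbook for optional parameters such as acl, tags, metadata, and content_type, but that helper is not included in this reference. The following is a hypothetical sketch of what such a helper might look like, assuming it renders defined values as indented YAML lines and skips undefined ones; the actual implementation and indentation may differ:

    // Hypothetical sketch only: the real formatYamlParams is not shown in this reference.
    function formatYamlParams(params: Record<string, unknown>): string {
      const indent = '        '; // assumed indentation to align with the module parameters
      return Object.entries(params)
        .filter(([, value]) => value !== undefined && value !== null)
        .map(([key, value]) => {
          if (typeof value === 'object') {
            // Render maps (tags, metadata) as nested key/value pairs.
            const nested = Object.entries(value as Record<string, string>)
              .map(([k, v]) => `${indent}  ${k}: "${v}"`)
              .join('\n');
            return `${indent}${key}:\n${nested}`;
          }
          // Render scalar values (acl, content_type) as quoted strings.
          return `${indent}${key}: "${value}"`;
        })
        .join('\n');
    }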

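Putting the pieces together, a caller such as the MCP tool dispatcher would validate the raw tool arguments with S3Schema and pass the parsed result to s3Operations. A minimal sketch, assuming the exports above are available from an aws module (the import path is an assumption):

    import * as aws from './tools/aws'; // import path is an assumption

    async function handleAwsS3(rawArgs: unknown): Promise<string> {
      // Throws a ZodError if required fields (action, region) are missing or invalid.
      const args = aws.S3Schema.parse(rawArgs);
      // Generates the playbook for the requested action and runs it via ansible-playbook.
      return aws.s3Operations(args);
    }

    // Example: list the buckets visible in us-east-1.
    handleAwsS3({ action: 'list_buckets', region: 'us-east-1' })
      .then(output => console.log(output))
      .catch(err => console.error(err));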