
Integrations

  • Enables an automated MLOps pipeline for Stable Diffusion model fine-tuning using multiple Google Cloud services, including Vertex AI, Cloud Storage, Cloud Build, PubSub, Firestore, Cloud Run, and Cloud Functions.

  • Handles image storage for training data, maintains predefined bucket paths for uploads, and stores compiled pipeline artifacts for Stable Diffusion fine-tuning jobs.

  • Used to create notebooks that outline pipeline workflows and components for Stable Diffusion model fine-tuning on Vertex AI.

SD for designers

A fully automated workflow for triggering, running, and managing the fine-tuning, training, and deployment of custom Stable Diffusion models using Vertex AI

Description

Sd-aa-S is a fully automated MLOps pipeline for triggering, managing, and tracking Stable Diffusion fine-tuning jobs on GCP, built from GCP components such as Google Cloud Storage, Cloud Build, Cloud PubSub, Firestore, Cloud Run, Cloud Functions, and Vertex AI. Its goal is to simplify machine learning (ML) workflows for fine-tuning Stable Diffusion with different techniques, starting with Dreambooth; support for LoRA, ControlNet, and others is coming soon. The project is aimed at data/ML engineers, data scientists, and anyone interested in, or in the process of, building a platform for fine-tuning Stable Diffusion at scale.

Three parts

1. The App part

1. Set up your Cloud Environment
2. Create a backend service for handling uploads to a GCS bucket (see the sketch after this list)
   - Receive images from clients and store them under a predefined GCS bucket path
   - Track the status of individual uploads in a Firestore collection
   - Track the status of the overall upload job in a separate Firestore collection
   - Once the job is completed, publish the jobID as the message on a predefined PubSub topic
3. Deploy this backend service as a Cloud Run endpoint using Cloud Build
4. Create a frontend portal to upload images using ReactJS
5. Deploy the frontend service on Cloud Run
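A minimal sketch of the upload backend described in step 2, assuming a Flask service; the project ID, bucket name, topic ID, Firestore collection names, and routes are illustrative assumptions, not the repo's actual configuration.

```python
# Hypothetical upload backend: stores images in GCS, tracks status in Firestore,
# and publishes the jobID to Pub/Sub once the upload job is complete.
import uuid
from flask import Flask, request, jsonify
from google.cloud import storage, firestore, pubsub_v1

PROJECT_ID = "my-gcp-project"        # assumption
BUCKET_NAME = "sd-training-images"   # assumption
TOPIC_ID = "sd-finetune-jobs"        # assumption

app = Flask(__name__)
storage_client = storage.Client()
db = firestore.Client()
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)


@app.post("/upload/<job_id>")
def upload(job_id: str):
    """Store uploaded images under a predefined GCS path and track each upload in Firestore."""
    bucket = storage_client.bucket(BUCKET_NAME)
    for file in request.files.getlist("images"):
        blob = bucket.blob(f"uploads/{job_id}/{uuid.uuid4()}_{file.filename}")
        blob.upload_from_file(file.stream, content_type=file.content_type)
        # One document per individual upload.
        db.collection("uploads").add({"jobId": job_id, "path": blob.name, "status": "uploaded"})
    return jsonify({"jobId": job_id, "status": "received"})


@app.post("/complete/<job_id>")
def complete(job_id: str):
    """Mark the overall upload job as completed and publish the jobID to Pub/Sub."""
    db.collection("jobs").document(job_id).set({"status": "completed"}, merge=True)
    publisher.publish(topic_path, job_id.encode("utf-8")).result()
    return jsonify({"jobId": job_id, "status": "published"})
```

Deployed on Cloud Run (step 3), this service only needs the default service account to have Storage, Firestore, and Pub/Sub publish permissions.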

2. The Vertex AI part

1. Set up your Cloud Environment
2. Create a new custom container artifact for running the pipeline components
3. Create a new custom container artifact for running the training job itself
4. Create a Jupyter notebook outlining the pipeline flow & components
5. Compile a YAML file from a Vertex AI Workbench notebook and store the precompiled YAML file under a GCS bucket path (see the sketch after this list)
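A minimal sketch of step 5, assuming the pipeline is defined with the KFP v2 SDK; the component, pipeline name, bucket, and GCS path are illustrative assumptions rather than the repo's real definitions.

```python
# Hypothetical pipeline compilation: build the pipeline YAML in a Workbench
# notebook and copy it to a predefined GCS bucket path.
from kfp import dsl, compiler
from google.cloud import storage

BUCKET_NAME = "sd-pipeline-artifacts"                     # assumption
PIPELINE_GCS_PATH = "pipelines/dreambooth_pipeline.yaml"  # assumption


@dsl.component(base_image="python:3.10")
def finetune_dreambooth(job_id: str) -> str:
    # Placeholder for the real training component, which would run the custom
    # training container against the images uploaded for this job.
    return f"model-for-{job_id}"


@dsl.pipeline(name="sd-dreambooth-finetune")
def sd_pipeline(job_id: str):
    finetune_dreambooth(job_id=job_id)


if __name__ == "__main__":
    local_path = "dreambooth_pipeline.yaml"
    compiler.Compiler().compile(pipeline_func=sd_pipeline, package_path=local_path)
    # Store the precompiled YAML under the predefined GCS bucket path.
    storage.Client().bucket(BUCKET_NAME).blob(PIPELINE_GCS_PATH).upload_from_filename(local_path)
```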

3. The Plumbing part

1. Set up your Cloud Environment
2. Create a Cloud Function that gets triggered every time the jobID is published on a predefined topic (from the 1st part); see the sketch after this list
3. Within the Cloud Function, the Python code subscribes to the topic and triggers a Vertex AI pipeline job using the precompiled YAML file (from the 2nd part)
4. The pipeline job fine-tunes the Stable Diffusion model using Dreambooth, uploads the new custom model to the Model Registry, and deploys an endpoint
5. The job also updates Firestore with the status of the pipeline job from start to end
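A minimal sketch of the Pub/Sub-triggered Cloud Function from steps 2 and 3, assuming a 1st-gen background function; the project, region, template path, pipeline root, and collection names are illustrative assumptions.

```python
# Hypothetical Cloud Function: receives the jobID from Pub/Sub, launches the
# Vertex AI pipeline from the precompiled YAML, and records the status in Firestore.
import base64
from google.cloud import aiplatform, firestore

PROJECT_ID = "my-gcp-project"                                                     # assumption
REGION = "us-central1"                                                            # assumption
TEMPLATE_PATH = "gs://sd-pipeline-artifacts/pipelines/dreambooth_pipeline.yaml"   # assumption
PIPELINE_ROOT = "gs://sd-pipeline-artifacts/pipeline-root"                        # assumption


def trigger_pipeline(event, context):
    """Background Cloud Function entry point for the predefined Pub/Sub topic."""
    # The jobID published by the upload backend arrives base64-encoded.
    job_id = base64.b64decode(event["data"]).decode("utf-8")

    aiplatform.init(project=PROJECT_ID, location=REGION)
    pipeline_job = aiplatform.PipelineJob(
        display_name=f"sd-dreambooth-{job_id}",
        template_path=TEMPLATE_PATH,
        pipeline_root=PIPELINE_ROOT,
        parameter_values={"job_id": job_id},
    )
    pipeline_job.submit()

    # Record the pipeline launch so the job's status can be tracked in Firestore.
    firestore.Client().collection("jobs").document(job_id).set(
        {"pipeline": pipeline_job.resource_name, "status": "pipeline_started"}, merge=True
    )
```

The pipeline itself (step 4) then handles model upload to the Model Registry and endpoint deployment, updating Firestore as it progresses.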

