testRembg.png (989 kB) — ComfyUI output image; the raw PNG data is not reproduced here.
Embedded ComfyUI workflow metadata (recovered from the PNG's "prompt" tEXt chunk; the binary IDAT image data is omitted):

{
  "3": {"inputs": {"seed": 473696204377748, "steps": 8, "cfg": 1.5, "sampler_name": "lcm", "scheduler": "karras", "denoise": 1.0, "model": ["30", 0], "positive": ["6", 0], "negative": ["7", 0], "latent_image": ["16", 0]}, "class_type": "KSampler"},
  "4": {"inputs": {"ckpt_name": "animagine-xl-3.1.safetensors"}, "class_type": "CheckpointLoaderSimple"},
  "6": {"inputs": {"text": "depth of field, cinematic composition, masterpiece, best quality,looking at viewer,(solo:1.1),(1 girl:1.1),damaged shirt,denim jeans,long socks,pigtails,blond hair,(simple background:1.2),(cowboy shot:1.3),laughing,standing,(salute:1.3),posing,(tilt my head:1.2),(cowboy shot:1.3),laughing,standing,(salute:1.3),posing,(tilt my head:1.2),1. Guide the girl to sit at the table with the meal in front, ensuring she's centered and at a comfortable distance from the food.\n2. Encourage a warm smile or a look of pleasant surprise, as if she's about to enjoy a delicious meal.\n3. Direct her to place one hand gently on the table near the food, while using the other to hold chopsticks or a small piece of food, as if she's about to take a bite.\n4. Ask her to sit upright and relaxed, slightly leaning forward to convey engagement with the meal.\n5. Instruct her to look at the camera for a direct and engaging photo, or down at the food to express anticipation.", "clip": ["29", 1]}, "class_type": "CLIPTextEncode"},
  "7": {"inputs": {"text": "nsfw,lowres,bad anatomy,bad hands,text,error,missing fingers,extra digit,fewer digits,cropped,worst quality,low quality,normal quality,jpeg artifacts,signature,watermark,username,blurry,artist name,elf", "clip": ["29", 1]}, "class_type": "CLIPTextEncode"},
  "8": {"inputs": {"samples": ["3", 0], "vae": ["10", 0]}, "class_type": "VAEDecode"},
  "9": {"inputs": {"filename_prefix": "ComfyUI", "images": ["8", 0]}, "class_type": "SaveImage"},
  "10": {"inputs": {"vae_name": "diffusion_pytorch_model.safetensors"}, "class_type": "VAELoader"},
  "14": {"inputs": {"model": "isnet-anime", "alpha_matting": false, "alpha_matting_foreground_threshold": 270, "alpha_matting_background_threshold": 21, "alpha_matting_erode_size": 11, "images": ["8", 0]}, "class_type": "rembg for mi"},
  "15": {"inputs": {"filename_prefix": "ComfyUI", "images": ["14", 0]}, "class_type": "SaveImage"},
  "16": {"inputs": {"width": 832, "height": 1216, "batch_size": 1}, "class_type": "EmptyLatentImage"},
  "29": {"inputs": {"lora_name": "pytorch_lora_weights.safetensors", "strength_model": 1.0, "strength_clip": 1.0, "model": ["4", 0], "clip": ["4", 1]}, "class_type": "LoraLoader"},
  "30": {"inputs": {"sampling": "lcm", "zsnr": false, "model": ["29", 0]}, "class_type": "ModelSamplingDiscrete"}
}
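The graph above loads animagine-xl-3.1 with an LCM LoRA, samples an 832x1216 image in 8 steps (lcm sampler, cfg 1.5), saves the decoded result, then strips the background with the isnet-anime rembg model and saves the cutout. A minimal sketch of recovering this metadata programmatically, assuming Pillow is installed and testRembg.png is in the working directory (only the filename and chunk key come from the dump above; everything else is illustrative):

import json
from PIL import Image

# Pillow exposes PNG tEXt/iTXt chunks through the image's .info mapping;
# ComfyUI's SaveImage node writes the executed graph under the key "prompt".
with Image.open("testRembg.png") as img:
    prompt_text = img.info.get("prompt")

if prompt_text:
    graph = json.loads(prompt_text)
    # List node ids with their types, e.g. "3 KSampler", "14 rembg for mi".
    for node_id, node in graph.items():
        print(node_id, node["class_type"])

ComfyUI typically also embeds a second chunk named "workflow" holding the editor's node layout; the same lookup applies to it.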
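To re-run the recovered graph rather than just inspect it, the same JSON can be submitted to a running ComfyUI server. This is a hedged sketch only: it assumes a ComfyUI instance on the default 127.0.0.1:8188 and that the checkpoint, VAE, LoRA, and the "rembg for mi" custom node referenced in the graph are installed locally.

import json
import urllib.request

COMFYUI_URL = "http://127.0.0.1:8188/prompt"  # assumed default server address

def queue_prompt(graph: dict) -> dict:
    """POST a ComfyUI node graph to the /prompt endpoint and return the reply."""
    body = json.dumps({"prompt": graph}).encode("utf-8")
    req = urllib.request.Request(
        COMFYUI_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (graph recovered as in the previous sketch); changing the seed in
# node "3" yields a new sample from the same workflow:
# graph["3"]["inputs"]["seed"] = 20240601
# print(queue_prompt(graph))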