Huge FLUX LoRA vs Fine Tuning / DreamBooth Experiments Completed: Batch Size 1 vs 7 Fully Tested as Well, Not Only for Realism but Also for Stylization. Datasets of 15 vs 256 Images Compared Too (Expressions / Emotions Tested as Well). Kohya GUI Used for Training.
-
Full files and article: https://www.patreon.com/posts/112099700
-
Images provided at the bottom of the article
Details
-
Download images in full resolution to see prompts and model names
-
All trainings were done with Kohya GUI, can be performed fully locally on Windows, and all trainings were at 1024x1024 resolution
-
Fine Tuning / DreamBooth works on GPUs with as little as 6 GB of VRAM (zero quality degradation; output identical to the 48 GB config)
-
Best LoRA quality requires a 48 GB GPU; 24 GB also works really well, and 8 GB is the minimum for LoRA (with significant quality degradation)
-
Full-size grids are also shared for the following: https://www.patreon.com/posts/112099700
-
The 15-image training dataset: 15_Images_Dataset.png
-
The 256-image training dataset: 256_Images_Dataset.png
-
15 Images Dataset, Batch Size 1 Fine Tuning Training : 15_imgs_BS_1_Realism_Epoch_Test.jpg , 15_imgs_BS_1_Style_Epoch_Test.jpg
-
15 Images Dataset, Batch Size 7 Fine Tuning Training : 15_imgs_BS_7_Realism_Epoch_Test.jpg , 15_imgs_BS_7_Style_Epoch_Test.jpg
-
256 Images Dataset, Batch Size 1 Fine Tuning Training : 256_imgs_BS_1_Realism_Epoch_Test.jpg , 256_imgs_BS_1_Stylized_Epoch_Test.jpg
-
256 Images Dataset, Batch Size 7 Fine Tuning Training : 256_imgs_BS_7_Realism_Epoch_Test.jpg , 256_imgs_BS_7_Style_Epoch_Test.jpg
-
15 Images Dataset, Batch Size 1 LoRA Training : 15_imgs_LORA_BS_1_Realism_Epoch_Test.jpg , 15_imgs_LORA_BS_1_Style_Epoch_Test.jpg
-
15 Images Dataset, Batch Size 7 LoRA Training : 15_imgs_LORA_BS_7_Realism_Epoch_Test.jpg , 15_imgs_LORA_BS_7_Style_Epoch_Test.jpg
-
256 Images Dataset, Batch Size 1 LoRA Training : 256_imgs_LORA_BS_1_Realism_Epoch_Test.jpg , 256_imgs_LORA_BS_1_Style_Epoch_Test.jpg
-
256 Images Dataset, Batch Size 7 LoRA Training : 256_imgs_LORA_BS_7_Realism_Epoch_Test.jpg , 256_imgs_LORA_BS_7_Style_Epoch_Test.jpg
Comparisons
-
Fine Tuning / DreamBooth 15 vs 256 images and Batch Size 1 vs 7 for Realism : Fine_Tuning_15_vs_256_imgs_BS1_vs_BS7.jpg
-
Fine Tuning / DreamBooth 15 vs 256 images and Batch Size 1 vs 7 for Style : 15_vs_256_imgs_BS1_vs_BS7_Fine_Tuning_Style_Comparison.jpg
-
LoRA Training 15 vs 256 images and Batch Size 1 vs 7 for Realism : LoRA_15_vs_256_imgs_BS1_vs_BS7.jpg
-
LoRA Training 15 vs 256 images and Batch Size 1 vs 7 for Style : 15_vs_256_imgs_BS1_vs_BS7_LoRA_Style_Comparison.jpg
-
Testing smiling expression for LoRA Trainings : LoRA_Expression_Test_Grid.jpg
-
Testing smiling expression for Fine Tuning / DreamBooth Trainings : Fine_Tuning_Expression_Test_Grid.jpg
Fine Tuning / DreamBooth vs LoRA Comparisons
-
15 Images Fine Tuning vs LoRA at Batch Size 1 : 15_imgs_BS1_LoRA_vs_Fine_Tuning.jpg
-
15 Images Fine Tuning vs LoRA at Batch Size 7 : 15_imgs_BS7_LoRA_vs_Fine_Tuning.jpg
-
256 Images Fine Tuning vs LoRA at Batch Size 1 : 256_imgs_BS1_LoRA_vs_Fine_Tuning.jpg
-
256 Images Fine Tuning vs LoRA at Batch Size 7 : 256_imgs_BS7_LoRA_vs_Fine_Tuning.jpg
-
15 vs 256 Images vs Batch Size 1 vs 7 vs LoRA vs Fine Tuning : 15_vs_256_imgs_BS1_vs_BS7_LoRA_vs_Fine_Tuning_Style_Comparison.jpg
-
Full conclusions and tips are also shared : https://www.patreon.com/posts/112099700
-
Additionally, I have shared the full training logs, so you can see how long each checkpoint took. I have also shared the best checkpoints, with their step counts and training times, broken down by LoRA vs Fine Tuning, Batch Size 1 vs 7, and 15 vs 256 images, so it is a very detailed article.
-
Check the images to see all shared files in the post.
-
Furthermore, a very detailed analysis article has been written, and all of the latest DreamBooth / Fine Tuning configs and LoRA configs are shared, along with Kohya GUI installers for Windows, RunPod, and Massed Compute.
-
Moreover, I have shared 28 new realism and 37 new stylization testing prompts.
The current tutorials are as follows:
-
Windows requirements (CUDA, Python, cuDNN, and such) : https://youtu.be/DrhUHnYfwC0
-
How to use SwarmUI : https://youtu.be/HKX8_F1Er_w
-
How to use FLUX on SwarmUI : https://youtu.be/bupRePUOA18
-
How to use Kohya GUI for FLUX training : https://youtu.be/nySGu12Y05k
-
How to use Kohya GUI for FLUX training on Cloud (RunPod and Massed Compute) : https://youtu.be/-uhL2nW7Ddw
-
A new tutorial covering this research and Fine Tuning / DreamBooth is hopefully coming soon
I have run the following trainings and thoroughly analyzed and compared them all:
-
Fine Tuning / DreamBooth: 15 Training Images & Batch Size is 1
-
Fine Tuning / DreamBooth: 15 Training Images & Batch Size is 7
-
Fine Tuning / DreamBooth: 256 Training Images & Batch Size is 1
-
Fine Tuning / DreamBooth: 256 Training Images & Batch Size is 7
-
LoRA : 15 Training Images & Batch Size is 1
-
LoRA : 15 Training Images & Batch Size is 7
-
LoRA : 256 Training Images & Batch Size is 1
-
LoRA : 256 Training Images & Batch Size is 7
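The interaction between dataset size and batch size in the runs above can be sketched with a small helper. This is a hypothetical illustration (the function name and the assumption of 1 repeat per image are mine; the actual Kohya repeat counts are in the shared configs):

```python
import math

def steps_per_epoch(num_images: int, batch_size: int, repeats: int = 1) -> int:
    """Optimizer steps per epoch, assuming drop-free batching (last partial batch still counts)."""
    return math.ceil(num_images * repeats / batch_size)

# The four dataset/batch-size combinations tested above:
for images in (15, 256):
    for bs in (1, 7):
        print(f"{images} images, batch size {bs}: {steps_per_epoch(images, bs)} steps/epoch")
```

This makes it clear why batch size 7 finishes each epoch in far fewer steps, which is part of why it needs its own learning rate (as noted below).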
-
For each batch size (1 vs 7), a new learning rate (LR) was researched and the best one was used
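For context, common heuristics scale the LR with batch size (linearly or by square root) as a starting point before empirical tuning; the sketch below is only an illustration of those heuristics, not the article's method, and the example base LR is hypothetical:

```python
def scale_lr(base_lr: float, base_bs: int, new_bs: int, rule: str = "sqrt") -> float:
    """Suggest a starting LR when moving from base_bs to new_bs.

    'linear' keeps the per-sample gradient contribution constant;
    'sqrt' is a more conservative heuristic often used in practice.
    """
    if rule == "linear":
        return base_lr * new_bs / base_bs
    if rule == "sqrt":
        return base_lr * (new_bs / base_bs) ** 0.5
    raise ValueError(f"unknown rule: {rule}")

# e.g. starting from a hypothetical 1e-5 at batch size 1, moving to batch size 7:
print(scale_lr(1e-5, 1, 7, "linear"))
print(scale_lr(1e-5, 1, 7, "sqrt"))
```

Either rule only gives a starting point; the best LR per batch size in this research was found by testing, as stated above.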
-
Then I compared all these checkpoints against each other very carefully and thoroughly, and shared all findings and analysis
Some Part of Research as Images