Using tori29umai’s LoRA to automatically split facial parts: results from batch-processing 28 images, and a log of hitting the limits when attempting finer hair separation.
Went 0-for-13 trying to train an Illustrious-XL LoRA on a Mac Studio (M1 Max, 64GB). With help from multiple AI agents, I pinpointed the root causes and finally succeeded on a RunPod RTX 4090. The full record: three fatal parameters and the sd-scripts trap.
Configuration for running a Qwen-Image-Layered LoRA that automatically separates facial parts on RunPod. Comparison of RTX 6000 Ada (48GB) and RTX PRO 6000 (96GB).
Step-by-step guide to building a LoRA training environment on Windows 11 with an RTX 3060 Laptop GPU (6GB VRAM) using kohya_ss — from caption writing to VRAM-saving settings.
Introducing a Mac mini M4 Pro to build an in-house RAG system. A plan for setting up a LoRA training environment during downtime while waiting for specs to be finalized.