Everything I read about training LoRAs with 12 GB of VRAM says it will take around 15 hours, and that's only at 512x512 resolution. Is there a different way of doing this?