>>105672139
I am not using sage attention or flash attention. Another AMD anon figured out how to install them, but I haven't managed it yet. This kind of issue is obviously the big drawback of AMD for the time being. rough idea of the fallback check below.
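just to illustrate what "not using them" means in practice, here's a minimal Python sketch of an availability check that falls back to plain PyTorch SDPA. assuming the usual package/function names (flash_attn.flash_attn_func, sageattention.sageattn); the tensor layouts in the comments are from memory, so double-check against the actual docs before copying.

import torch.nn.functional as F

try:
    from flash_attn import flash_attn_func  # needs a CUDA/ROCm build
    HAS_FLASH = True
except ImportError:
    HAS_FLASH = False

try:
    from sageattention import sageattn  # quantized attention kernels
    HAS_SAGE = True
except ImportError:
    HAS_SAGE = False

def attention(q, k, v):
    # q, k, v: (batch, heads, seq, dim)
    if HAS_SAGE:
        return sageattn(q, k, v)  # layout assumption: (batch, heads, seq, dim)
    if HAS_FLASH:
        # flash_attn expects (batch, seq, heads, dim), so transpose around the call
        return flash_attn_func(q.transpose(1, 2), k.transpose(1, 2),
                               v.transpose(1, 2)).transpose(1, 2)
    # plain PyTorch SDPA runs on any backend, just slower
    return F.scaled_dot_product_attention(q, k, v)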

>>105672090
That makes sense, since SVDQuant is a lot faster.