Search Results

Found 1 result for "221b1b46fb7b85b347e1a157d6c6687c" across all boards, searching by MD5.

Anonymous /g/106120583#106121552
8/3/2025, 1:59:34 AM
>>106121510
Theoretically this sounds amazing for memory usage:

>Radial Attention reduces the computational complexity of attention from O(n²) to O(n log n). When generating a 500-frame 720p video with HunyuanVideo, it reduces the attention computation by 9×, achieves 3.7× speedup, and saves 4.6× tuning costs.
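
For intuition on where the O(n log n) comes from: a minimal sketch below (not the actual Radial Attention kernel or its exact mask, just an assumed illustration) builds a sparse attention mask whose sampling density decays with token distance. Each query then keeps only O(log n) keys, so the whole mask has O(n log n) nonzeros instead of the O(n²) of dense attention.

```python
# Hypothetical sketch: a distance-decaying sparse attention mask.
# For a query i and key j at distance d = |i - j|, keep j only if d is a
# multiple of 2**floor(log2(d)), i.e. roughly one key per power-of-two
# distance band on each side. That is ~2*log2(n) keys per query,
# so the total number of attended pairs scales as O(n log n).
import numpy as np

def radial_style_mask(n: int) -> np.ndarray:
    """Return a boolean [n, n] mask with O(n log n) True entries."""
    i = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    d = np.abs(i - j)
    # Power-of-two band index of each distance (band 0 for d <= 1).
    band = np.where(d > 0, np.floor(np.log2(np.maximum(d, 1))).astype(int), 0)
    stride = 1 << band                 # sampling stride grows with distance
    return d % stride == 0             # keep ~1 key per band per side

if __name__ == "__main__":
    for n in (256, 1024, 4096):
        kept = radial_style_mask(n).sum()
        print(f"n={n}: kept {kept} of {n * n} pairs "
              f"(~{kept / (n * np.log2(n)):.1f} * n*log2(n))")
```

Running it shows the kept/total ratio shrinking as n grows, which is the same scaling argument behind the quoted 9× attention reduction for long videos; the real method additionally tunes the model to the sparse pattern, which is where the "tuning costs" figure comes from.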