8/3/2025, 1:59:34 AM
>>106121510
Theoretically sounds amazing for memory usage:
>Radial Attention reduces the computational complexity of attention from O(n²) to O(n log n). When generating a 500-frame 720p video with HunyuanVideo, it reduces the attention computation by 9×, achieves 3.7× speedup, and saves 4.6× tuning costs.
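To get a feel for why O(n log n) matters at video sequence lengths, here is a minimal back-of-the-envelope sketch, not the Radial Attention implementation itself. The sequence length and the constant factor c are hypothetical placeholders, not the actual HunyuanVideo figures, so the printed ratio won't match the quoted 9× exactly:

```python
import math

def dense_attention_cost(n: int) -> float:
    """Pairwise score count for full attention: every token attends to all n tokens."""
    return float(n * n)

def nlogn_attention_cost(n: int, c: float = 1.0) -> float:
    """Idealized O(n log n) cost; c is an implementation-dependent constant factor."""
    return c * n * math.log2(n)

# Hypothetical token count for a long video; real counts depend on
# frames, resolution, and the model's patchification.
n = 100_000
print(f"dense:  {dense_attention_cost(n):.3e} scores")
print(f"sparse: {nlogn_attention_cost(n):.3e} scores")
print(f"reduction: {dense_attention_cost(n) / nlogn_attention_cost(n):.1f}x")
```

The gap grows as n/log n, so the longer the video, the bigger the win, which is presumably why the claimed savings are measured on a 500-frame clip.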