https://github.com/mit-han-lab/radial-attention
>We present Radial Attention, a sparse attention mechanism with O(nlogn) computational complexity. Radial Attention accelerates pre-trained HunyuanVideo by 1.9× at its default video length while maintaining comparable video quality. When generating 4× longer videos, it reduces tuning costs by up to 4.4× and speeds up inference by up to 3.7× versus dense attention.
ok now we're talking
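For intuition on the O(nlogn) claim, here's a minimal sketch of one way an attention mask with roughly n log n nonzeros can be built (a small local window plus keys at power-of-two distances). This is purely illustrative and not the actual Radial Attention mask; the real sparsity pattern, window sizes, and decay schedule are in the repo above.

```python
# Hypothetical sketch (not the repo's implementation): a log-sparse attention
# mask with O(n log n) allowed query/key pairs, built from a local band plus
# power-of-two strided connections.
import numpy as np

def log_sparse_mask(n: int, window: int = 4) -> np.ndarray:
    """Boolean [n, n] mask: True where query i may attend to key j."""
    i = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    d = np.abs(i - j)
    local = d <= window                      # dense local band around each query
    pow2 = (d > 0) & ((d & (d - 1)) == 0)    # keys at distances 1, 2, 4, 8, ...
    return local | pow2

n = 1024
mask = log_sparse_mask(n)
print(f"nonzeros: {mask.sum()} of {n * n} ({mask.sum() / (n * n):.2%})")
# Each query keeps ~window local keys plus ~log2(n) strided keys,
# so the total work scales as O(n log n) instead of O(n^2).
```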