Search Results
6/26/2025, 9:48:53 AM
https://github.com/mit-han-lab/radial-attention
>We present Radial Attention, a sparse attention mechanism with O(n log n) computational complexity. Radial Attention accelerates pre-trained HunyuanVideo by 1.9× at its default video length while maintaining comparable video quality. When generating 4× longer videos, it reduces tuning costs by up to 4.4× and speeds up inference by up to 3.7× versus dense attention.
wen comfyui?
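Not from the repo, but for anyone wondering what "O(n log n) sparse attention" means in practice, here is a rough, illustrative PyTorch sketch of a log-structured attention mask where each query only attends to keys at power-of-two distances. This is an assumption for illustration only, not the actual Radial Attention mask, and it still materializes the dense score matrix for clarity rather than realizing the compute savings a real sparse kernel would.

import torch

def log_sparse_mask(n: int, device=None) -> torch.Tensor:
    """Boolean (n, n) mask: query i attends to key j iff |i - j| is 0 or a
    power of two. Each row has O(log n) True entries, so the whole mask has
    O(n log n) nonzeros. Illustrative only; not the repo's actual pattern."""
    idx = torch.arange(n, device=device)
    dist = (idx[:, None] - idx[None, :]).abs()
    mask = dist == 0
    d = 1
    while d < n:
        mask |= dist == d
        d *= 2
    return mask

def sparse_attention(q, k, v, mask):
    """Masked scaled-dot-product attention; disallowed pairs get -inf scores.
    The dense matmul here is only for clarity; a real kernel would compute
    only the allowed (query, key) pairs."""
    scale = q.shape[-1] ** -0.5
    scores = (q @ k.transpose(-2, -1)) * scale
    scores = scores.masked_fill(~mask, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v

# Tiny usage example on random tensors (batch=1, heads=1, n=16, d=8).
n, d = 16, 8
q = torch.randn(1, 1, n, d)
k = torch.randn(1, 1, n, d)
v = torch.randn(1, 1, n, d)
out = sparse_attention(q, k, v, log_sparse_mask(n))
print(out.shape)  # torch.Size([1, 1, 16, 8])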