Search Results

Found 1 result for "4f43037e3f8675b03e0f33c95ba2bf90" across all boards, searching by MD5.

Anonymous /g/105703501#105707973
6/26/2025, 8:47:57 AM
https://github.com/mit-han-lab/radial-attention
>We present Radial Attention, a sparse attention mechanism with O(n log n) computational complexity. Radial Attention accelerates pre-trained HunyuanVideo by 1.9× at its default video length while maintaining comparable video quality. When generating 4× longer videos, it reduces tuning costs by up to 4.4× and speeds up inference by up to 3.7× versus dense attention.
ok now we're talking
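
For context on what an O(n log n) sparse attention pattern can look like: the post doesn't describe how Radial Attention builds its mask, so the sketch below is NOT the linked repo's method. It's a generic, hypothetical LogSparse-style mask where each query attends to itself plus keys at exponentially growing offsets, giving O(log n) attended keys per row and O(n log n) total work, just to illustrate the complexity class being claimed.

```python
# Hypothetical O(n log n) sparse attention sketch (not the repo's actual code).
# Each query attends to itself and to past keys at offsets 1, 2, 4, 8, ...,
# so every row of the mask has O(log n) nonzeros and the mask has O(n log n) total.
import torch

def logsparse_mask(n: int) -> torch.Tensor:
    """Boolean (n, n) causal mask with O(log n) attended keys per query."""
    mask = torch.zeros(n, n, dtype=torch.bool)
    for i in range(n):
        mask[i, i] = True                  # always attend to self
        offset = 1
        while i - offset >= 0:             # exponentially spaced past keys
            mask[i, i - offset] = True
            offset *= 2
    return mask

def masked_attention(q, k, v, mask):
    """Scaled dot-product attention with disallowed positions masked to -inf."""
    scores = q @ k.transpose(-2, -1) / (q.shape[-1] ** 0.5)
    scores = scores.masked_fill(~mask, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v

# tiny usage example
n, d = 16, 8
q = k = v = torch.randn(n, d)
out = masked_attention(q, k, v, logsparse_mask(n))
print(out.shape)  # torch.Size([16, 8])
```

Note this toy version still materializes the full n×n score matrix; a real implementation like the one in the repo would only compute the allowed entries to actually get the speedup.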