sparse-attention topic

Repositories tagged with the sparse-attention topic:

MoA

80 stars · 5 forks

The official implementation of the paper "MoA: Mixture of Sparse Attention for Automatic Large Language Model Compression"

Sparse-VideoGen

616 stars · 34 forks · 616 watchers

[ICML2025, NeurIPS2025 Spotlight] Sparse VideoGen 1 & 2: Accelerating Video Diffusion Transformers with Sparse Attention

radial-attention

574 stars · 32 forks · 574 watchers

[NeurIPS 2025] Radial Attention: O(nlogn) Sparse Attention with Energy Decay for Long Video Generation
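The repositories above share one core idea: restricting each query to a subset of key positions instead of computing full quadratic attention. As a generic illustration only (this is a minimal NumPy sketch with an arbitrary banded mask, not code from any of the listed projects, whose actual sparsity patterns differ):

```python
import numpy as np

def sparse_attention(q, k, v, mask):
    """Attention restricted to positions where mask is True.

    Disallowed positions are set to -inf before the softmax,
    so they receive exactly zero attention weight.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                 # (n, n) similarity scores
    scores = np.where(mask, scores, -np.inf)      # apply sparsity pattern
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Illustrative local (banded) pattern: each query attends to a
# window of +/- w neighboring positions.
n, d, w = 8, 4, 2
idx = np.arange(n)
mask = np.abs(idx[:, None] - idx[None, :]) <= w   # band of width 2w + 1

rng = np.random.default_rng(0)
q, k, v = rng.standard_normal((3, n, d))
out = sparse_attention(q, k, v, mask)
print(out.shape)  # (8, 4)
```

With a fixed-width band, each row touches at most 2w + 1 keys, so the cost grows linearly in sequence length rather than quadratically; the listed projects use more elaborate patterns (learned mixtures, radial decay) toward the same end.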