Here are 16 public repositories matching this topic...
RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be trained directly like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". It combines the best of RNNs and transformers: great performance, linear time, constant space (no KV cache), fast training, infinite ctx_len, and free sentence embedding. (A generic constant-space linear-attention sketch appears after this list.) (Updated Apr 23, 2025 · Python)
[NeurIPS 2024] Official code of "LION: Linear Group RNN for 3D Object Detection in Point Clouds" (Updated Oct 8, 2024 · Python)
Explorations into the recently proposed Taylor Series Linear Attention (Updated Aug 18, 2024 · Python)
Implementation of Agent Attention in Pytorch (Updated Jul 10, 2024 · Python)
The semantic segmentation of remote sensing images (Updated Jul 29, 2022 · Python)
The semantic segmentation of remote sensing images (Updated Jul 29, 2022 · Python)
CUDA implementation of autoregressive linear attention, with all the latest research findings (Updated May 23, 2023 · Python)
Reference implementation of "Softmax Attention with Constant Cost per Token" (Heinsen, 2024) (Updated Jun 6, 2024 · Python)
Official implementation of "MetaLA: Unified Optimal Linear Approximation to Softmax Attention Map" (NeurIPS 2024 Oral) (Updated Jan 18, 2025 · Python)
Code for the paper "Cottention: Linear Transformers With Cosine Attention" (Updated Jan 8, 2023 · Python)
RWKV Wiki website (archived, please visit the official wiki) (Updated Mar 26, 2023 · Shell)
[ICML 2024] Official implementation of "LeaPformer: Enabling Linear Transformers for Autoregressive and Simultaneous Tasks via Learned Proportions" (Updated Nov 12, 2024 · Python)
Official implementation of "SEA: Sparse Linear Attention with Estimated Attention Mask" (ICLR 2024) (Updated Mar 24, 2025 · Python)
LEAP: Linear Explainable Attention in Parallel for causal language modeling with O(1) path length and O(1) inference (Updated Jun 18, 2023 · Jupyter Notebook)
Taming Transformers for High-Resolution Image Synthesis (Updated May 2, 2022 · Jupyter Notebook)
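Several of the repositories above (RWKV, the Taylor Series Linear Attention exploration, MetaLA, LeaPformer, SEA, LEAP) share the same core idea: replacing softmax attention with a kernelized form that can be evaluated as a recurrence over a fixed-size state, giving linear time in sequence length and constant memory at inference. The snippet below is a minimal, generic sketch of that recurrence using an assumed ELU+1 feature map; it is an illustration only, not the formulation used by any of the listed projects.

```python
# Minimal sketch of causal linear attention as a constant-space recurrence.
# Assumptions: a generic phi(x) = ELU(x) + 1 feature map and single-head,
# unbatched tensors; none of the repos above necessarily use this exact form.
import numpy as np

def phi(x):
    # Positive feature map (ELU + 1), a common choice in linear attention.
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention_recurrent(Q, K, V):
    """Causal linear attention, processed one token at a time.

    Q, K: (T, d_k), V: (T, d_v). The running state S (d_k x d_v) and
    normalizer z (d_k,) are fixed-size, so memory does not grow with T
    (the "no KV cache" property mentioned in the RWKV description).
    """
    T, d_k = Q.shape
    d_v = V.shape[1]
    S = np.zeros((d_k, d_v))   # running sum of outer(phi(k_t), v_t)
    z = np.zeros(d_k)          # running sum of phi(k_t)
    out = np.zeros((T, d_v))
    for t in range(T):
        q, k, v = phi(Q[t]), phi(K[t]), V[t]
        S += np.outer(k, v)
        z += k
        out[t] = (q @ S) / (q @ z + 1e-8)
    return out

# Example: 128 tokens, 16-dim keys, 32-dim values; state stays (16, 32) for any T.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(128, 16)), rng.normal(size=(128, 16)), rng.normal(size=(128, 32))
print(linear_attention_recurrent(Q, K, V).shape)  # (128, 32)
```

The same computation can also be expressed as a parallel scan or chunked matrix product for training, which is what makes these models trainable "like a GPT transformer" while remaining an RNN at inference; RWKV-7 itself uses a more elaborate, trainable state-update rule rather than this plain kernelized form.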