Shifan Han

3 comments by Shifan Han

Thank you very much for your patience in addressing my questions! I have two additional questions: 1. Is this [repo](https://github.com/OpenNLPLab/lightning-attention) an implementation of Lightning Attention 2?...

2. Yes, I am referring to [this file](https://huggingface.co/OpenNLPLab/TransNormerLLM-385M/blob/main/lightning_attention2.py). In this [repo](https://huggingface.co/OpenNLPLab/TransNormerLLM-385M/tree/main), the content of [lightning_attention2.py](https://huggingface.co/OpenNLPLab/TransNormerLLM-385M/blob/main/lightning_attention2.py) is the same as the content of [lightning_attention.py](https://huggingface.co/OpenNLPLab/TransNormerLLM-385M/blob/main/lightning_attention.py).
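For reference, a minimal sketch of how one could check whether the two files are byte-identical, assuming the `huggingface_hub` Python package is installed (the comparison approach here is just an illustration, not something from the repo itself):

```python
import hashlib

from huggingface_hub import hf_hub_download

REPO_ID = "OpenNLPLab/TransNormerLLM-385M"


def sha256_of(filename: str) -> str:
    # Download the file from the Hugging Face Hub and hash its contents.
    path = hf_hub_download(repo_id=REPO_ID, filename=filename)
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


# The hashes match only if the two files are byte-for-byte identical.
print(sha256_of("lightning_attention.py") == sha256_of("lightning_attention2.py"))
```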

OK, thank you very much for your patience in addressing my questions!