Abstractive-Text-Summarization
Contrastive Attention Mechanism for Abstractive Text Summarization
Can you give me a command, or the necessary args, for training the model on summarization? The code is prepared for machine translation instead of summarization, so we are...
When I run `train.py`, an error occurs saying 'no module named data'. After I copied the `data` module from PyTorch's fairseq (version 0.8), more errors came out. Which fairseq version...
As I understood your paper, opponent attention is trained through `softmin`. `softmin` is actually the **reason** why conventional attention and opponent attention are trained in an **opposite fashion**. --- **However,...
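To make the point in the question above concrete, here is a minimal PyTorch sketch (not the authors' code; the function name `contrastive_attention` is hypothetical) showing how `softmin` over the same alignment scores produces an opponent distribution that mirrors the conventional softmax attention, so the two are indeed pushed in opposite directions:

```python
import torch
import torch.nn.functional as F

def contrastive_attention(scores):
    """Illustrative sketch only: from one set of alignment scores,
    conventional attention uses softmax while opponent attention uses
    softmin, i.e. softmax over the negated scores, so tokens with high
    conventional weight get low opponent weight and vice versa."""
    conventional = F.softmax(scores, dim=-1)  # attends to relevant source tokens
    opponent = F.softmin(scores, dim=-1)      # equivalent to softmax(-scores)
    return conventional, opponent

# Toy example: one decoder step attending over 4 source tokens.
scores = torch.tensor([[2.0, 0.5, -1.0, 0.0]])
conv, opp = contrastive_attention(scores)
print(conv)  # highest weight on the first (highest-scoring) token
print(opp)   # highest weight on the third (lowest-scoring) token
```

How the opponent distribution is then used in training (e.g. which loss it feeds) is a separate question that the paper and code would have to answer.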
I'm confused about one detail in your paper. From Figure 1, it seems the contrastive attention mechanism is applied to each decoder layer. But from the text, it...
Hi, thanks for open-sourcing your code! How can we reproduce your paper's results?
Hello, could you share the original LCSTS 2.0 dataset? I filled out the official application form but never received a reply, and the CSDN link is also dead. Many thanks. [[email protected]](mailto:[email protected])