
Code corresponding to our paper "Modeling Graph Structure in Transformer for Better AMR-to-Text Generation" at EMNLP-IJCNLP 2019

7 structural-transformer issues

Hi, could you please release the preprocessing code for generating the structural sequences, and the commands for applying BPE? That is, how do we get the files in [corpus_sample/all_path_corpus](https://github.com/Amazing-J/structural-transformer/tree/master/corpus_sample/all_path_corpus) and [corpus_sample/five_path_corpus](https://github.com/Amazing-J/structural-transformer/tree/master/corpus_sample/five_path_corpus)? Thanks.
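The repo does not document its BPE step, but the sample files suggest subword segmentation in the style of subword-nmt. As a hedged illustration of what "applying BPE" means here, the sketch below greedily applies a learned merge list to a single word; the actual project presumably used subword-nmt's `learn-bpe`/`apply-bpe` tools rather than this toy function, and the merge rules shown are made up for the example.

```python
# Minimal sketch of applying learned BPE merges to one word.
# Illustrative only: the real pipeline likely used subword-nmt's apply-bpe,
# and the merge list below is a made-up example, not learned from a corpus.
def apply_bpe(word, merges):
    """Greedily apply merge rules (earlier in the list = higher priority)."""
    symbols = list(word) + ["</w>"]  # end-of-word marker, as in subword-nmt
    ranks = {pair: i for i, pair in enumerate(merges)}
    while len(symbols) > 1:
        # Find the highest-priority adjacent pair that has a merge rule.
        pairs = [(ranks.get((a, b), float("inf")), i)
                 for i, (a, b) in enumerate(zip(symbols, symbols[1:]))]
        best_rank, i = min(pairs)
        if best_rank == float("inf"):
            break  # no applicable merges left
        symbols[i:i + 2] = [symbols[i] + symbols[i + 1]]
    return symbols

merges = [("l", "o"), ("lo", "w"), ("e", "r")]
print(apply_bpe("lower", merges))  # → ['low', 'er', '</w>']
```

With subword-nmt itself, the equivalent commands would be along the lines of `subword-nmt learn-bpe -s 10000 < train.txt > codes.bpe` followed by `subword-nmt apply-bpe -c codes.bpe < train.txt > train.bpe` (vocabulary size is a guess).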

Hi, thanks for your meaningful work. Sorry to ask, but how should we preprocess our own data, which consists of AMR parsing results produced by "stog"? A sample is as...
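The paper's "structural sequence" encodes the relation path between concept pairs in the AMR graph. As a hedged sketch of one plausible way to extract such a path (not the authors' released preprocessing, which these issues ask for), the snippet below runs BFS over a labeled graph and returns the edge labels along a shortest path; the graph format, concept names, and the "-of" convention for reversed edges are illustrative assumptions.

```python
from collections import deque

# Hedged sketch: edge-label path between two AMR concepts via BFS.
# Graph encoding and label conventions here are assumptions for illustration,
# not the repo's actual preprocessing format.
def label_path(edges, src, dst):
    """Return the edge labels along a shortest path from src to dst."""
    adj = {}
    for u, label, v in edges:
        adj.setdefault(u, []).append((label, v))
        # Traverse edges backwards too, marking them with an "-of" suffix.
        adj.setdefault(v, []).append((label + "-of", u))
    queue, seen = deque([(src, [])]), {src}
    while queue:
        node, path = queue.popleft()
        if node == dst:
            return path
        for label, nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [label]))
    return None  # dst unreachable from src

# Toy AMR for "the boy wants to go": (w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))
edges = [("want-01", ":ARG0", "boy"), ("want-01", ":ARG1", "go-02"),
         ("go-02", ":ARG0", "boy")]
print(label_path(edges, "want-01", "boy"))  # → [':ARG0']
```

Running this for every ordered concept pair would yield one label sequence per pair, which is the general shape of the files in `corpus_sample/all_path_corpus`.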

Hi @Amazing-J, I tried to replicate the results of the baseline, feature-based, and CNN/SA models. Here are the results I got (I'm not sure which BLEU score you use, so...

Hi, I am trying to reproduce the result of your baseline model. I find that your tokenized target texts seem to differ from previous work (e.g., https://github.com/freesunshine0316/neural-graph-to-seq-mp). In your...

Hi, thanks for the great work! I tried to run the code, but I don't know how to do the data preprocessing for the AMR corpus. May I ask how can I...

Hi, I often ran into the following error when starting a multi-GPU training:

```
Traceback (most recent call last):
  File "train.py", line 116, in
    main(opt)
  File "train.py", line 44, in...
```

Hi, I am trying to reproduce the results of the baseline NMT system in your paper. However, I found that the examples given in the [Readme](https://github.com/Amazing-J/structural-transformer#baseline-input) and those in [corpus_example](https://github.com/Amazing-J/structural-transformer/blob/master/corpus_sample/baseline_corpus/train_source_bpe) are not...