acl2018_graph2seq
Seeking clarifications on syntax-based NMT implementation
Hello, I have read the paper "Graph-to-Sequence Learning using Gated Graph Neural Networks", and I am interested in implementing the model for the NMT task. Could you please point out where the Levi graph is created in your code? Also, how are the sequential connections created between words in the dependency tree? Thanks.
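For context, here is a minimal sketch of my current understanding of the Levi graph transformation described in the paper: each labeled dependency edge is replaced by a new node for the label, and consecutive words are linked to keep sequential information. The function name and edge format below are my own assumptions, not taken from your code:

```python
def to_levi_graph(tokens, dep_edges):
    """Sketch of a Levi graph transformation (my assumption of the scheme).

    tokens: list of words in sentence order
    dep_edges: list of (head_idx, label, dependent_idx) dependency arcs
    Returns (nodes, edges) where each edge label has become a node of its
    own, and each labeled arc is split into two unlabeled edges.
    """
    nodes = list(tokens)
    edges = []
    for head, label, dep in dep_edges:
        label_idx = len(nodes)
        nodes.append(label)            # the edge label becomes a node
        edges.append((head, label_idx))
        edges.append((label_idx, dep))
    # sequential connections between consecutive words
    for i in range(len(tokens) - 1):
        edges.append((i, i + 1))
    return nodes, edges

nodes, edges = to_levi_graph(
    ["the", "dog", "barks"],
    [(2, "nsubj", 1), (1, "det", 0)],
)
# nodes → ["the", "dog", "barks", "nsubj", "det"]
```

Is this roughly what your implementation does, and if so, where in the codebase does this transformation happen?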