Konstantin Rusch
We presented the single best result in the paper. I only tested it on one specific GPU with one specific PyTorch version (note that seeds behave differently across machines and...
Thanks for reaching out. It is a multi-layer GNN; however, in our case we share the same parameters among the different layers. That's why we do the for-loop over the...
No, it's absolutely not! You can extend it to use different parameters among the different layers by simply:

```
self.convs = nn.ModuleList()
for i in range(nlayers):
    self.convs.append(GCNConv(nhid, nhid))
```

and then...
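To make the shared-parameter vs. per-layer distinction concrete, here is a minimal, self-contained sketch of both variants. It assumes plain PyTorch, and a hypothetical `nn.Linear(nhid, nhid)` stands in for `GCNConv(nhid, nhid)` so it runs without torch_geometric installed; the `SharedGNN`/`PerLayerGNN` class names are illustrative, not from the paper's code.

```python
import torch
import torch.nn as nn

class SharedGNN(nn.Module):
    # One layer reused inside the for-loop: all iterations share parameters,
    # as described above for the paper's code.
    def __init__(self, nhid, nlayers):
        super().__init__()
        self.conv = nn.Linear(nhid, nhid)  # stand-in for GCNConv(nhid, nhid)
        self.nlayers = nlayers

    def forward(self, x):
        for _ in range(self.nlayers):
            x = torch.relu(self.conv(x))
        return x

class PerLayerGNN(nn.Module):
    # The suggested extension: an nn.ModuleList holding an independent
    # layer (independent parameters) for each depth.
    def __init__(self, nhid, nlayers):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Linear(nhid, nhid) for _ in range(nlayers)  # stand-in for GCNConv
        )

    def forward(self, x):
        for conv in self.convs:
            x = torch.relu(conv(x))
        return x

shared = SharedGNN(nhid=8, nlayers=4)
perlayer = PerLayerGNN(nhid=8, nlayers=4)
n_shared = sum(p.numel() for p in shared.parameters())
n_perlayer = sum(p.numel() for p in perlayer.parameters())
print(n_shared, n_perlayer)  # the per-layer variant holds nlayers times the parameters
```

The per-layer variant multiplies the parameter count by `nlayers`, which is the trade-off to weigh against the weight-sharing used in the paper.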
I re-ran my code on a different machine and could reproduce the result we stated in the paper. Can you elaborate more on the issue? What MAE do you obtain...
I'm having the same issue. Any updates on that?
My apologies for the delayed reply. Were you able to compile it after all?