[Bug] RelGraphConv one-hot-encoding 'not enough values to unpack'
🐛 Bug
I am having some problems using the RelGraphConv layer with one-hot encodings as node features. Apparently there is a problem either in the 'message' method of the RelGraphConv module or in the 'TypedLinear' layer it uses.
To Reproduce
Simply run the example code from the docs here:
import dgl
import numpy as np
import torch as th
from dgl.nn import RelGraphConv
g = dgl.graph(([0,1,2,3,2,5], [1,2,3,4,0,3]))
conv = RelGraphConv(10, 2, 3, regularizer='basis', num_bases=2)
one_hot_feat = th.tensor(np.array([0,1,2,3,4,5]).astype(np.int64))
etype = th.tensor(np.array([0,1,2,0,1,2]).astype(np.int64))
res = conv(g, one_hot_feat, etype)
res
Expected behavior
Expected to work the same as with dense tensor as node features.
Environment
- DGL Version : 0.9
- Backend Library & Version : PyTorch 1.11
- OS : Linux
- How you installed DGL : conda
- Python version: 3.9.12
Hi, v0.8 has deprecated the usage of one-hot encoding input. As a replacement, please use torch.nn.Embedding.
import dgl
import numpy as np
import torch as th
from dgl.nn import RelGraphConv
g = dgl.graph(([0,1,2,3,2,5], [1,2,3,4,0,3]))
conv = RelGraphConv(10, 2, 3, regularizer='basis', num_bases=2)
# Learnable dense node features replace the old one-hot index input.
embed = th.nn.Embedding(g.num_nodes(), 10)
etype = th.tensor(np.array([0,1,2,0,1,2]).astype(np.int64))
res = conv(g, embed.weight, etype)
print(res.shape)
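For reference, a minimal sketch of training the Embedding jointly with the conv layer; the target and loss below are placeholders just for illustration:
import dgl
import torch as th
from dgl.nn import RelGraphConv
g = dgl.graph(([0,1,2,3,2,5], [1,2,3,4,0,3]))
conv = RelGraphConv(10, 2, 3, regularizer='basis', num_bases=2)
embed = th.nn.Embedding(g.num_nodes(), 10)
etype = th.tensor([0,1,2,0,1,2])
# Optimize the node embeddings together with the conv parameters.
opt = th.optim.Adam(list(conv.parameters()) + list(embed.parameters()), lr=0.01)
target = th.zeros(g.num_nodes(), 2)  # dummy target, only for illustration
for _ in range(5):
    opt.zero_grad()
    out = conv(g, embed.weight, etype)
    loss = th.nn.functional.mse_loss(out, target)
    loss.backward()
    opt.step()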
P.S. Please check out the latest doc page at https://docs.dgl.ai/generated/dgl.nn.tensorflow.conv.RelGraphConv.html
I see, sorry, I didn't notice I was looking at the old docs. So, in any case, you still have to use dense tensors as the initial node embeddings, right?
Yes. Internally, the implementation works the same way in both cases.
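A minimal sketch of why the two are equivalent: multiplying a one-hot matrix by the embedding weight just selects rows of that matrix, which is exactly what an Embedding lookup does.
import torch as th
num_nodes, in_feat = 6, 10
embed = th.nn.Embedding(num_nodes, in_feat)
node_ids = th.arange(num_nodes)
# One-hot encoding followed by a matmul...
one_hot = th.nn.functional.one_hot(node_ids, num_nodes).float()
via_one_hot = one_hot @ embed.weight
# ...picks the same rows as a direct embedding lookup.
via_lookup = embed(node_ids)
assert th.allclose(via_one_hot, via_lookup)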
Alright, thanks for the help, I guess we can close.