
Results: 31 diffpool issues, sorted by most recently updated

I am confused about the exact layers in the architecture from the paper. It states: "We use the “mean” variant of GRAPHSAGE [16] and apply a DIFFPOOL layer...

Thank you for your nice work. As in Fig. 2 of your paper, how can I visualize the DiffPool pooling process? What method did you use? Best, Daeun Lee

Hello, I have a question: why does the D&D dataset show high validation accuracy at the beginning of training? Could you please answer my question? ![Untitled](https://user-images.githubusercontent.com/26483129/136696775-003b9d69-20a4-4efc-bb67-d3c544529bf1.png)

Hi Rex, I have a couple of questions regarding the implementation of the auxiliary losses. 1. In the paper it says that 'at each layer l, we minimize `L_LP`...
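For context on the auxiliary losses being asked about: besides the link prediction term, the DiffPool paper regularizes the entropy of each row of the soft assignment matrix S, pushing assignments toward near-one-hot cluster memberships. A minimal numpy sketch of that entropy term (the function name `entropy_loss` is my own, not the repo's):

```python
import numpy as np

def entropy_loss(S, eps=1e-12):
    """Mean per-node entropy of the soft assignment matrix.

    S : (n_nodes, n_clusters) array whose rows sum to 1 (softmax output).
    H(S_i) = -sum_k S_ik * log(S_ik); minimizing the mean over nodes
    encourages each node to commit to a single cluster.
    """
    ent = -(S * np.log(S + eps)).sum(axis=1)
    return float(ent.mean())
```

A uniform assignment over k clusters gives the maximum value log(k), while a one-hot assignment gives (approximately) zero, which is what the regularizer rewards.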

Thanks for your contribution! Excuse me, can DiffPool be used on bipartite graphs? Do you have any idea how to do this? Thanks a lot!

Hello, could you elaborate on the rationale behind adopting the link prediction loss, i.e. encouraging nearby nodes to be pooled together? It's a bit confusing.
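For reference, the link prediction auxiliary in the paper is the Frobenius norm of A − S·Sᵀ, where S is the soft assignment matrix. A minimal numpy sketch (the function name `link_pred_loss` is mine) makes the rationale concrete: the loss is only small when nodes connected in A have similar assignment rows, i.e. when adjacent nodes end up in the same clusters.

```python
import numpy as np

def link_pred_loss(A, S):
    """DiffPool link prediction auxiliary: ||A - S S^T||_F.

    A : (n, n) adjacency matrix.
    S : (n, k) soft cluster assignments (rows sum to 1).
    S @ S.T is large for node pairs with overlapping assignments, so
    minimizing the norm pulls connected nodes into shared clusters.
    """
    return float(np.linalg.norm(A - S @ S.T, ord="fro"))
```

When S perfectly reconstructs A (e.g. each connected component maps one-to-one onto a cluster), the loss is zero; spreading connected nodes across different clusters drives it up.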

Hi, I have a question about the module in your paper, which says "apply a DiffPool layer after two GraphSAGE layers" and "A total of 2 DiffPool layers are used",...

I got a `FileNotFoundError` when executing `plt.savefig(gen_train_plt_name(args), dpi=600)` in `train.py/train()`:

```
File "/home/LAB/penghao/.conda/envs/torch-sparse/lib/python3.6/site-packages/matplotlib/pyplot.py", line 842, in savefig
    res = fig.savefig(*args, **kwargs)
File "/home/LAB/penghao/.conda/envs/torch-sparse/lib/python3.6/site-packages/matplotlib/figure.py", line 2311, in savefig
    self.canvas.print_figure(fname, **kwargs)
File...
```
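A likely cause of this error is that the directory `gen_train_plt_name(args)` points into does not yet exist: `plt.savefig` raises `FileNotFoundError` when the target directory is missing. A minimal workaround sketch (the helper name `save_fig_safely` is my own):

```python
import os
import matplotlib
matplotlib.use("Agg")  # headless backend, no display needed
import matplotlib.pyplot as plt

def save_fig_safely(path, dpi=600):
    """Create the target directory (if any) before saving.

    plt.savefig raises FileNotFoundError when a directory component
    of `path` does not exist, so make it first.
    """
    d = os.path.dirname(path)
    if d:
        os.makedirs(d, exist_ok=True)
    plt.savefig(path, dpi=dpi)
```

Calling `save_fig_safely(gen_train_plt_name(args), dpi=600)` in place of the bare `plt.savefig` call should avoid the error, assuming the path itself is otherwise valid.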

Can I ask what the parameters are for running REDDIT-MULTI-12K? I ran into a memory error when trying:

```
python -m train --bmname=REDDIT-MULTI-12K --assign-ratio=0.1 --hidden-dim=5 --output-dim=11 --cuda=0 --batch-size=10 --num-gc-layers=1 --num-classes=11 --method=soft-assign
```

Thanks!