
GraphMAE2: A Decoding-Enhanced Masked Self-Supervised Graph Learner (WWW'23)

6 GraphMAE2 issues

Hello, and thank you for the excellent work and open-source code. I noticed that GraphMAE2 and GraphMAE differ in the re-mask token: in GraphMAE2 the re-mask token is a learnable embedding, whereas in GraphMAE it is simply set to zero. What difference does this make?
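For illustration only, here is a minimal plain-Python sketch of the two re-mask strategies being contrasted (this is not the repository's code; the names `remask_zero`, `remask_learnable`, and `mask_token` are hypothetical). The key difference is that zeroing erases the masked positions, while a shared learnable token is a trainable vector that the optimizer can shape during pre-training:

```python
def remask_zero(h, mask_idx):
    """GraphMAE-style re-mask: overwrite masked rows of the
    hidden states with zeros."""
    h = [row[:] for row in h]  # copy so the input is untouched
    for i in mask_idx:
        h[i] = [0.0] * len(h[i])
    return h


def remask_learnable(h, mask_idx, mask_token):
    """GraphMAE2-style re-mask: overwrite masked rows with a shared
    token. In the real model `mask_token` would be a trainable
    parameter updated by gradients; here it is just a plain list."""
    h = [row[:] for row in h]
    for i in mask_idx:
        h[i] = mask_token[:]
    return h
```

With zeroing, every masked position carries the same fixed, information-free vector; with a learnable token, the decoder can at least learn a useful "this was masked" representation.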

We noticed that the paper uses large datasets such as ogbn-Arxiv and ogbn-Papers100M, but when testing on our own server we ran out of memory. Roughly how much memory does training on ogbn-Arxiv consume? Do you have any advice for running GraphMAE2 on, or more generally handling, graphs with millions of nodes?

```shell
(graphmb) [jialh@gpu07 GraphMAE2]$ sh 01run_ogbn-arxiv.sh
2024-04-22 21:08:20,721 - INFO - ----- Using best configs from configs/ogbn-arxiv.yaml -----
Namespace(seeds=[0], dataset='ogbn-arxiv', device=0, max_epoch=60, warmup_steps=-1, num_heads=8, num_out_heads=1, num_layers=4, num_dec_layers=1,...
```
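A common way to bound memory on million-node graphs is mini-batch training with neighbor sampling, so each step only materializes a small subgraph rather than the full graph. The idea can be sketched in plain Python (the helper `sample_neighbors` and the adjacency-dict format are hypothetical, for illustration; real pipelines would use a library sampler such as DGL's `NeighborSampler`):

```python
import random


def sample_neighbors(adj, seeds, fanout, rng=None):
    """For each seed node, keep at most `fanout` randomly chosen
    neighbors. Training on these small sampled blocks keeps memory
    roughly proportional to the batch, not to the whole graph."""
    rng = rng or random.Random(0)
    block = {}
    for s in seeds:
        nbrs = adj.get(s, [])
        block[s] = list(nbrs) if len(nbrs) <= fanout else rng.sample(nbrs, fanout)
    return block
```

Stacking one such sampling step per GNN layer (with per-layer fanouts) is the usual recipe for datasets on the scale of ogbn-Papers100M.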

A previous issue discussed the self._momentum parameter:

```python
def ema_update(self):
    def update(student, teacher):
        with torch.no_grad():
            # m = momentum_schedule[it]  # momentum parameter
            m = self._momentum
            for param_q, param_k in zip(student.parameters(), teacher.parameters()):
                param_k.data.mul_(m).add_((1 - m) * param_q.detach().data)
    update(self.encoder,...
```
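For reference, that in-place update is the standard exponential moving average, teacher ← m · teacher + (1 − m) · student, applied per parameter. A minimal dependency-free sketch (the standalone `ema_update` signature here is hypothetical, operating on plain lists of floats rather than `torch` parameters):

```python
def ema_update(student, teacher, m):
    """Return the EMA-updated teacher values:
    new_teacher = m * teacher + (1 - m) * student."""
    return [m * t + (1.0 - m) * s for s, t in zip(student, teacher)]
```

With m close to 1 (e.g. 0.996), the teacher changes slowly and acts as a smoothed, more stable copy of the student.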

Thanks for sharing your code. However, if I use random seeds other than the 0-19 provided by this repository, even with your suggested hyperparameters, I do not get very good results with...

May I ask which versions of torch and dgl you have installed? You currently seem to be running into incompatibility problems.

Thanks for your interesting work! I have a concern about the methodology of GraphMAE/GraphMAE2. Since node features are updated iteratively through message passing in the GNN encoder and decoder, I...
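The concern can be made concrete with a toy mean-aggregation step (the helper `mean_aggregate` is hypothetical, not the repository's code): even when a node's own input feature is masked to zero, a single round of message passing mixes its unmasked neighbors' raw features into its representation, which is why masked autoencoding on graphs behaves differently from masking independent tokens:

```python
def mean_aggregate(adj, feats):
    """One message-passing step: each node's new value is the mean of
    its neighbors' current values (its own value if it has none)."""
    out = {}
    for v, nbrs in adj.items():
        out[v] = sum(feats[u] for u in nbrs) / len(nbrs) if nbrs else feats[v]
    return out
```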