Cannot reproduce your results on Cora, Citeseer, and Pubmed with random seeds.
Thanks for sharing your code.
However, if I use random seeds (rather than the seeds 0-19 provided by this repository), even with your suggested hyperparameters, I get noticeably worse results over 100 runs: Cora 83.6, Citeseer 73.3, and Pubmed 80.5.
Why does this happen? And why do you fix the seeds to [0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19] in this repository?
Hi @beckvision, sorry for the late response. We use seeds 0-19 simply to run 20 random experiments and report the average performance, following the setting of previous works, not for any other special purpose. The performance on Cora / Citeseer / PubMed is more unstable across different seeds because the training and test sets are small. We ran the experiments on a 2080 Ti and an A100 (40G) and obtained almost the same average performance with more seeds. Could you provide more details about your environment?
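For reference, here is a minimal sketch of the seed-averaging protocol described above. The `train_and_evaluate` callable is a hypothetical stand-in for the repository's own training entry point (it is assumed to train the model once and return the test accuracy); only the seeding and averaging logic is shown.

```python
import numpy as np
import torch


def set_seed(seed: int) -> None:
    """Fix the relevant RNGs so a single run is reproducible."""
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)


def run_experiments(train_and_evaluate, seeds=range(20)):
    """Run one experiment per seed and report mean / std test accuracy.

    `train_and_evaluate` is a hypothetical callable (substitute the
    repository's actual training script) that returns test accuracy.
    """
    accs = []
    for seed in seeds:
        set_seed(seed)
        accs.append(train_and_evaluate(seed))
    accs = np.array(accs)
    print(f"{len(accs)} runs: {accs.mean():.2f} +/- {accs.std():.2f}")
    return accs
```

With a small training/test split, the variance of individual runs is high, so the mean over 20 (or more) seeds is the quantity to compare, not any single run.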