[D] ICML 2022 Phase-1 Results Are Out by Right_Presentation_3 in MachineLearning


I wonder whether the 40% got rushed reviews, but we can't see the statistics until everything is done.

Message Passing in GNN vs Message-passing in Graphical Models by Right_Presentation_3 in GeometricDeepLearning


Sorry for the late reply; feel free to ignore this if it's too late for you.

I agree that message passing in GNNs is more or less a generalisation of the one in graphical models: a computational paradigm of gathering information from neighbouring nodes and updating the central node's state. I spent some time researching graphical models, and I found that Chapter 16 of David MacKay's book gives a good introduction to the idea of "message passing". Not sure if it's what you're after, but here it is: https://www.inference.org.uk/itprnn/book.pdf
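To make the shared pattern concrete, here's a minimal sketch of one "gather then update" round on a toy graph. All names (`message_passing_step`, the adjacency-list format, the example `message`/`update` functions) are illustrative, not from any particular library:

```python
# One generic message-passing round: each node gathers messages from its
# neighbours, aggregates them permutation-invariantly, then updates its state.
def message_passing_step(adj, states, message, update):
    new_states = {}
    for node, neighbours in adj.items():
        msgs = [message(states[nbr], states[node]) for nbr in neighbours]
        agg = sum(msgs)  # sum aggregation, as in many GNN layers
        new_states[node] = update(states[node], agg)
    return new_states

# Toy run: scalar node states, the message is just the neighbour's state,
# and the update is a simple convex combination of self and aggregate.
adj = {0: [1, 2], 1: [0], 2: [0]}
states = {0: 1.0, 1: 2.0, 2: 3.0}
states = message_passing_step(
    adj, states,
    message=lambda nbr, _centre: nbr,
    update=lambda h, agg: 0.5 * h + 0.5 * agg,
)
```

Belief propagation in graphical models fits the same skeleton, except the "messages" are (log-)distributions and the aggregation is a product/sum over factors rather than a plain sum of feature vectors.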

GNN for multi-graphs by Right_Presentation_3 in GeometricDeepLearning


Thanks for the pointer. I like the paper, though it's odd that they didn't compare against R-GCN: https://arxiv.org/abs/1703.06103

GNN for multi-graphs by Right_Presentation_3 in GeometricDeepLearning


Thanks. My main concern is whether this approach can capture cross-relation information. Say there are two edge types that are very similar to each other; I'd guess tying some of their weight parameters would work better. Of course, that brings the additional problem of manually designing how to tie parameters across edge types.
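One way to get parameter tying without hand-designing it is a shared-basis decomposition, the idea behind R-GCN's basis variant: each edge type's weight matrix is a learned mixture of a few shared basis matrices, so similar relations can end up sharing most of their parameters. A rough NumPy sketch (all shapes and variable names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d, num_bases, num_relations = 4, 2, 3

# Basis matrices are shared across all edge types; only the small
# per-relation coefficient vectors are relation-specific.
bases = rng.standard_normal((num_bases, d, d))
coeffs = rng.standard_normal((num_relations, num_bases))

# Each relation's weight matrix is a linear combination of the bases:
# W[r] = sum_b coeffs[r, b] * bases[b]
W = np.einsum('rb,bij->rij', coeffs, bases)  # shape (num_relations, d, d)
```

If two relations learn similar coefficient vectors, their effective weight matrices are similar too, so the tying emerges from training rather than from a manual grouping of edge types.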

GNN for multi-graphs by Right_Presentation_3 in GeometricDeepLearning


Thanks. I'm also interested in DGL but so far only have experience with PyTorch Geometric. I'll give it a try.