NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at Vancouver Convention Center
Paper ID: 850
Title: HyperGCN: A New Method For Training Graph Convolutional Networks on Hypergraphs

Reviewer 1

The relationships in many real-world networks are complex and go beyond pairwise associations. Hypergraphs provide a flexible and natural tool for modeling such relationships. The authors propose HyperGCN, a novel way of training a GCN for semi-supervised learning on hypergraphs using tools from the spectral theory of hypergraphs, and also introduce FastHyperGCN. They conduct experiments on co-authorship and co-citation hypergraphs to demonstrate the effectiveness of HyperGCN, and provide theoretical analyses of the results.

Strengths:
1. The paper proposes 1-HyperGCN and HyperGCN using the hypergraph Laplacian and the generalized hypergraph Laplacian with mediators. FastHyperGCN is proposed for fast training; it computes the hypergraph Laplacian matrix only once, before training (a sketch of this construction follows below).
2. HyperGCN is applied to combinatorial optimization, such as the densest k-subhypergraph problem.

The paper is well presented and easy to follow, but it can be further improved by addressing the following issues:
1. The experimental setting is somewhat unclear: the training/test split ratio of the datasets is not reported in the paper.
2. HyperGCN should be compared with GCN models, since GCN models are also designed for semi-supervised tasks and can easily be applied to hypergraphs.
3. There are a few typos in the paper, such as "Cora co-citaion" in line 203.
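For other readers' benefit, here is a minimal sketch of the construction mentioned in strength 1, as I understand it from the submission (the function name, argument shapes, and dictionary representation are mine, not the authors'): each hyperedge is represented by its most discrepant pair of vertices under the current signal, and with mediators the remaining vertices of the hyperedge are attached to both of them.

import itertools
import numpy as np

def hypergraph_to_graph(hyperedges, S, use_mediators=True):
    """Approximate each hyperedge by a small set of weighted pairwise edges.

    hyperedges: iterable of vertex-index lists (each of size >= 2)
    S:          (n, d) array of current vertex signals/features
    Returns a dict mapping sorted vertex pairs to accumulated edge weights.
    """
    edges = {}

    def add(u, v, w):
        key = tuple(sorted((u, v)))
        edges[key] = edges.get(key, 0.0) + w

    for e in hyperedges:
        # the pair of vertices whose signals differ the most represents e
        i, j = max(itertools.combinations(e, 2),
                   key=lambda p: np.linalg.norm(S[p[0]] - S[p[1]]))
        if use_mediators and len(e) > 2:
            # 1 edge {i, j} plus 2(|e| - 2) mediator edges = 2|e| - 3 edges,
            # so weight 1/(2|e| - 3) makes each hyperedge's weights sum to 1
            w = 1.0 / (2 * len(e) - 3)
            add(i, j, w)
            for k in e:
                if k != i and k != j:
                    add(i, k, w)  # mediator k is joined to both i and j
                    add(j, k, w)
        else:
            add(i, j, 1.0)
    return edges

As I read the paper, FastHyperGCN calls something like this once on the input features and reuses the resulting graph for all epochs, whereas HyperGCN recomputes it from the hidden representations during training; note also that with mediators each hyperedge contributes O(|e|) edges rather than the O(|e|^2) of a full clique expansion.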

Reviewer 2

Given the contributions detailed above, this paper has good originality: it nicely connects recent developments in graph neural networks with spectral hypergraph theory. Moreover, extensive experiments on both semi-supervised learning and combinatorial optimization demonstrate the effectiveness of this new connection, so I also think the paper has good quality and significance. The only issue I see is clarity. As I am rather familiar with spectral hypergraph theory and reasonably familiar with graph neural networks, the content reads acceptably to me. However, I do not think the review of the new hypergraph Laplacian operators is adequate, particularly if the broader NeurIPS community is to grasp the idea. Along with the review of GCNs, more background on hypergraph Laplacian operators should be introduced; for concreteness, I sketch the operator I have in mind below.

--- I have read through the response. The authors have addressed my questions. In the final version, please provide a better exposition of the new Laplacians that motivate this work, as well as additional evaluations of mediators.
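Here is a compact statement of the operator in my own notation (following the nonlinear Laplacians of Chan et al. that the submission builds on; the authors should correct me if I have misread their construction):

% For a signal S in R^n on a hypergraph (V, E), each hyperedge e is
% represented by the pair of its vertices on which S differs the most,
\[
  (i_e, j_e) \;:=\; \operatorname*{arg\,max}_{i,\,j \,\in\, e} \; \lvert S_i - S_j \rvert ,
\]
% and the Laplacian acts through the weighted graph on V containing the
% edge \{i_e, j_e\} for every e in E; with mediators, it also contains
% \{i_e, k\} and \{j_e, k\} for every other k in e, each of weight 1/(2|e|-3).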

Reviewer 3

This paper extends previous GCNs to handle hypergraphs by proposing HyperGCN variants. Although many works have made similar efforts, this paper reduces the computational complexity to linear scale via a hypergraph Laplacian operator. However, the main concern is that the advantages of the proposed method, other than the reduced computational complexity, are not clear. This leaves the reviewer unsure why HyperGCN outperforms previous hypergraph neural networks, since those use more complete information. Besides, the extension of HyperGCN with mediators and the fast variant are incremental and straightforward given previous works, and the authors have not provided useful insight into these extensions. Finally, the experiments lack a thorough comparison with more recent hypergraph models.