Structured Graph Convolutional Networks with Stochastic Masks for Recommender Systems

  • Huiyuan Chen ,
  • Lan Wang ,
  • Yusan Lin ,
  • Chin-Chia Michael Yeh ,
  • Fei Wang ,
  • Hao Yang

2021 International ACM SIGIR Conference on Research and Development in Information Retrieval


Graph Convolutional Networks (GCNs) are powerful models for collaborative filtering. Their key component is a neighborhood aggregation mechanism that extracts high-level representations of users and items. However, real-world user-item graphs are often incomplete and noisy, and aggregating misleading neighborhood information can lead to suboptimal performance if GCNs are not regularized properly. Moreover, real-world user-item graphs are often sparse and low-rank. These two intrinsic graph properties are widely exploited in shallow matrix completion models, but far less studied in graph neural models. Here we propose Structured Graph Convolutional Networks (SGCNs) to enhance the performance of GCNs by exploiting the structural properties of sparsity and low rank. To achieve sparsity, we attach a trainable stochastic binary mask to each layer of a GCN to prune noisy and insignificant edges, resulting in a clean and sparsified graph. To preserve the low-rank property, nuclear norm regularization is applied. We jointly learn the parameters of the stochastic binary masks and the original GCN by solving a stochastic binary optimization problem, and we further propose an unbiased gradient estimator to better backpropagate the gradients of the binary variables. Experimental results demonstrate that SGCNs outperform state-of-the-art GCNs.
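
As a rough illustration of these ideas, the sketch below (plain PyTorch, not the authors' code) shows a single propagation layer whose normalized edge weights are gated by a trainable stochastic binary mask, together with nuclear-norm and sparsity regularizers. The paper derives its own unbiased gradient estimator for the binary variables; the sketch substitutes a standard straight-through estimator, and all names (MaskedGCNLayer, sgcn_regularizers, lambda_nuc, lambda_sparse) are illustrative assumptions rather than the paper's API.

import torch
import torch.nn as nn


class MaskedGCNLayer(nn.Module):
    """One propagation layer whose edges are gated by a stochastic binary mask."""

    def __init__(self, num_edges: int):
        super().__init__()
        # One logit per edge; sigmoid(logit) is the probability of keeping the edge.
        self.edge_logits = nn.Parameter(torch.zeros(num_edges))

    def forward(self, edge_index, edge_weight, emb):
        # edge_index: (2, E) source/target node ids; edge_weight: (E,) normalized
        # adjacency weights; emb: (N, d) current user/item embeddings.
        probs = torch.sigmoid(self.edge_logits)
        z_hard = torch.bernoulli(probs.detach())           # binary edge sample
        z = z_hard + probs - probs.detach()                # straight-through surrogate

        src, dst = edge_index
        msgs = emb[src] * (edge_weight * z).unsqueeze(-1)  # mask and weight each message
        out = torch.zeros_like(emb)
        out.index_add_(0, dst, msgs)                       # aggregate at target nodes
        return out, probs


def sgcn_regularizers(emb, probs, lambda_nuc=1e-4, lambda_sparse=1e-3):
    # Nuclear norm on the embedding matrix encourages a low-rank solution;
    # the expected number of kept edges encourages a sparse, denoised graph.
    low_rank = torch.linalg.matrix_norm(emb, ord="nuc")
    sparsity = probs.sum()
    return lambda_nuc * low_rank + lambda_sparse * sparsity

In a full recommender, such layers would be stacked LightGCN-style and these penalties added to the ranking loss (e.g., BPR), so that edge masks, embeddings, and the low-rank/sparsity terms are optimized jointly as described above.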