In this paper, the authors explore the use of Curriculum Learning (CL) to improve the optimization of Deep Graph Neural Networks (DGNNs). CL is a training strategy that presents easy examples to the model first and gradually raises the difficulty as training progresses. The authors argue that DGNNs are well suited to CL because of their ability to capture complex graph structures, yet few prior works have explored this combination.
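To make the easy-to-hard pacing concrete, here is a minimal sketch in Python. It is not the authors' implementation: the `curriculum_schedule` function, the `difficulty` scorer, and the linear pacing fraction are all illustrative assumptions, since the summary does not specify how example difficulty is measured or scheduled.

```python
import numpy as np

def curriculum_schedule(examples, difficulty, epochs, seed=0):
    """Yield one easy-to-hard training subset per epoch.

    `difficulty` maps an example to a scalar score; the pacing below is a
    simple linear schedule, not the paper's actual pacing function.
    """
    rng = np.random.default_rng(seed)
    order = np.argsort([difficulty(x) for x in examples])  # easy -> hard
    n = len(examples)
    for epoch in range(epochs):
        # Start with the easiest 20% of examples, end with the full set.
        frac = 0.2 + 0.8 * epoch / max(epochs - 1, 1)
        visible = order[: max(1, int(frac * n))].copy()
        rng.shuffle(visible)  # shuffle within the currently visible pool
        yield [examples[i] for i in visible]

# Example: shorter strings are treated as "easier" (a toy difficulty proxy).
data = ["ab", "abcd", "a", "abcdef", "abc"]
for subset in curriculum_schedule(data, difficulty=len, epochs=3):
    print(subset)
```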
To address this gap, the authors propose a simple yet effective approach called Graph Curriculum Learning (GCL). GCL combines improved residual connections with a novel soft graph normalization technique, enabling the model to extract effective features and structural knowledge even at large depths. The authors evaluate their method on twelve real-world benchmarks and show that it outperforms existing approaches in most cases.
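Since the summary does not describe the improved residual connections or the soft graph normalization in detail, the following PyTorch sketch shows one plausible reading: each layer adds a residual connection and interpolates between normalized and raw activations through a learnable gate. The class name, the gate `alpha`, and the dense adjacency are assumptions for illustration, not the paper's architecture.

```python
import torch
import torch.nn as nn

class ResidualGCNLayer(nn.Module):
    """GCN-style layer with a residual connection and a soft normalization
    gate. Illustrative only: this interpolates between normalized and raw
    activations via a learnable gate `alpha`, one possible way to "soften"
    a hard normalization step."""

    def __init__(self, dim: int):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.norm = nn.LayerNorm(dim)
        self.alpha = nn.Parameter(torch.tensor(0.0))  # gate logit; sigmoid gives 0.5 at init

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node features; adj: (N, N) row-normalized adjacency.
        h = adj @ self.lin(x)               # neighborhood aggregation
        a = torch.sigmoid(self.alpha)       # soft gate in (0, 1)
        h = a * self.norm(h) + (1 - a) * h  # partially normalized activations
        return x + torch.relu(h)            # residual connection preserves the input

# A deep stack: residuals plus gated normalization keep signals stable with depth.
layers = nn.ModuleList(ResidualGCNLayer(16) for _ in range(32))
x = torch.randn(10, 16)
adj = torch.softmax(torch.randn(10, 10), dim=1)  # stand-in row-normalized adjacency
for layer in layers:
    x = layer(x, adj)
```

Letting the network learn per-layer how strongly to normalize is one design rationale for such a gate: it avoids committing every layer to the same fixed normalization strength.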
The authors explain their approach through everyday metaphors that demystify complex concepts. For instance, they compare learning in a DGNN to climbing a mountain: the model starts with easy tasks near the base and tackles progressively harder ones as it ascends. They also liken the improved residual connections to a safety net that lets the model learn more robust features without "falling off the graph".
In summary, the authors propose GCL, a simple yet effective approach that uses CL to improve the optimization of DGNNs. By combining improved residual connections with a novel soft graph normalization technique, GCL captures complex graph structures and outperforms existing approaches in most cases, while the everyday metaphors keep the exposition accessible to a wide audience.