Abstract: Link prediction in heterogeneous graphs remains a formidable challenge in graph analytics. Heterogeneous Graph Neural Networks (HGNNs) have been developed to learn representations of the nodes in such graphs, and links are then predicted from the representations of the nodes at their endpoints. However, metapath-based HGNNs often cannot adequately balance efficiency and performance, while traditional relation-based heterogeneous graph models struggle to process complex relations and to fully learn and exploit the type information embedded in heterogeneous graphs. To address these limitations, we propose LightREGNN, a novel, streamlined model for link prediction in heterogeneous graphs. LightREGNN uses learnable relation-type embeddings to characterize the heterogeneous type information in a graph, and by adopting the structural design of the TTPP model it effectively alleviates model degradation. The model further incorporates jumping links and L2 normalization to enhance its performance. Rigorous experimental validation demonstrates that LightREGNN outperforms classical node-representation-based models for link prediction in heterogeneous graphs. Our findings indicate that LightREGNN offers a more favorable trade-off between efficiency and performance, making it a suitable candidate for heterogeneous graph neural network applications that emphasize link prediction.
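To make the abstract's key ingredients concrete, the following is a minimal, generic sketch of how a link could be scored from endpoint node embeddings combined with a learnable relation-type embedding and L2 normalization. This is an illustrative DistMult-style scorer under assumed names (`node_emb`, `relation_emb`, `score_link`), not LightREGNN's actual architecture, which the abstract does not specify in detail.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Hypothetical node embeddings, as would be produced by an HGNN encoder.
node_emb = rng.normal(size=(5, dim))

# One learnable embedding per relation type in the heterogeneous graph
# (e.g. an "author-writes-paper" relation); here initialized randomly.
relation_emb = {"writes": rng.normal(size=dim)}

def l2_normalize(v: np.ndarray) -> np.ndarray:
    """Scale a vector to unit L2 norm (epsilon guards against zero vectors)."""
    return v / (np.linalg.norm(v) + 1e-12)

def score_link(u: int, v: int, rel: str) -> float:
    """DistMult-style link score <h_u, r, h_v> on L2-normalized endpoints."""
    hu = l2_normalize(node_emb[u])
    hv = l2_normalize(node_emb[v])
    return float(np.sum(hu * relation_emb[rel] * hv))

score = score_link(0, 1, "writes")
```

In a trained model, `node_emb` and `relation_emb` would be optimized jointly (for instance with a binary cross-entropy loss over observed and negative links); the normalization keeps scores comparable across nodes of different embedding magnitudes.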