MA Jian, WANG Yifei, MENG Li, HE Yunfei, YANG Fei. Label-independent Information Compression for Heterogeneous Graph Representation[J]. Computer and Modernization, 2025, 0(04): 36-41.
[1] SCARSELLI F, GORI M, TSOI A C, et al. The graph neural network model[J]. IEEE Transactions on Neural Networks, 2008,20(1): 61-80.
[2] KIPF T N, WELLING M. Semi-supervised classification with graph convolutional networks[C]// 5th International Conference on Learning Representations. ICLR, 2017. DOI: 10.48550/arXiv.1609.02907.
[3] YAO Chunhua, ZHANG Xuelei, SONG Xinyu, et al. A sentiment analysis method for financial news based on graph convolutional neural network and dependency analysis[J]. Computer and Modernization, 2022(5):33-39.
[4] HU Peng, KAN Hongxing, WANG Yongkang, et al. Drug-drug interaction prediction based on heterogeneous graph attention network[J]. China Medical Devices, 2023,38(5):110-114.
[5] XIA Yichun, LI Wanggen, LI Doudou, et al. CTR prediction model combining attention mechanism and graph neural network[J]. Computer and Modernization, 2023(3):29-37.
[6] WEI X M, LIU Y Z, SUN J S, et al. Dual subgraph-based graph neural network for friendship prediction in location-based social networks[J]. ACM Transactions on Knowledge Discovery from Data, 2023,17(3). DOI: 10.1145/3554981.
[7] LI M, CAI X R, XU S H, et al. Metapath-aggregated heterogeneous graph neural network for drug-target interaction prediction[J]. Briefings in Bioinformatics, 2023,24(1). DOI: 10.1093/bib/bbac578.
[8] LI Z F, LIU H, ZHANG Z L, et al. Learning knowledge graph embedding with heterogeneous relation attention networks[J]. IEEE Transactions on Neural Networks and Learning Systems, 2022,33(8):3961-3973.
[9] WANG L, LI P P, XIONG K, et al. Modeling heterogeneous graph network on fraud detection: A community-based framework with attention mechanism[C]// Proceedings of the 30th ACM International Conference on Information & Knowledge Management. ACM, 2021:1959-1968.
[10] HAMILTON W L, YING R, LESKOVEC J. Inductive representation learning on large graphs[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. ACM, 2017:1025-1035.
[11] VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks[C]// The 6th International Conference on Learning Representations. ICLR, 2018. DOI: 10.48550/arXiv.1710.10903.
[12] DONG X R, ZHANG Y J, PANG K, et al. Heterogeneous graph neural networks with denoising for graph embeddings[J]. Knowledge-Based Systems, 2022,238(C). DOI: 10.1016/j.knosys.2021.107899.
[13] WU T L, REN H Y, LI P, et al. Graph information bottleneck[C]// Proceedings of the 34th International Conference on Neural Information Processing Systems. ACM, 2020:20437-20448.
[14] XU K Y L, HU W H, LESKOVEC J, et al. How powerful are graph neural networks?[C]// The 7th International Conference on Learning Representations. ICLR, 2019. DOI: 10.48550/arXiv.1810.00826.
[15] WANG X, JI H Y, SHI C, et al. Heterogeneous graph attention network[C]// Proceedings of the World Wide Web Conference 2019. ACM, 2019:2022-2032.
[16] FANG Y, LIN W Q, ZHENG V W, et al. Metagraph-based learning on heterogeneous graphs[J]. IEEE Transactions on Knowledge and Data Engineering, 2019,33(1):154-168.
[17] WANG Z H, YU D H, LI Q, et al. SR-HGN: Semantic- and relation-aware heterogeneous graph neural network[J]. Expert Systems with Applications, 2023,224(C). DOI: 10.1016/j.eswa.2023.119982.
[18] JI H Y, WANG X, SHI C, et al. Heterogeneous graph propagation network[J]. IEEE Transactions on Knowledge and Data Engineering, 2021,35(1):521-532.
[19] FU X Y, ZHANG J N, MENG Z Q, et al. MAGNN: Metapath aggregated graph neural network for heterogeneous graph embedding[C]// Proceedings of the Web Conference 2020. ACM, 2020:2331-2341.
[20] TISHBY N, ZASLAVSKY N. Deep learning and the information bottleneck principle[C]// Proceedings of the 2015 IEEE Information Theory Workshop. IEEE, 2015. DOI: 10.1109/ITW.2015.7133169.
[21] GRETTON A, BOUSQUET O, SMOLA A, et al. Measuring statistical dependence with Hilbert-Schmidt norms[C]// Proceedings of the 16th International Conference on Algorithmic Learning Theory. Springer, 2005:63-77.
[22] MA W D K, LEWIS J P, KLEIJN W B. The HSIC bottleneck: Deep learning without back-propagation[C]// Proceedings of the 34th AAAI Conference on Artificial Intelligence. AAAI, 2020,34(4):5085-5092.
[23] HE Y F, ZHANG Y W, QI L Y, et al. Outer product enhanced heterogeneous information network embedding for recommendation[J]. Expert Systems with Applications, 2021,169. DOI: 10.1016/j.eswa.2020.114359.
[24] GREENFELD D, SHALIT U. Robust learning with the Hilbert-Schmidt independence criterion[C]// Proceedings of the 37th International Conference on Machine Learning. ACM, 2020,352:3759-3768.
[25] HE Y F, YAN D C, XIE W X, et al. Optimizing graph neural network with multiaspect Hilbert-Schmidt independence criterion[J]. IEEE Transactions on Neural Networks and Learning Systems, 2023,34(12):10775-10788.
[26] HUANG Q, YAMADA M, TIAN Y, et al. GraphLIME: Local interpretable model explanations for graph neural networks[J]. IEEE Transactions on Knowledge and Data Engineering, 2023,35(7):6968-6972.
[27] WANG M J, ZHENG D, YE Z H, et al. Deep graph library: A graph-centric, highly-performant package for graph neural networks[J]. arXiv preprint arXiv:1909.01315, 2019.