Abstract
Graph Neural Networks (GNNs) are powerful for learning representations of graph-structured data, fusing both attributive and topological information. Prior research has investigated the expressive power of GNNs by comparing them with the Weisfeiler-Lehman algorithm. Despite achieving promising performance on the isomorphism test, existing methods impose overly restrictive requirements, which may hinder performance on other graph-level tasks, e.g., graph classification and graph regression. In this paper, we argue for adaptively emphasizing important information. We propose a novel global attention module that operates at two levels: the channel level and the node level. Specifically, we exploit second-order channel correlations to extract more discriminative representations. We validate the effectiveness of the proposed approach through extensive experiments on eight benchmark datasets. The proposed method outperforms other state-of-the-art methods on graph classification and graph regression tasks. Notably, it achieves a 2.7% improvement on the DD dataset for graph classification and a 7.1% absolute improvement on the ZINC dataset for graph regression.
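The channel-level idea above can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, the centered channel covariance, and the softmax-based channel weighting are all assumptions made purely to show how second-order channel correlations could reweight a node-feature matrix.

```python
import numpy as np

def second_order_channel_attention(H):
    """Hypothetical sketch of channel-level attention driven by
    second-order (covariance) statistics among feature channels.

    H: (n_nodes, n_channels) node-feature matrix of one graph.
    Returns the channel-reweighted features and the attention weights.
    """
    n, d = H.shape
    Hc = H - H.mean(axis=0, keepdims=True)       # center each channel
    cov = Hc.T @ Hc / max(n - 1, 1)              # (d, d) channel covariance
    score = cov.sum(axis=1)                      # aggregate correlation per channel
    score = score - score.max()                  # numerically stable softmax
    attn = np.exp(score) / np.exp(score).sum()   # channel attention weights
    return H * attn[None, :], attn

# Toy usage on a 3-node, 3-channel graph
H = np.array([[1.0, 0.0, 2.0],
              [0.5, 1.0, 1.5],
              [2.0, 0.5, 0.5]])
H_out, attn = second_order_channel_attention(H)
```

Channels that correlate strongly with the rest of the representation receive larger weights, which is one plausible way to make the pooled graph representation more discriminative.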
S. Wu—To whom correspondence should be addressed.
Acknowledgements
This work is jointly supported by the National Natural Science Foundation of China (62141608, U19B2038) and the CAAI Huawei MindSpore Open Fund.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Hu, F. et al. (2022). Second-Order Global Attention Networks for Graph Classification and Regression. In: Fang, L., Povey, D., Zhai, G., Mei, T., Wang, R. (eds) Artificial Intelligence. CICAI 2022. Lecture Notes in Computer Science(), vol 13605. Springer, Cham. https://doi.org/10.1007/978-3-031-20500-2_41
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-20499-9
Online ISBN: 978-3-031-20500-2
eBook Packages: Computer Science, Computer Science (R0)