
Dual-Branch Contrastive Learning for Network Representation Learning

  • Conference paper
  • In: Neural Information Processing (ICONIP 2023)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1966)


Abstract

Graph contrastive learning (GCL) is a self-supervised learning paradigm for graph data that has attracted wide attention in network representation learning. However, most existing GCL-based representation methods adopt a single-branch contrastive scheme, which limits the depth of the semantic relationships they can learn and leaves the embedding of global structural information vulnerable to noisy connections. This paper therefore proposes a network representation learning method based on a dual-branch contrastive scheme. First, a clustering step is introduced into the embedding of global structural information: irrelevant nodes are identified and removed according to the clustering results, effectively reducing noise during embedding. Then, a dual-branch contrastive method reminiscent of ensemble learning is proposed, in which the two generated views are each contrasted with the original graph separately, and joint optimization continuously updates both views, allowing the model to learn more discriminative feature representations. The method was evaluated on three datasets, Cora, Citeseer, and Pubmed, with node classification and dimensionality-reduction visualization experiments. The results show that the proposed method outperforms existing baseline models.
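The joint objective described in the abstract, in which each generated view is contrasted with the original graph separately and the two losses are optimized jointly, can be illustrated with a minimal sketch. This is not the authors' exact formulation: the InfoNCE-style loss, the cosine similarity, the temperature `tau`, and the branch weight `alpha` are all illustrative assumptions.

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(anchor, view, tau=0.5):
    """InfoNCE-style loss: node i in the anchor (original graph) is
    pulled toward node i in the view (positive pair) and pushed away
    from all other view nodes (negatives)."""
    n = len(anchor)
    loss = 0.0
    for i in range(n):
        sims = [math.exp(cosine(anchor[i], view[j]) / tau) for j in range(n)]
        loss += -math.log(sims[i] / sum(sims))
    return loss / n

def dual_branch_loss(original, view1, view2, alpha=0.5):
    """Joint dual-branch objective: each generated view is contrasted
    with the original-graph embeddings separately, then the two branch
    losses are combined for joint optimization."""
    return alpha * info_nce(original, view1) + (1 - alpha) * info_nce(original, view2)
```

Contrasting both views against the same anchor, rather than against each other, mirrors the abstract's description of comparing each generated view with the original graph separately; in the full method the view embeddings would come from a GNN encoder applied to two augmentations, with gradients flowing through both branches.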

Project supported by the National Natural Science Foundation of China (62176145).




Corresponding author

Correspondence to Hu Zhang.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Zhang, H., Cao, J., Li, K., Wang, Y., Li, R. (2024). Dual-Branch Contrastive Learning for Network Representation Learning. In: Luo, B., Cheng, L., Wu, ZG., Li, H., Li, C. (eds) Neural Information Processing. ICONIP 2023. Communications in Computer and Information Science, vol 1966. Springer, Singapore. https://doi.org/10.1007/978-981-99-8148-9_15


  • DOI: https://doi.org/10.1007/978-981-99-8148-9_15

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-8147-2

  • Online ISBN: 978-981-99-8148-9

  • eBook Packages: Computer Science, Computer Science (R0)
