
An integrated simplicial neural network with neuro-fuzzy network for graph embedding

  • Original Article
International Journal of Machine Learning and Cybernetics

Abstract

In recent years, graph neural networks (GNNs) have become the mainstream approach in much recent research owing to their power in learning from complex graph data. However, most recent GNN-based architectures are designed to evaluate only direct relational structures between nodes. As a result, these techniques are unable to capture the sophisticated multi-way relationships in a graph. Such multi-way relationships can be represented by both explicit graph-based and complex topological structures. In general, multi-way relationships in a graph can be modelled as simplicial complexes, hypergraphs, etc., and can be efficiently preserved by simplicial neural networks (SNNs). Several notable SNN-based architectures have been proposed recently, such as the well-known simplicial convolutional neural network (SCNN). SNN-based techniques have shown competitive performance in graph learning. However, most recently proposed SNN-based architectures are built on the deep neural learning paradigm, and therefore still face several challenges with regard to feature noise and data uncertainty. To overcome these limitations, in this paper we propose a novel integrated SNN and neuro-fuzzy network (NFN) graph embedding technique, called SFGE. Our SFGE model is designed to better capture the multi-way structural representation of a graph by taking advantage of several advanced graph-based and fuzzy-based neural learning techniques. By leveraging the neuro-fuzzy learning approach, our model can efficiently eliminate feature uncertainty and ambiguity during the task-driven fine-tuning process. In addition, it better captures the rich multi-way relational structures of the input graphs through a topology-enhanced graph analysis approach. Extensive empirical studies on a real-world molecular graph dataset demonstrate the effectiveness of our SFGE model for the graph classification task.
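To make the simplicial-convolution idea in the abstract concrete, the following is a minimal illustrative sketch of one generic simplicial convolution step over edge features, X' = σ(L₁ X W), where L₁ is the Hodge 1-Laplacian built from boundary matrices. This is NOT the paper's SFGE model (which is behind the paywall); the toy complex, matrices, and shapes are hypothetical examples chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-dimensional simplicial complex: 4 nodes, 5 oriented edges, 1 triangle.
# B1[i, j] = -1/+1 if node i is the tail/head of edge j (nodes -> edges).
# Edges: e0=(0,1), e1=(0,2), e2=(1,2), e3=(0,3), e4=(1,3).
B1 = np.array([
    [-1, -1,  0, -1,  0],
    [ 1,  0, -1,  0, -1],
    [ 0,  1,  1,  0,  0],
    [ 0,  0,  0,  1,  1],
], dtype=float)

# B2[j, t] = +/-1 if edge j bounds triangle t with matching/opposite
# orientation (edges -> triangles). Triangle t0 = (0,1,2) uses e0, e1, e2.
B2 = np.array([[1], [-1], [1], [0], [0]], dtype=float)

# Hodge 1-Laplacian: lower (node-mediated) plus upper (triangle-mediated)
# adjacency between edges.
L1 = B1.T @ B1 + B2 @ B2.T          # shape (5, 5)

X = rng.standard_normal((5, 3))     # 3-dim feature vector on each edge
W = rng.standard_normal((3, 3))     # weight matrix (random here, learned in practice)

X_out = np.maximum(0.0, L1 @ X @ W) # one simplicial convolution layer with ReLU
print(X_out.shape)                  # (5, 3)
```

Note that the boundary-of-boundary identity B1 @ B2 = 0 holds by construction; SNN/SCNN-style architectures stack such layers (often with separate weights for the lower and upper Laplacian terms) across simplex dimensions.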


[Figures 1–7 are available in the full article.]


Data availability

No data are associated with the studies in this paper.

Notes

  1. AERU (University of Hertfordshire): https://sitem.herts.ac.uk/aeru/

  2. PAN: https://www.pesticideinfo.org/

  3. ECOTOX: https://cfpub.epa.gov/ecotox

  4. BeeTox dataset: http://chemyang.ccnu.edu.cn/ccb/server/beetox/index.php/home/index

  5. RDKit: https://www.rdkit.org/


Acknowledgements

This research is funded by HUTECH University, Ho Chi Minh City, Vietnam.

Funding

This research was funded by HUTECH University.

Author information


Corresponding author

Correspondence to Phu Pham.

Ethics declarations

Conflicts of interest

Employment: Phu Pham is currently working as a full-time researcher and lecturer at HUTECH University. Financial interests: Phu Pham has received research support from HUTECH University. Non-financial interests: None.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Pham, P. An integrated simplicial neural network with neuro-fuzzy network for graph embedding. Int. J. Mach. Learn. & Cyber. 16, 233–251 (2025). https://doi.org/10.1007/s13042-024-02201-8
